A professional software developer describes how he came to write software that helped the United States Army kill people. His first-person account is followed by a few similar anecdotes from other developers and observers and concludes with some lessons about how to avoid killing people with your software.
“Don't Get Distracted”
Caleb Thompson, November 16, 2017
The project owner conveniently left out its purpose when explaining the goals. I conveniently didn't focus too much on that part. It was great pay for me at the time. It was a great project. Maybe I just didn't want to know what it would be used for. I got distracted.
An attempt to identify and explain the ethical preconditions for replacing social policies with algorithmic models. It's incomplete, but the questions that are included are relevant and salient, and the cautionary tales and links are thought-provoking.
“Math Can't Solve Everything: Questions We Need to Be Asking Before Deciding an Algorithm Is the Answer”
Jamie Williams and Lena Gunn, Deeplinks, Electronic Frontier Foundation, May 7, 2018
In 2016, the Federal Bureau of Investigation felt so strongly that it needed to access the contents of a suspected terrorist's encrypted iPhone that it persuaded the Department of Justice to lean on Apple, seeking a court order under the All Writs Act of 1789 that would compel Apple to develop a tool for breaking into encrypted iPhones and to provide it to the FBI. Apple refused to comply, and eventually the FBI hired a company that had already developed such a tool to do the job for them, thus mooting the legal pressure on Apple. (The terrorist's iPhone contained nothing of interest.)
This episode struck people as sufficiently stupid and disgusting that the Department of Justice asked its Office of the Inspector General to prepare a report explaining exactly what happened and why. The report is now available (with redactions):
“A Special Inquiry Regarding the Accuracy of FBI Statements Concerning Its Capabilities to Exploit an iPhone Seized During the San Bernardino Terror Attack Investigation”
Oversight and Review Division, Office of the Inspector General, U.S. Department of Justice, March 2018
According to the report, one branch of the FBI, the Remote Operations Unit (ROU) of the Operational Technology Division, had already hired another outside company to develop a tool that would break into that iPhone, and this vendor successfully demonstrated the tool on March 16, 2016. However, the ROU didn't tell anyone else in the FBI about this accomplishment, and the separate branch of the FBI that was responsible for investigating the suspected terrorist never asked the ROU about it. Part of the reason, perhaps, was that the FBI wanted to establish a legal precedent for bullying Apple and other tech companies into doing its work for it. But another part was that most of what the ROU develops is classified, and using classified tools to acquire key evidence in a criminal case is generally a bad idea, since the discovery process can easily reveal the existence and nature of those tools.
In practice, the Department of Justice frequently uses classified tools to acquire key evidence in criminal cases because they can often get away with it, but it still isn't a good idea, and the FBI shouldn't promote it.
However, the Inspector General's report recommends that the various branches of the FBI shouldn't withhold information about hacking tools from one another and encourages the FBI to complete the reorganization that it has already begun “to consolidate resources to address the ‘Going Dark’ problem and improve coordination between the units that work on computer and mobile devices.”
The Cryptography Fellow at the Stanford Center for Internet and Society points out the foreseeable consequences:
“The Dark Side of the ‘Apple vs. FBI’ OIG Report”
Riana Pfefferkorn, Center for Internet and Society, April 18, 2018
If the OIG report prompts the FBI to give the CEAU [Cryptographic and Electronic Analysis Unit], which focuses on criminal matters, more access to tools developed or acquired by ROU, which focuses on national security matters, that could have a detrimental effect on federal criminal cases. When seeking search and seizure warrants, the FBI may not fully explain to judges that they are asking for authorization to use sophisticated, technological techniques to extract evidence from defendants' devices. In the resulting prosecutions, the government may refuse to disclose information about the classified technique, or even its existence, to defense counsel or experts. That secrecy will impair the court's truth-seeking function as well as the defendant's ability to mount a defense.
What is more, removing the divide between criminal and national security tools could ultimately hurt the FBI, too. If courts do order disclosure of the FBI's techniques in criminal cases, the FBI's national security and intelligence units might decide that they cannot risk using those techniques anymore. That is a significant reason why the wall was there in the first place: to protect those missions. …
It is ironic that the OIG report into the FBI's behavior during Apple vs. FBI may lead to the FBI's criminal investigators achieving that case's objective: getting more capabilities to crack into digital devices.
The ethical and prudential faults in this situation just go on and on: A company that discovers flaws in iPhone security has an ethical responsibility to report those flaws to Apple so that they can be fixed, instead of concealing the vulnerabilities and selling exploitation tools to other parties. The FBI certainly should not be hiring companies to produce such tools. If it does acquire such tools, the FBI also has an ethical responsibility to report the flaws to Apple instead of exploiting them. It also has an ethical responsibility to try to get them declassified before exploiting them, since a domestic law-enforcement organization does not need and should not have national-security clearances, and should not rely on them in day-to-day operations if it does have them.
If the Remote Operations Unit does acquire and exploit classified system-cracking tools, it has a prudential obligation to make its resources available wherever they are needed within the agency, and so should not conceal such tools from other branches of the FBI. But the CEAU should not use such tools in criminal investigations, for the reasons that Pfefferkorn explains: Doing so compromises the prosecution of such cases. Indeed, the Department of Justice should not even use evidence acquired through classified system-cracking tools, precisely because judges should exclude such evidence and any inferences based on it.
Our institutions are so thoroughly shot through with unethical, unprofessional, and corrupt misbehavior that it is hard even to figure out where a reform project should begin.
“Ex-FBI Director Comey in New Book Says Trump Is ‘Unethical and Untethered to Truth,’ Demanded Loyalty Like a Mafia Boss”
Associated Press, April 12, 2018
Well, I suppose it had to happen sooner or later. If you spend seventy years electing one corrupt war criminal after another to the presidency of the United States, eventually you're bound to wind up with one who is unethical.