“Chinese Government Forces Residents to Install Surveillance App with Awful Security”
Joseph Cox, Motherboard, April 9, 2018
JingWang scans for specific files stored on the device, including HTML, text, and images, by comparing the phone's contents to a list of MD5 hashes. …
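Hash-based blocklist scanning of this kind is simple to sketch. The following is a minimal illustration in Python, not JingWang's actual code; the blocklist entry and directory layout are hypothetical, since the app's real hash list is not public.

```python
import hashlib
from pathlib import Path

# Hypothetical blocklist of MD5 hex digests (JingWang's real list is not public).
# "d41d8cd98f00b204e9800998ecf8427e" is the MD5 of an empty file.
BLOCKLIST = {"d41d8cd98f00b204e9800998ecf8427e"}

def md5_of(path: Path) -> str:
    """Return the MD5 hex digest of a file, read in chunks."""
    h = hashlib.md5()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def scan(root: Path) -> list[Path]:
    """Return files under `root` whose MD5 matches the blocklist."""
    return [p for p in root.rglob("*") if p.is_file() and md5_of(p) in BLOCKLIST]
```

Note that MD5 is long broken for collision resistance; a scanner built on it can be evaded, and its presence here says something about the app's engineering.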
JingWang also sends a device's phone number, device model, MAC address, unique IMEI number, and metadata of any files found in external storage that it deems dangerous to a remote server. …
As for handling that data, … JingWang exfiltrated data without any sort of encryption, instead transferring it all in plaintext. The app updates are not digitally signed either, meaning they could be swapped for something else without a device noticing.
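The missing safeguard is easy to describe: a device should verify an update's authenticity before installing it. Real update signing uses asymmetric keys (e.g. RSA or ECDSA), so the device holds only a public key; the sketch below uses a shared-key HMAC purely as a simplified stand-in for that check, with a hypothetical key.

```python
import hashlib
import hmac

# Stand-in only: production update signing uses asymmetric signatures,
# so the verifier never holds a signing secret. This key is hypothetical.
KEY = b"hypothetical-shared-key"

def sign_update(payload: bytes) -> bytes:
    """Producer side: compute an authentication tag over the update payload."""
    return hmac.new(KEY, payload, hashlib.sha256).digest()

def verify_update(payload: bytes, tag: bytes) -> bool:
    """Device side: refuse any update whose tag does not verify."""
    expected = hmac.new(KEY, payload, hashlib.sha256).digest()
    return hmac.compare_digest(expected, tag)
```

An app that skips this check, as JingWang reportedly does, will install whatever bytes arrive, which is exactly how updates "could be swapped for something else without a device noticing."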
Dan Geer explains the social, political, and security risks of programmatically displacing manual processes and alternative algorithmic designs with interdependent, standardized, or centralized technologies. Such monoliths may or may not be fragile, but their only failure modes are catastrophic.
Daniel E. Geer, Jr., Hoover Institution, February 7, 2018
If an algorithm cannot be verified then do not trust it.
To be precise, algorithms derived from machine learning must never be trusted unless the “Why?” of decisions those algorithms make can be usefully examined on demand. This dictum of “interrogatability” may or may not be effectively design-assured while there is still time to do so — that is, to do so pre-dependence. Once the chance to design-assure interrogatability is lost — that is to say once dependence on a non-interrogatable algorithm is consummated — going back to non-self-modifying algorithms will prove to be costly, if even possible. …
The central thesis of this essay is that an accessible, continuously exercised analog option is essential to the national security and to the inclusionary polity we hold dear. …
As a matter of national security, keeping non-technical exits open requires action and it requires it now. It will not happen by itself, and it will never again be as cheap or feasible as it is now. Never again will national security and individual freedom jointly share a call for the same initiative at the same time. In a former age, Dostoevsky told us, “The degree of civilization in a society can be judged by entering its prisons.” From this point on, that judgement will be passed on how well we preserve a full life for those opting out of digitalization. There is no higher embodiment of national security than that.
Technological skepticism, as I practice it, means assessing each innovation and judging whether it will make me any wiser, better, happier, or more helpful to others before deciding whether to adopt it. As a proponent of that practice, I'm gratified to see other people thinking along the same lines and trying to organize the outraged victims of thoughtlessly misdesigned technology.
“There Are No Guardrails on Our Privacy Dystopia”
David Golumbia and Chris Gilliard, Motherboard, March 9, 2018
Tech companies … have demonstrated that they are neither capable nor responsible enough to imagine what harms their technologies may do. If there is any hope for building digital technology that does not include an open door to wolves, recent experience has demonstrated that this must include robust engagement from the non-technical — expert and amateur alike — not just in response to the effects of technologies, but to the proposed functions of those technologies in the first place.
“What If Designers Took a Hippocratic Oath?”
Sanjena Sathian, Point Taken, PBS, January 1, 2016
What are we really looking at? A next-generation consumer advocacy battle, one in which a victory depends not on class action lawsuits or government oversight but on popular awareness and education.