Unity

Topic: #dystopia

Mass Surveillance of Uyghurs in China

2018-04-09⊺14:44:18-05:00

This particular work of oppression is conducted through a smartphone app, in an extravagantly inept way characteristic of government coders.

“Chinese Government Forces Residents to Install Surveillance App with Awful Security”
Joseph Cox, Motherboard, April 9, 2018
https://motherboard.vice.com/en_us/article/ne94dg/jingwang-app-no-encryption-china-force-install-urumqi-xinjiang

JingWang scans for specific files stored on the device, including HTML, text, and images, by comparing the phone's contents to a list of MD5 hashes. …

JingWang also sends a device's phone number, device model, MAC address, unique IMEI number, and metadata of any files found in external storage that it deems dangerous to a remote server. …

As for handling that data, … JingWang exfiltrated data without any sort of encryption, instead transferring it all in plaintext. The app updates are not digitally signed either, meaning they could be swapped for something else without a device noticing.
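For illustration, the hash-blocklist scan Cox describes can be sketched in a few lines of Python. This is not the app's actual code; the blocklist entry, file names, and function names below are placeholders invented for the example.

```python
import hashlib
from pathlib import Path

# Hypothetical blocklist of MD5 digests. The real app ships its own list
# of hashes of forbidden files; this entry is md5(b"hello"), a placeholder.
BLOCKED_MD5S = {
    "5d41402abc4b2a76b9719d911017c592",
}

def md5_of_file(path: Path) -> str:
    """Compute the MD5 digest of a file, reading it in chunks."""
    digest = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def scan(root: Path) -> list:
    """Return the files under root whose MD5 appears in the blocklist."""
    hits = []
    for p in root.rglob("*"):
        if p.is_file() and md5_of_file(p) in BLOCKED_MD5S:
            hits.append(p)
    return hits
```

Note that a scan like this matches only byte-for-byte identical files: changing a single byte of an image or document yields a different MD5, so the blocklist approach is as crude as the rest of the app's engineering.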

#surveillance #China #dystopia

Risks of Singleton Technologies

2018-03-20⊺14:13:05-05:00

Dan Geer explains the social, political, and security risks of programmatically displacing manual processes and alternative algorithmic designs with interdependent, standardized, or centralized technologies. Such monoliths may or may not be fragile, but their only failure modes are catastrophic.

“A Rubicon”
Daniel E. Geer, Jr., Hoover Institution, February 7, 2018
https://www.hoover.org/sites/default/files/research/docs/geer_webreadypdfupdated2.pdf

If an algorithm cannot be verified then do not trust it.

To be precise, algorithms derived from machine learning must never be trusted unless the “Why?” of decisions those algorithms make can be usefully examined on demand. This dictum of “interrogatability” may or may not be effectively design-assured while there is still time to do so — that is, to do so pre-dependence. Once the chance to design-assure interrogatability is lost — that is to say once dependence on a non-interrogatable algorithm is consummated — going back to non-self-modifying algorithms will prove to be costly, if even possible. …

The central thesis of this essay is that an accessible, continuously exercised analog option is essential to the national security and to the inclusionary polity we hold dear. …

As a matter of national security, keeping non-technical exits open requires action and it requires it now. It will not happen by itself, and it will never again be as cheap or feasible as it is now. Never again will national security and individual freedom jointly share a call for the same initiative at the same time. In a former age, Dostoevsky told us, “The degree of civilization in a society can be judged by entering its prisons.” From this point on, that judgement will be passed on how well we preserve a full life for those opting out of digitalization. There is no higher embodiment of national security than that.

#opacity #technological-skepticism #dystopia

Advocacy for Technological Skepticism

2018-03-09⊺14:44:48-06:00

Technological skepticism is the practice of assessing technological innovations, before adopting them, to judge whether they will make me any wiser, better, happier, or more helpful to others. As a proponent and practitioner of that discipline, I'm gratified to see other people thinking along the same lines and trying to organize the outraged victims of thoughtlessly misdesigned technology.

“There Are No Guardrails on Our Privacy Dystopia”
David Golumbia and Chris Gilliard, Motherboard, March 9, 2018
https://motherboard.vice.com/en_us/article/zmwaee/there-are-no-guardrails-on-our-privacy-dystopia

Tech companies … have demonstrated that they are neither capable nor responsible enough to imagine what harms their technologies may do. If there is any hope for building digital technology that does not include an open door to wolves, recent experience has demonstrated that this must include robust engagement from the non-technical — expert and amateur alike — not just in response to the effects of technologies, but to the proposed functions of those technologies in the first place.

“What If Designers Took a Hippocratic Oath?”
Sanjena Sathian, Point Taken, PBS, January 1, 2016
http://www.pbs.org/wgbh/point-taken/blog/ozy-what-if-designers-took-hippocratic-oath/

What are we really looking at? A next-generation consumer advocacy battle, one in which a victory depends not on class action lawsuits or government oversight but on popular awareness and education.

#technological-skepticism #privacy #dystopia

This work is licensed under a Creative Commons Attribution-ShareAlike License.

John David Stone (havgl@unity.homelinux.net)

created June 1, 2014 · last revised December 10, 2018