



Topic: #technological-skepticism

Using Facebook Slightly Less Dangerously


We're starting to see a new genre of advice column: instructions on how to use some piece of modern technology safely. The premise is ironic, of course, because it's impossible to really use the technology safely — what users understand it to do, and what they want to accomplish with it, is flatly incompatible with its design and with the business model of its maker and licensor.

The journalists who write in this genre know better than to use the technology, but use it anyway, because their jobs require it and because they know their readers will use it as well — even the readers who also know better.

“The Motherboard Guide to Using Facebook Safely”
Lorenzo Franceschi-Bicchierai, Motherboard, March 21, 2018

You can't really stop all collection. In fact, even if you leave Facebook (or have never been part of the social network), the company is still gathering data on you and building a shadow profile in case you ever join. …

Facebook's entire existence is predicated on tracking and collecting information about you. If that concept makes you feel creeped out, then perhaps you should quit it. But if you are willing to trade that off for using a free service to connect with friends, there's still some steps you can take to limit your exposure.

#facebook #surveillance #technological-skepticism

Risks of Singleton Technologies


Dan Geer explains the social, political, and security risks of programmatically displacing manual processes and alternative algorithmic designs with interdependent, standardized, or centralized technologies. Such monoliths may or may not be fragile, but their only failure modes are catastrophic.

“A Rubicon”
Daniel E. Geer, Jr., Hoover Institution, February 7, 2018

If an algorithm cannot be verified then do not trust it.

To be precise, algorithms derived from machine learning must never be trusted unless the “Why?” of decisions those algorithms make can be usefully examined on demand. This dictum of “interrogatability” may or may not be effectively design-assured while there is still time to do so — that is, to do so pre-dependence. Once the chance to design-assure interrogatability is lost — that is to say once dependence on a non-interrogatable algorithm is consummated — going back to non-self-modifying algorithms will prove to be costly, if even possible. …

The central thesis of this essay is that an accessible, continuously exercised analog option is essential to the national security and to the inclusionary polity we hold dear. …

As a matter of national security, keeping non-technical exits open requires action and it requires it now. It will not happen by itself, and it will never again be as cheap or feasible as it is now. Never again will national security and individual freedom jointly share a call for the same initiative at the same time. In a former age, Dostoevsky told us, “The degree of civilization in a society can be judged by entering its prisons.” From this point on, that judgement will be passed on how well we preserve a full life for those opting out of digitalization. There is no higher embodiment of national security than that.

#opacity #technological-skepticism #dystopia

Advocacy for Technological Skepticism


Technological skepticism, as I practice it, is the discipline of assessing each technological innovation — judging whether it will make me any wiser, better, happier, or more helpful to others — before deciding whether to adopt it. As a proponent of that practice, I'm gratified to see other people thinking along the same lines and trying to organize the outraged victims of thoughtlessly misdesigned technology.

“There Are No Guardrails on Our Privacy Dystopia”
David Golumbia and Chris Gilliard, Motherboard, March 9, 2018

Tech companies … have demonstrated that they are neither capable nor responsible enough to imagine what harms their technologies may do. If there is any hope for building digital technology that does not include an open door to wolves, recent experience has demonstrated that this must include robust engagement from the non-technical — expert and amateur alike — not just in response to the effects of technologies, but to the proposed functions of those technologies in the first place.

“What If Designers Took a Hippocratic Oath?”
Sanjena Sathian, Point Taken, PBS, January 1, 2016

What are we really looking at? A next-generation consumer advocacy battle, one in which a victory depends not on class action lawsuits or government oversight but on popular awareness and education.

#technological-skepticism #privacy #dystopia


This work is licensed under a Creative Commons Attribution-ShareAlike License.


John David Stone
created June 1, 2014 · last revised December 10, 2018