



Topic: #opacity

Risks of Singleton Technologies


Dan Geer explains the social, political, and security risks of programmatically displacing manual processes and alternative algorithmic designs with interdependent, standardized, or centralized technologies. Such monoliths may or may not be fragile, but their only failure modes are catastrophic.

“A Rubicon”
Daniel E. Geer, Jr., Hoover Institution, February 7, 2018

If an algorithm cannot be verified, then do not trust it.

To be precise, algorithms derived from machine learning must never be trusted unless the “Why?” of decisions those algorithms make can be usefully examined on demand. This dictum of “interrogatability” may or may not be effectively design-assured while there is still time to do so — that is, to do so pre-dependence. Once the chance to design-assure interrogatability is lost — that is to say, once dependence on a non-interrogatable algorithm is consummated — going back to non-self-modifying algorithms will prove to be costly, if even possible. …

The central thesis of this essay is that an accessible, continuously exercised analog option is essential to the national security and to the inclusionary polity we hold dear. …

As a matter of national security, keeping non-technical exits open requires action and it requires it now. It will not happen by itself, and it will never again be as cheap or feasible as it is now. Never again will national security and individual freedom jointly share a call for the same initiative at the same time. In a former age, Dostoevsky told us, “The degree of civilization in a society can be judged by entering its prisons.” From this point on, that judgement will be passed on how well we preserve a full life for those opting out of digitalization. There is no higher embodiment of national security than that.

#opacity #technological-skepticism #dystopia

The Opacity of Black-Box Metrics


This week, I've been reading The Tyranny of Metrics, a new book by the historian Jerry Z. Muller of the Catholic University of America. One of the themes of the book is that metrics lose their reliability when they are transparently tied to rewards. For example, a hospital might decide to give bonuses to surgeons whose operations have a higher rate of success, as measured by the percentage of those operations after which the patient survives for at least thirty days. The idea is to improve the overall quality and performance of surgical operations in the hospital by motivating surgeons to do better work. In practice, however, what often happens is that surgeons refuse to take on high-risk patients, or arrange for their patients' post-op caretakers to use heroic measures to keep them alive for at least thirty-one days. The metric awards higher scores to the surgeons who successfully game the system, and they receive their bonuses, but the overall quality and performance of surgical operations do not, in fact, increase as a result. The metric has lost any reliability it once had as a measure of overall quality and performance.

It occurs to me that, as black-box deciders take over the job of assessing the performance of workers and deciding which of them should receive bonuses, the opacity of the decision systems may block this loss of reliability, by making it much more difficult, perhaps impossible, for the workers to game the system. If there is no explanation for the black-box decider's assessments, there is no way for the workers to infer that any particular tactic will change those assessments in their favor.

Of course, this also means that there is no way for managers to devise rational policies for improving the work of their staff. Because the black-box deciders are opaque and their judgements inexplicable and unaccountable, there is no way to distinguish policy changes that will have positive results (as assessed by the black-box decider) from those that will have negative results.

#black-box-deciders #metrics #opacity

This work is licensed under a Creative Commons Attribution-ShareAlike License.

John David Stone

created June 1, 2014 · last revised December 10, 2018