
A Bredlik Embedded in a Sonnet


“Poem: I Lik the Form”
O. Westin, Micro SF/F, March 21, 2018

#pushing-the-envelope #bredlik #humor

Senate Fails to Invoke War Powers Act to Stop War in Yemen


Forty-five Republicans and ten Democrats. *Sigh.* The Democrats were Coons of Delaware, Cortez Masto of Nevada, Donnelly of Indiana, Heitkamp of North Dakota, Jones of Alabama (*sigh*), Manchin of West Virginia, Menendez of New Jersey, Nelson of Florida, Reed of Rhode Island, and Whitehouse of Rhode Island.

“15 Years after the Invasion of Iraq, Here Are the Dems Who Just Voted for Endless War in Yemen”
Sarah Lazare, In These Times, March 20, 2018

#War-Powers-Act #Senate #Yemen

The CLOUD Act Is About to Become Law


The House of Representatives has now passed, and the Senate is on the verge of passing, the Clarifying Lawful Overseas Use of Data Act, institutionalizing and giving notional legal cover to warrantless surveillance programs, both inside the United States and in other countries, both by American national-security and law-enforcement agencies and, if the governments agree, by their counterparts in dozens of other countries. It explicitly grants such agencies access to “the contents of a wire or electronic communication and any record or any other information” about a target of investigation.

Congress is working this week on a massive budget bill. The CLOUD Act was embedded in the House version of that bill so as to ensure its passage. Microsoft, Facebook, Google, and Apple are on record as supporting it, apparently because it would save them the cost of repeatedly litigating government demands for their users' information. Under the CLOUD Act, the grounds for such litigation would be removed, and those companies could simply yield up that information as soon as the government(s) requested it.

A coalition of advocates for privacy, civil liberties, and human rights, headed by the American Civil Liberties Union, is opposing the bill, but is unlikely to be able to block it.

“S.2383 – CLOUD Act”
Library of Congress, February 6, 2018

“H.R.4943 – CLOUD Act”
Library of Congress, February 6, 2018

“Tech Companies' Letter of Support for Senate CLOUD Act”
Apple, Facebook, Google, Microsoft, and Oath, Data Law, February 6, 2018

“CLOUD Act Coalition Letter”
CLOUD Act Coalition, American Civil Liberties Union, March 12, 2018

“A New Backdoor around the Fourth Amendment: The CLOUD Act”
David Ruiz, Deeplinks, Electronic Frontier Foundation, March 13, 2018

The CLOUD Act allows the president to enter an executive agreement with a foreign nation known for human rights abuses. Using its CLOUD Act powers, police from that nation inevitably will collect Americans' communications. They can share the content of those communications with the U.S. government under the flawed “significant harm” test. The U.S. government can use that content against these Americans. A judge need not approve the data collection before it is carried out. At no point need probable cause be shown. At no point need a search warrant be obtained.

This is wrong. … The backdoor proposed in the CLOUD Act violates our Fourth Amendment right to privacy by granting unconstitutional access to our private lives online.

“Congress Could Sneak a Bill Threatening Global Privacy into Law”
Rhett Jones, Gizmodo, March 15, 2018

“House Staples Extraterritorial Search Permissions onto 2,232-Page Budget Bill; Passes It”
Tim Cushing, Techdirt, March 22, 2018

#Clarifying-Overseas-Use-of-Data-Act #law-enforcement #Fourth-Amendment

Personal Data Collected Through Facebook Leaks to Exploiters


Surveillance is essential to Facebook's business model. It collects and compiles enormous amounts of personal data on its users (and non-users), and it sells to its customers — advertisers, academics, political operatives, and others — the privilege of creating applications that collect and compile still more personal data.

In theory, Facebook doesn't actually sell its dossiers to its customers. It only licenses the data, or the right to collect data, retaining control over any further dissemination so as to maintain its ownership of its most valuable intellectual property. In practice, Facebook has no effective means of preventing its customers from copying and distributing any data they have legitimately obtained. The licenses that it relies on turn out to be quite difficult to enforce.

In 2014, a senior research associate at Cambridge University, Aleksandr Kogan, wrote a Facebook app called “thisisyourdigitallife.” Superficially, it was a personality quiz, but the people who signed up to take it gave Kogan permission to access their Facebook profiles and the Facebook profiles of the people they had friended. Facebook approved this arrangement but stipulated that the data that Kogan collected be used solely for the purpose of academic research.

Kogan agreed to this stipulation and proceeded to collect millions of Facebook profiles through the app. Instead of mining the data at Cambridge, however, he set up a company called Global Science Research and carried out his supposedly academic research there. Global Science Research had a million-dollar contract with another company, SCL Group. One of SCL's subsidiaries, SCL Elections, had recently secured funding to set up a new corporation, Cambridge Analytica, to explore the use of data-mining techniques to find reliable correlations between the personalities and “likes” of individual Facebook users on one hand and their political views and behaviors on the other. Because Kogan's research was funded, at least in part, by Cambridge Analytica, he apparently saw nothing wrong with sharing with his employers the data on which his research was based.

It's quite possible that sharing this data with a commercial enterprise violated Kogan's understanding with Facebook. It may also be a violation of UK data-protection laws, because Kogan asked the people who used his app only for their permission to collect and study their personal data, not for permission to share it with (or sell it to) third parties.

However, the only thing that prevented Cambridge Analytica from obtaining the same data directly from Facebook is that the license would probably have cost them much more money. Nothing in Facebook's notoriously lax, mutable, and labyrinthine privacy policies would have obstructed such a transaction if the price were right. Facebook's dossiers are its principal product, and selling access to them is its principal source of revenue.

Facebook now claims that Kogan and Cambridge Analytica have violated its terms of service and has closed their Facebook accounts. Lawsuits and threats of lawsuits are now flying in all directions, and some members of Congress are threatening to launch terrifying inquisitions into the monstrous abuse of the American electoral process that Cambridge Analytica supposedly perpetrated with the assistance of Kogan's data. However, there are now so many unlicensed copies of the data that there is no way to ensure that all of them will ever be erased, or even located. Now that arbitrarily large amounts of data can be copied quickly and inexpensively, and now that multiple backups of valuable data are the norm, the idea of restricting the distribution of data through licensing is a non-starter. It can't possibly work.

There's another reason why the lawsuits and the fulminations of members of Congress are idle, from the point of view of ordinary Facebook users (and non-users): Surveillance is essential to Facebook's business model. If Facebook stopped collecting and compiling personal data and erased its current stores, it would quickly go bankrupt. But once the dossiers exist, it is inevitable that they will be copied and disseminated, and once they are copied and disseminated, it is impossible ever to recover and destroy all of the copies, data-protection and privacy laws notwithstanding.

Instead (as Mark Zuckerberg's Facebook post on this subject makes clear), Facebook will continue to build up massive dossiers as fast as it can and will continue to use the information in those dossiers as it sees fit. The steps that Zuckerberg describes as “protecting users' data” are all designed to protect Facebook's proprietary interest in everyone's personal data, to prevent or at least obstruct the propagation of the dossiers to unworthy outsiders.

“Suspending Cambridge Analytica and SCL Group from Facebook”
Paul Grewal, Facebook Newsroom, March 16, 2018

“How Trump Consultants Exploited the Facebook Data of Millions”
Matthew Rosenberg, Nicholas Confessore, and Carole Cadwalladr, The New York Times, March 17, 2018

“Cambridge Analytica Responds to Facebook Announcement”
Cambridge Analytica, March 17, 2018

“‘I Made Steve Bannon's Psychological Warfare Tool’: Meet the Data War Whistleblower”
Carole Cadwalladr, The Guardian, March 18, 2018

“Cambridge Analytica's Ad Targeting Is the Reason Facebook Exists”
Jason Koebler, Motherboard, March 19, 2018

Though Cambridge Analytica's specific use of user data to help a political campaign is something we haven't publicly seen on this scale before, it is exactly the type of use that Facebook's platform is designed for, has facilitated for years, and continues to facilitate every day. At its core, Facebook is an advertising platform that makes almost all of its money because it and the companies that use its platform know so much about you.

Facebook continues to be a financially successful company precisely because its platform has enabled the types of person-specific targeting that Cambridge Analytica did. …

“The incentive is to extract every iota of value out of users,” Hartzog [Woodrow Hartzog, Professor of Law and Computer Science at Northeastern University] said. “The service is built around those incentives. You have to convince people to share as much information as possible so you click on as many ads as possible and then feel good about doing it. This is the operating ethos for the entire social internet.”

“Facebook's Surveillance Machine”
Zeynep Tufekci, The New York Times, March 19, 2018

Billions of dollars are being made at the expense of our public sphere and our politics, and crucial decisions are being made unilaterally, and without recourse or accountability.

“Then Why Is Anyone Still on Facebook?”
Wolf Richter, Wolf Street, March 20, 2018

So now there's a hue and cry in the media about Facebook, put together by reporters who are still active on Facebook and who have no intention of quitting Facebook. There has been no panicked rush to “delete” accounts. There has been no massive movement to quit Facebook forever. Facebook does what it does because it does it, and because it's so powerful that it can do it. A whole ecosystem around it depends on the consumer data it collects. …

Yes, there will be the usual ceremonies … CEO Zuckerberg may get to address the Judiciary Committee in Congress. The questions thrown at him for public consumption will be pointed. But behind the scenes, away from the cameras, there will be the usual backslapping between lawmakers and corporations. Publicly, there will be some wrist-slapping and some lawsuits, and all this will be settled and squared away in due time. Life will go on. Facebook will continue to collect the data because consumers continue to surrender their data to Facebook voluntarily. And third parties will continue to have access to this data. …

People who are still active on Facebook cannot be helped. They should just enjoy the benefits of having their lives exposed to the world and serving as a worthy tool and resource for corporate interests, political shenanigans, election manipulators, jealous exes, and other facts of life.

“Facebook Sued by Investors over Voter-Profile Harvesting”
Christie Smythe and Kartikay Mehrotra, Bloomberg Technology, March 20, 2018

“The Researcher Who Gave Cambridge Analytica Facebook Data on 50 Million Americans Thought It Was ‘Totally Normal’”
Kaleigh Rogers, Motherboard, March 21, 2018

Kogan said he was under the impression that what he was doing was completely normal.

“What was communicated to me strongly was that thousands and maybe tens of thousands of apps were doing the exact same thing and that this was a pretty normal use case and a normal situation for usage of Facebook data,” Kogan said.

“Facebook's Mark Zuckerberg Vows to Bolster Privacy amid Cambridge Analytica Crisis”
Sheera Frenkel and Kevin Roose, The New York Times, March 21, 2018

#Facebook #Cambridge-Analytica #data-mining #data-sharing

Using Facebook Slightly Less Dangerously


We're starting to see a new genre of advice column: instructions on how to use some piece of modern technology safely, given of course that it's impossible to really use it safely. The users' understanding of how it works, and of what they want to accomplish with it, is flatly incompatible with its design and with the business model of its maker and licensor.

The journalists who write in this genre are people who know better than to try to use the technology, but use it anyway because their jobs require it and because they know that their readers are going to use it as well, even those who also know better than to try.

“The Motherboard Guide to Using Facebook Safely”
Lorenzo Franceschi-Bicchierai, Motherboard, March 21, 2018

You can't really stop all collection. In fact, even if you leave Facebook (or have never been part of the social network), the company is still gathering data on you and building a shadow profile in case you ever join. …

Facebook's entire existence is predicated on tracking and collecting information about you. If that concept makes you feel creeped out, then perhaps you should quit it. But if you are willing to trade that off for using a free service to connect with friends, there's still some steps you can take to limit your exposure.

#facebook #surveillance #technological-skepticism

Invisible Adversarial Masks


It is possible to fool face-recognition (FR) systems into misidentifying one person A as some specified other person B by projecting a pattern of infrared light onto A's face when the recognizer's camera photographs it, creating a customized adversarial example. Since light in the near infrared can be detected by surveillance cameras but not by human eyes, other people cannot detect the masquerade, even at close range. To project the light patterns, the researchers had person A wear a baseball cap with tiny infrared LEDs tucked up under the bill.
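
The attack rests on the familiar machinery of adversarial examples: a small, carefully chosen perturbation that pushes the recognizer's output toward a chosen target identity. The sketch below is a rough, hypothetical illustration of that underlying idea only, not the authors' infrared-projection method; FaceNetStub is an invented stand-in for a real face-recognition model, and the physical constraint that the perturbation be realizable with cap-mounted infrared LEDs is ignored.

```python
# Rough sketch of a targeted adversarial perturbation, not the paper's method.
# FaceNetStub is a hypothetical stand-in for a real face-recognition network.
import torch
import torch.nn as nn


class FaceNetStub(nn.Module):
    """Hypothetical face classifier: 112x112 RGB image -> logits over known identities."""
    def __init__(self, num_identities: int = 10):
        super().__init__()
        self.net = nn.Sequential(nn.Flatten(), nn.Linear(3 * 112 * 112, num_identities))

    def forward(self, x):
        return self.net(x)


def targeted_perturbation(model, image, target_id, steps=40, step_size=0.01, budget=0.05):
    """Nudge `image` within a small budget so the model labels it as `target_id`."""
    delta = torch.zeros_like(image, requires_grad=True)
    loss_fn = nn.CrossEntropyLoss()
    target = torch.tensor([target_id])
    for _ in range(steps):
        loss = loss_fn(model(image + delta), target)
        loss.backward()
        with torch.no_grad():
            delta -= step_size * delta.grad.sign()  # step toward the target identity
            delta.clamp_(-budget, budget)           # keep the change inconspicuous
        delta.grad.zero_()
    return (image + delta).detach()


if __name__ == "__main__":
    model = FaceNetStub()
    photo_of_A = torch.rand(1, 3, 112, 112)  # placeholder for a photo of person A
    adversarial = targeted_perturbation(model, photo_of_A, target_id=3)
    print(model(adversarial).argmax(dim=1))  # the FR system's (mis)identification
```

The paper's contribution is to restrict such a perturbation to light patterns that a few hidden LEDs can actually produce, which is what makes the attack invisible to bystanders.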

“Invisible Mask: Practical Attacks on Face Recognition with Infrared”
Zhe Zhou, Di Tang, Xiaofeng Wang, Weili Han, Xiangyu Liu, and Kehuan Zhang, arXiv, March 13, 2018

In this paper, we present the first approach that makes it possible to apply [an] automatically-identified, unique adversarial example to [a] human face in an inconspicuous way [that is] completely invisible to human eyes. As a result, the adversary masquerading as someone else will be able to walk on the street, without any noticeable anomaly to other individuals[,] but appearing to be a completely different person to the FR system behind surveillance cameras.

#adversarial-examples #face-recognition #impersonation

Risks of Singleton Technologies


Dan Geer explains the social, political, and security risks of programmatically displacing manual processes and alternative algorithmic designs with interdependent, standardized, or centralized technologies. Such monoliths may or may not be fragile, but their only failure modes are catastrophic.

“A Rubicon”
Daniel E. Geer, Jr., Hoover Institution, February 7, 2018

If an algorithm cannot be verified then do not trust it.

To be precise, algorithms derived from machine learning must never be trusted unless the “Why?” of decisions those algorithms make can be usefully examined on demand. This dictum of “interrogatability” may or may not be effectively design-assured while there is still time to do so — that is, to do so pre-dependence. Once the chance to design-assure interrogatability is lost — that is to say once dependence on a non-interrogatable algorithm is consummated — going back to non-self-modifying algorithms will prove to be costly, if even possible. …

The central thesis of this essay is that an accessible, continuously exercised analog option is essential to the national security and to the inclusionary polity we hold dear. …

As a matter of national security, keeping non-technical exits open requires action and it requires it now. It will not happen by itself, and it will never again be as cheap or feasible as it is now. Never again will national security and individual freedom jointly share a call for the same initiative at the same time. In a former age, Dostoevsky told us, “The degree of civilization in a society can be judged by entering its prisons.” From this point on, that judgement will be passed on how well we preserve a full life for those opting out of digitalization. There is no higher embodiment of national security than that.

#opacity #technological-skepticism #dystopia

Campus Security Offices Monitor Social Media


It is now becoming commonplace for security offices at colleges and universities to monitor the social-media accounts of members of the campus community for potential threats, crimes, and miscellaneous troublemaking. Sometimes they outsource the work to specialist companies (such as Social Sentinel, which curates and customizes a list of several thousand words whose appearance in posts can trigger investigations).

“Big Brother: College Edition”
Jeremy Bauer-Wolf, Inside Higher Ed, December 21, 2017

“Social Media Monitoring: Beneficial or Big Brother?”
Amy Rock, Campus Safety Magazine, March 12, 2018

“University Police Surveil Student Social Media in Attempt to Make Campus Safer”
Ryne Weiss, Foundation for Individual Rights in Education, March 16, 2018

Put yourself in the shoes of a student on campus. What would you do if you're aware that anything you post may be flagged by the school administration or police for containing one of the keywords in Social Sentinel's library of harm? Do you make the decision to tweet less? Do you restrict your posts to friends only? It seems hard to imagine how you could moderate your tweets to avoid thousands of words when you have no idea what they are.

And assume you do get flagged and questioned by police. Many people would probably change their behavior. And while people might want to be mindful of what they post publicly online, fear of police and their school monitoring them and misinterpreting their messages shouldn't be something students have to navigate. …

The free exchange of ideas on campus is an invaluable and irreplaceable part of the ideal college experience, and the chilling effect of student social media surveillance actively undermines that.

#surveillance #freedom-of-speech #social-media

Inadequate Record-Keeping in Machine-Learning Research


It appears that many researchers in machine learning, including some who profess to be scientists, are not keeping proper records of their experiments. Even with the assistance of version-control systems, they often fail to write down which versions of code libraries they are using, where their data sets come from and what they contain, how they massaged and cleaned their data sets, and what tweaks they made to their algorithms and to the configuration and initialization of their networks.

They redesign their experiments on the fly, interrupt and restart them, cherry-pick results from various runs, and reuse partially trained neural networks as starting points for subsequent experiments without properly documenting the process.

As a result, machine learning as a discipline is now facing a devastating crisis: researchers cannot reproduce one another's experiments, or even their own, and so cannot confirm their results.
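
Much of the missing bookkeeping is cheap to automate. The following is a hypothetical sketch, not any standard tool or format, of a per-run manifest that records the items listed above: library versions, the provenance of the data (here, by file hash), the random seed, the hyperparameters, and the code revision.

```python
# Hypothetical per-run record-keeping sketch; file name and fields are illustrative.
import hashlib
import json
import platform
import subprocess
import sys
from datetime import datetime, timezone
from importlib import metadata


def sha256_of(path: str) -> str:
    """Fingerprint the data file so a later run can confirm it is unchanged."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()


def git_commit() -> str:
    """Record which revision of the experiment code produced this run, if any."""
    try:
        return subprocess.check_output(["git", "rev-parse", "HEAD"], text=True).strip()
    except Exception:
        return "unknown"


def library_version(pkg: str) -> str:
    try:
        return metadata.version(pkg)
    except metadata.PackageNotFoundError:
        return "not installed"


def write_manifest(data_path: str, seed: int, hyperparams: dict,
                   out_path: str = "run_manifest.json") -> None:
    manifest = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "python": sys.version,
        "platform": platform.platform(),
        "code_commit": git_commit(),
        "data_file": data_path,
        "data_sha256": sha256_of(data_path),
        "seed": seed,
        "hyperparameters": hyperparams,
        # whichever libraries the experiment actually imports
        "library_versions": {pkg: library_version(pkg) for pkg in ("numpy", "torch")},
    }
    with open(out_path, "w") as f:
        json.dump(manifest, f, indent=2)
```

Comparing the manifest of a new run against an old one then shows immediately what changed: a different data hash, a bumped library version, or a silently altered hyperparameter.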

“The Machine Learning Reproducibility Crisis”
Pete Warden, Pete Warden's Blog, March 19, 2018

In many real-world cases, the researcher won't have made notes or remember exactly what she did, so even she won't be able to reproduce the model. Even if she can, the frameworks the model code depend[s] on can change over time, sometimes radically, so she'd need to also snapshot the whole system she was using to ensure that things work. I've found ML researchers to be incredibly generous with their time when I've contacted them for help reproducing model results, but it's often [a] months-long task even with assistance from the original author.

#machine-learning #reproducibility #scientific-method

Cryptographic Crumpling Meets No One's Requirements


A research team at Boston University has discovered a technique for partially encrypting messages so as to make decryption extremely expensive, but not impossible. They call their partial-encryption system “cryptographic crumpling.” The computation required to decrypt a message prepared by this technique is useless in the decryption of any other message, so there are no economies of scale — the decryptor must pay the high computational price all over again for each new message.
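
The researchers' actual construction is not detailed here. Purely as a toy illustration of the property just described, the sketch below gives each message its own random key and publishes that key with a few bits withheld, so that reading any one message requires a fresh brute-force search over the withheld bits, and nothing learned in the process helps with the next message. All names and the bit count are invented for the example.

```python
# Toy illustration only -- NOT the Boston University construction. It shows the
# property described above: each message costs a fresh brute-force effort to
# decrypt, and work spent on one message is useless for any other.
import hashlib
import os

CRUMPLE_BITS = 20  # invented difficulty knob; real proposals aim at far higher costs


def _keystream(key: bytes, length: int) -> bytes:
    """Derive a pseudorandom keystream of the requested length from the key."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]


def crumple_encrypt(plaintext: bytes) -> tuple[bytes, int]:
    """Encrypt under a fresh random key, then publish the key with bits withheld."""
    full_key = int.from_bytes(os.urandom(16), "big")
    stream = _keystream(full_key.to_bytes(16, "big"), len(plaintext))
    ciphertext = bytes(a ^ b for a, b in zip(plaintext, stream))
    crumpled_key = full_key & ~((1 << CRUMPLE_BITS) - 1)  # zero out the withheld bits
    return ciphertext, crumpled_key


def expensive_decrypt(ciphertext: bytes, crumpled_key: int, known_prefix: bytes) -> bytes:
    """Brute-force the withheld bits; the full cost recurs for every message."""
    for guess in range(1 << CRUMPLE_BITS):
        key = (crumpled_key | guess).to_bytes(16, "big")
        stream = _keystream(key, len(ciphertext))
        candidate = bytes(a ^ b for a, b in zip(ciphertext, stream))
        if candidate.startswith(known_prefix):
            return candidate
    raise ValueError("no key in range reproduced the expected prefix")


if __name__ == "__main__":
    ct, partial_key = crumple_encrypt(b"MSG: meet at noon")
    print(expensive_decrypt(ct, partial_key, b"MSG:"))
```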

The researchers offer their system as a way of resolving the “second crypto wars” between government officials, who insist that the makers of all commercial-grade encryption software must provide back doors for law-enforcement and national-security agencies, and privacy advocates, who insist that only strong, end-to-end encryption will protect their rights. The researchers argue that their system would allow well-funded government agencies to access the partially encrypted data in exceptional cases, but would force those agencies to choose their targets so carefully that the privacy rights of ordinary users would not be significantly affected.

But this proposal doesn't really accommodate either side. Government officials say they need to decrypt messages that could contain evidence of crime or terrorism regardless of how many such messages there are, and so would not be content with a system in which their budget constrains their ability to decrypt. And privacy advocates would surely note that if government officials with legitimate interests in the contents of communications were able to perform the decryptions, so too would corporations bent on industrial espionage, hostile foreign governments, and even well-funded hacking teams. A back door works equally well for everyone who has the resources to open it.

Such failed attempts at compromise reinforce the conclusion (which most security analysts reached long ago) that the requirements of government officials and privacy advocates are incompatible.

“Cryptographic Crumpling: The Encryption ‘Middle Ground’ for Government Surveillance”
Charlie Osborne, Zero Day, March 19, 2018

#crypto-wars #backdoors #privacy


This work is licensed under a Creative Commons Attribution-ShareAlike License.


John David Stone

created June 1, 2014 · last revised March 23, 2018