The Canadian province of Nova Scotia maintains a public database of government documents that have been released in response to freedom-of-information requests, and provides a Web interface to it. A nineteen-year-old Canadian student who was interested in learning about a labor dispute involving teachers in the province found some relevant files in that database but had difficulty searching for the ones he wanted. Since the Web pages for all of the documents had easily predictable URLs, he wrote a script to run through the URLs and download all of the documents, intending to go through them off line with better search tools.
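A script of the kind described is trivial to write. Here is a minimal sketch, assuming a hypothetical URL pattern with a sequential numeric ID (the real site's URL scheme is not reproduced here):

```python
# Hypothetical URL template; the actual Nova Scotia site's pattern differed.
BASE = "https://example.gov.example/foi/disclosure/view?id={}"

def document_urls(first_id, last_id):
    """Enumerate the predictable URLs for a range of document IDs."""
    return [BASE.format(doc_id) for doc_id in range(first_id, last_id + 1)]

# Downloading is then just a loop over the list, e.g. with urllib:
#   for url in document_urls(1, 7000):
#       data = urllib.request.urlopen(url).read()
#       # ...save to disk for off-line searching...
print(document_urls(1, 3))
```

That is the entirety of the technique: no passwords guessed, no access controls bypassed, just requesting a range of published pages one after another.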
It turns out that about two hundred fifty of the seven thousand documents in the database contained personally identifiable information that the provincial government had failed to remove before putting the documents on line.
Naturally, it's not the government that is in trouble as a result of this blunder. When the authorities discovered that the student had downloaded these published documents, they charged him with “unauthorized use of a computer.” He now faces up to ten years in prison.
He lives at home with his parents and younger siblings. The police staged a home invasion, tore up the house, confiscated the student's computers and gear, his father's work computers and cell phone, and his brother's computer, arrested his brother on the street, and detained and questioned his thirteen-year-old sister in a police car.
“Teen Charged in Nova Scotia Government Breach Says He Had ‘No Malicious Intent’”
Jack Julian, CBC News, April 16, 2018
If you use the Internet or interact regularly with people who do, companies like Google and Facebook have compiled dossiers on you, regardless of whether you have ever set up accounts with them or used their services.
“Facebook Is Tracking Me Even Though I'm Not on Facebook”
Daniel Kahn Gillmor, Free Future, American Civil Liberties Union, April 5, 2018
Nearly every Website you visit that has a “Like” button is actually encouraging your browser to tell Facebook about your browsing habits. Even if you don't click on the “Like” button, displaying it requires your browser to send a request to Facebook's servers for the “Like” button itself. That request includes information mentioning the name of the page you are visiting and any Facebook-specific cookies your browser might have collected. (See Facebook's own description of this process.) …
This makes it possible for Facebook to create a detailed picture of your browsing history — even if you've never even visited Facebook directly, let alone signed up for a Facebook account.
Think about most of the web pages you've visited — how many of them don't have a “Like” button? If you administer a website and you include a “Like” button on every page, you're helping Facebook to build profiles of your visitors, even those who have opted out of the social network. …
The profiles that Facebook builds on non-users don't necessarily include so-called “personally identifiable information” (PII) like names or email addresses. But they do include fairly unique patterns. Using Chromium's NetLog dumping, I performed a simple five-minute browsing test last week that included visits to various sites — but not Facebook. In that test, the PII-free data that was sent to Facebook included information about which news articles I was reading, my dietary preferences, and my hobbies.
Given the precision of this kind of mapping and targeting, “PII” isn't necessary to reveal my identity. How many vegans examine specifications for computer hardware from the ACLU's offices while reading about Cambridge Analytica? Anyway, if Facebook combined that information with the “web bug” from the email mentioned above — which is clearly linked to my name and e-mail address — no guesswork would be required.
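The mechanism Gillmor describes is ordinary HTTP behavior. The sketch below is illustrative only — the endpoint and cookie value are made up, though `datr` is a real Facebook browser-identifier cookie — and shows the two pieces of data that ride along when a browser fetches an embedded widget from a third-party server:

```python
def like_button_request(page_url, cookies):
    """Build the headers a browser would attach when fetching an embedded
    third-party widget (such as a "Like" button) from the widget's server."""
    return {
        # The Referer header names the page you are reading, so the third
        # party learns your browsing habits without you clicking anything.
        "Referer": page_url,
        # Any cookies previously set for the widget's domain ride along,
        # tying this page view to every earlier one from the same browser.
        "Cookie": "; ".join(f"{k}={v}" for k, v in cookies.items()),
    }

headers = like_button_request(
    "https://example.com/articles/vegan-recipes",  # hypothetical page
    {"datr": "abc123"},  # "datr": Facebook's browser-identifier cookie
)
print(headers["Referer"])
```

No “personally identifiable information” appears in either header, yet the sequence of Referer values alone is often enough to single out one person, which is exactly the point of Gillmor's five-minute experiment.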
“Arizona's Anti-BDS Statute Lands Arizona State University in Federal Court”
Adam Steinbaugh, Foundation for Individual Rights in Education, March 12, 2018
Earlier this month, the Council on American-Islamic Relations filed a lawsuit against Arizona State University on behalf of Hatem Bazian, a Berkeley lecturer and chair of American Muslims for Palestine, who was invited to speak at ASU by the university's Muslim Students Association. The agreement provided to him by ASU contained a provision — required by Arizona state law — demanding that he affirm he will not boycott Israel. Bazian's planned presentation concerned the “Boycott, Divestment, and Sanctions” (BDS) movement targeting Israel.
Arizona's statute prohibits any “public entity” from entering into any “contract with a company to acquire or dispose of services … unless the contract includes a written certification that the company is not currently engaged in, and agrees for the duration of the contract to not engage in, a boycott of Israel.” The statute broadly defines “boycott,” in turn, to include not simply refusing to engage in business, but undertaking “other actions that are intended to limit commercial relations with Israel.”
The article reproduces the offending contract and includes some plausible legal argumentation explaining why the legislative language “contract with a company to acquire or dispose of services” really does apply to a contract with an individual to give a talk.
Arizona is only one of twenty-four states, including Iowa, that have passed ridiculous and patently unconstitutional legislation of this kind. Pulling this crap is going to get us into several different kinds of trouble:
“State Anti-BDS Laws Are Hitting Unintended Targets and Nobody's Happy”
Ron Kampeas, The Times of Israel, October 24, 2017
Software tools for searching immense quantities of surveillance data are increasingly relying on black-box deciders to extract and summarize search results.
“Artificial Intelligence Is Going to Supercharge Surveillance”
James Vincent, The Verge, January 23, 2018
For experts in surveillance and AI, the introduction of these sorts of capabilities is fraught with potential difficulties, both technical and ethical. And, as is often the case in AI, these two categories are intertwined. It's a technical problem that machines can't understand the world as well as humans do, but it becomes an ethical one when we assume they can and let them make decisions for us. …
Even if we manage to fix the biases in these automated systems, that doesn't make them benign, says ACLU policy analyst Jay Stanley. He says that changing CCTV cameras from passive into active observers could have a huge chilling effect on civil society.
“We want people to not just be free, but to feel free. And that means that they don't have to worry about how an unknown, unseen audience may be interpreting or misinterpreting their every movement and utterance,” says Stanley. “The concern is that people will begin to monitor themselves constantly, worrying that everything they do will be misinterpreted and bring down negative consequences on their life.”