For decades now, lazy programmers have relied on ever-faster processors and ever-larger memories to avoid learning about subtle but efficient algorithms and fast data structures with intricate invariants. Now the jig is up.
“Death Notice: Moore's Law. 19 April 1965 — 2 January 2018”
Mark Pesce, The Register, January 24, 2018
The computer science behind microprocessor design has therefore found itself making a rapid U-turn as it learns that its optimization techniques can be weaponized. The huge costs of Meltdown and Spectre — which no one can even guess at today — will make chip designers much more conservative in their performance innovations, as they pause to wonder if every one of those innovations could, at some future point, lead to the kind of chaos that has engulfed us all over the last weeks.
One thing has already become clear: in the short term, performance will go backwards. The steady … improvements every software engineer could rely on to make messy code performant can no longer be guaranteed. …
Going forward, the game changes from “cheaper and faster” to “sleeker and wiser.” Software optimizations — despite their Spectre-like risks — will take the lead over the next decades. …
From here on in, we're going to have to work for it.
A research team at Michigan State University analyzed documents relating to the spending of the Department of Defense and the Department of Housing and Urban Development from 1998 to 2015 and concluded that those two departments alone had somehow managed to spend $21,000,000,000,000 more than Congress authorized. In 2015, Congress provided the United States Army with a budget of $122,000,000,000, which the Army overspent by a factor of 54: They had $6,500,000,000,000 in “unsupported adjustments”.
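The "factor of 54" can be checked against the two figures quoted above (a quick sanity check of the arithmetic, not a reproduction of the MSU team's methodology):

```python
# Sanity check of the figures quoted in the press accounts.
army_budget = 122_000_000_000    # FY2015 Congressional appropriation for the Army
unsupported = 6_500_000_000_000  # reported "unsupported adjustments"

ratio = unsupported / army_budget
print(f"Unsupported adjustments are about {ratio:.1f} times the budget")
# roughly 53, consistent (after rounding) with the "factor of 54" in press accounts
```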
“MSU Scholars Find $21 Trillion in Unauthorized Government Spending; Defense Department to Conduct First-Ever Audit”
MSU Today, December 11, 2017
Perhaps it's not so surprising that Congress finds it difficult to take seriously its Constitutional duty to pass a budget every year.
Software tools for searching immense quantities of surveillance data are increasingly relying on black-box deciders to extract and summarize search results.
“Artificial Intelligence Is Going to Supercharge Surveillance”
James Vincent, The Verge, January 23, 2018
For experts in surveillance and AI, the introduction of these sorts of capabilities is fraught with potential difficulties, both technical and ethical. And, as is often the case in AI, these two categories are intertwined. It's a technical problem that machines can't understand the world as well as humans do, but it becomes an ethical one when we assume they can and let them make decisions for us. …
Even if we manage to fix the biases in these automated systems, that doesn't make them benign, says ACLU policy analyst Jay Stanley. He says that changing CCTV cameras from passive into active observers could have a huge chilling effect on civil society.
“We want people to not just be free, but to feel free. And that means that they don't have to worry about how an unknown, unseen audience may be interpreting or misinterpreting their every movement and utterance,” says Stanley. “The concern is that people will begin to monitor themselves constantly, worrying that everything they do will be misinterpreted and bring down negative consequences on their life.”
An overview of technology patents for which Facebook has applied, with many imaginative ways for the company to add to your dossier and fill in details of your social graph:
“Facebook Knows How to Track You Using the Dust on Your Camera Lens”
Kashmir Hill and Surya Mattu, Gizmodo, January 11, 2018
Facebook claims that they aren't currently using this tactic. Uh huh.
One filed in 2015 describes a technique that would connect two people through the camera metadata associated with the photos they uploaded. It might assume two people knew each other if the images they uploaded looked like they were titled in the same series of photos — IMG_4605739.jpg and IMG_4605742.jpg, for example — or if lens scratches or dust were detectable in the same spots on the photos, revealing the photos were taken by the same camera.
It would result in all the people you've sent photos to, who then uploaded them to Facebook, showing up in one another's “People You May Know.” It'd be a great way to meet the other people who hired your wedding photographer.
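The filename half of the patent's heuristic could be sketched roughly as follows. This is purely illustrative — the function names, the regular expression, and the proximity threshold are my own inventions, not anything disclosed in Facebook's filing:

```python
import re

def camera_serial(filename):
    """Extract the sequence number from a camera-style filename
    such as IMG_4605739.jpg; return None if it doesn't match."""
    m = re.match(r"IMG_(\d+)\.jpe?g$", filename, re.IGNORECASE)
    return int(m.group(1)) if m else None

def likely_same_camera(file_a, file_b, max_gap=100):
    """Guess that two uploads came from the same camera if their
    sequence numbers are close together (threshold is arbitrary)."""
    a, b = camera_serial(file_a), camera_serial(file_b)
    return a is not None and b is not None and abs(a - b) <= max_gap

print(likely_same_camera("IMG_4605739.jpg", "IMG_4605742.jpg"))  # True
print(likely_same_camera("IMG_4605739.jpg", "IMG_9999999.jpg"))  # False
```

The lens-dust half of the technique would be far more involved (fingerprinting sensor artifacts across images), but the social inference at the end is the same: cluster uploads by inferred camera, then link the uploaders.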
You can't make this stuff up.
“NSA Deletes ‘Honesty’ and ‘Openness’ from Core Values”
Jean Marc Manach, The Intercept, January 24, 2018
The National Security Agency maintains a page on its website that outlines its mission statement. But earlier this month, the agency made a discreet change: It removed “honesty” as its top priority.
Since at least May 2016, the surveillance agency had featured honesty as the first of four “core values” listed on NSA.gov, alongside “respect for the law,” “integrity,” and “transparency.” The agency vowed on the site to “be truthful with each other.”
On January 12, however, the NSA removed the mission statement page — which can still be viewed through the Internet Archive — and replaced it with a new version. Now, the parts about honesty and the pledge to be truthful have been deleted. The agency's new top value is “commitment to service,” which it says means “excellence in pursuit of our critical mission.” …
In its old core values, the NSA explained that it would strive to be deserving of the “great trust” placed in it by national leaders and American citizens. It said that it would “honor the public's need for openness.” But those phrases are now gone; all references to “trust,” “honor,” and “openness” have disappeared.
“India Will Install Cameras in Classrooms amid a Rise of Surveillance Measures in Asia”
Rosie Perper, Business Insider, January 21, 2018
A polling organization that probably knows better has reported the results of a recent survey of college students to find out how they feel about finding a job after graduation. The head pollster for the project concluded that “students are not nearly as prepared as they could or should be, and they actually know it while they're in college.” The survey asked “more than 32,500 students from 43 randomly selected four-year institutions, both public and private” to express their level of agreement (on a five-position Likert scale) with a few canned assertions, such as “I am confident that I will graduate with the knowledge and skills I need to be successful in the job market.”
“Unprepared and Confused”
Jeremy Bauer-Wolf, Inside Higher Ed, January 17, 2018
Because the pollsters did not investigate anyone's actual success in the job market or anyone's actual preparedness to enter the job market, the results of the survey contribute nothing whatever to anyone's understanding of those issues.
Because the pollsters did not actually talk with any students about the variety or intensity of their sentiments with regard to entering the job market, but only asked them to assent to or dissent from words placed in their mouths by the pollsters, they didn't learn anything about that either.
Because they did not establish a meaningful scale for assent or dissent and did not provide the victims of the survey with a common understanding of the meanings of the numerals associated with the “points” of the so-called scale that they did use, they did not even learn anything significant about the students' attitudes towards the assertions presented.
The survey was a total waste of time, money, and effort, and no one should take the published factoids seriously.
Nonetheless, the report has occasioned some wringing of hands and gnashing of teeth among faculty members who are eager to impose a different spin:
“A Different Look at That Gallup Survey on Student Preparation”
John Warner, Just Visiting, Inside Higher Ed, January 18, 2018
Between graduate school and returning to teaching, I spent some time working for a marketing research company, which included designing and interpreting surveys and one thing I learned is there's a lot of different ways to slice data.
*Sigh.* Apparently spending time in a marketing research company is a good way to lose track of the difference between facts, on one hand, and on the other hand things that are mocked up to look like facts, but aren't. The whole reason for making empirical observations and recording the observed facts accurately and impartially is that they may enable you to decide which of two or more incompatible hypotheses is correct. If you can “slice data” in various ways, so as to support any hypothesis you like, then the so-called data are useless and you might just as well express your opinions about student preparation for the job market without trying to give them an empirical foundation at all.
It also intrigued me that Warner (who, as he notes, received a Bachelor of Arts degree in rhetoric) uses the words “Far be it from me to dispute Gallup's own interpretations of their data” to introduce the passage in which he explicitly and specifically controverts Gallup's own interpretations of their data. Far be it, indeed.
“The Senate Just Voted to Expand the Warrantless Surveillance of US Citizens”
Daniel Oberhaus, Motherboard, January 18, 2018
On Thursday afternoon, the US Senate voted in favor of the FISA Amendments Reauthorization Act of 2017, a bill that will expand the warrantless surveillance of US citizens. The bill passed by a vote of 65–34, with 43 Republicans and 21 Democrats voting in its favor.
The bill will now go to the White House to be signed into law by President Trump. It reauthorizes FISA Section 702 until 2024.
Among the prominent Democratic senators voting in favor of this patently unconstitutional bill were Tammy Duckworth, Dianne Feinstein, Tim Kaine, Amy Klobuchar, Chuck Schumer, Jeanne Shaheen, and Debbie Stabenow. Nice work, fools.
“Congress Demanded NSA Spying Reform. Instead, They Let You Down”
Zack Whittaker, Zero Day, January 18, 2018
The Electronic Frontier Foundation responded by renewing their determination to pursue lawsuits against warrantless surveillance of Americans notionally justified by Section 702 of FISA and to promote and support the development of strong encryption and other protocols and tools to ensure the privacy of documents and communications.
“An Open Letter to Our Community on Congress's Vote to Extend NSA Spying from EFF Executive Director Cindy Cohn”
Cindy Cohn, Deeplinks, Electronic Frontier Foundation, January 18, 2018
We offer this response to the National Security Agency and its allies in Congress: enjoy it while you can because it won't last.
Today's Congressional failure redoubles our commitment to seek justice through the courts and through the development and spread of technology that protects our privacy and security. …
We aim to bring mass surveillance to the Supreme Court. By showcasing the unconstitutionality of the NSA's collect-it-all approach to tapping the Internet, we'll seek to end the dragnet surveillance of millions of innocent people. We know that the wheels of justice turn slowly, especially when it comes to impact litigation against the NSA, but we're in this for the long run.
After a quarter century of dominance, the educationistic fad of assessment seems finally to be running out of steam.
“An Insider's Take on Assessment: It May Be Worse Than You Thought”
Erik Gilbert, The Chronicle of Higher Education, January 12, 2018
Because it's fairly obvious that assessment has not caused (and probably will not cause) positive changes in student learning, and because it's clear that this has been an open secret for a while, one wonders why academic administrators have been so acquiescent about assessment for so long.
Here's why: It's no accident that the rise of learning-outcomes assessment has coincided with a significant expansion in the use of adjunct faculty, the growth of dual enrollment, and the spread of online education. Each of these allows administrators to deliver educational product to their customers with little or no involvement from the traditional faculty. If they are challenged on the quality of these programs, they can always point out that assessment results indicate that the customers are learning just as much as the students in traditional courses.
Gilbert's explanation is somewhat plausible for large research universities, but the push for assessment started before the other trends he cites really took hold, and it has been equally powerful at liberal-arts colleges that have been much less influenced by those other trends.
On the other hand, research universities and liberal-arts colleges have suffered about equally from administrative hypertrophy and bloat. My guess is that the assessment fad caught on so well because it reinforced and seemed to justify the expansion of administrative power at the expense of faculty and cooperative governance.
Encountering this article has led me to resuscitate and republish an essay that I wrote almost twenty years ago, in response to a spectacularly cynical piece of advocacy.
Early attempts to patch operating systems and processor microcode in order to block Meltdown attacks and impede some known instances of the Spectre strategy have had adverse results (beyond slower performance, which was anticipated): incompatibility with some third-party anti-virus utilities, driver crashes, bricking of some AMD systems, and processor crashes on Intel systems that still use the Haswell and Broadwell designs.
Meanwhile, researchers are making progress in turning the proof-of-concept implementations described in the original papers on Meltdown and Spectre into practical attacks.
“Spectre and Meltdown Patches Causing Trouble As Realistic Attacks Get Closer”
Peter Bright, Ars Technica, January 15, 2018
This is all a mess. Some companies, such as cloud service providers, have no real option but to install all the updates, including the microcode updates, because their vulnerability is so great; their business is running untrusted third-party code. For the rest of us, there is urgency, but that needs to be balanced against reliability.
That urgency is growing each day, however, particularly when it comes to the Meltdown attack.
“Skygofree: Following in the Footsteps of HackingTeam”
Nikita Buchka and Alexey Firsh, Securelist, Kaspersky Labs, January 16, 2018
“Found: New Android Malware with Never-Before-Seen Spying Capabilities”
Dan Goodin, Ars Technica, January 16, 2018
Skygofree is capable of taking pictures, capturing video, and seizing call records, text messages, geolocation data, calendar events, and business-related information stored in device memory.
Skygofree also includes the ability to automatically record conversations and noise when an affected device enters a location specified by the person operating the malware. Another never-before-seen feature is the ability to steal WhatsApp messages by abusing the Android Accessibility Service that's designed to help users who have disabilities or who may temporarily be unable to fully interact with a device. A third new feature: the ability to connect infected devices to Wi-Fi networks controlled by attackers.
Skygofree also includes other advanced features, including a reverse shell that gives malware operators better remote control of infected devices. The malware also comes with a variety of Windows components that provide among other things a reverse shell, a keylogger, and a mechanism for recording Skype conversations.
“Speculation Considered Harmful?”
“Nemo”, Cryptography mailing list, January 16, 2018
Even if you eliminate speculative execution entirely, the cache still holds “footprints” of the execution of your privileged code. And it is hard to prove exactly what information that conveys (or does not convey).
There are two kinds of security. One is where you say “I do not see how an attacker can do X.” The other is where you say “I can prove the attacker cannot do X, assuming Y and Z.” The former leaves you vulnerable to people smarter and/or more motivated than you. The latter is what you want.
Cache timing attacks, given the implicit management of the cache by the CPU hardware, means that I do not know exactly what information I am leaking from privileged code no matter how I write that code.
And it is not just the cache. Consider performance counters, debug registers, register renaming (i.e. vastly more physical registers than architectural registers), etc. All of this implicitly-managed state might carry who-knows-what information across protection domains. …
To prove that nothing interesting passes across protection domains, given all those megabytes of implicitly managed state … is going to require some serious rethinking of CPU architecture.
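The “footprint” problem Nemo describes can be illustrated with a toy model. This is a deliberate oversimplification — a real attack such as Flush+Reload measures access latency on real hardware, whereas here the shared cache is just a Python set — but it shows the essential point: the victim's data-dependent memory access leaves observable state behind even though no value is ever passed to the attacker directly.

```python
# Toy model of a cache side channel: privileged code leaves a
# "footprint" in shared cache state that unprivileged code can
# observe afterwards. (Simulation only; not real cache behavior.)

CACHE_LINES = 16

def victim(secret, cache):
    """Privileged code whose memory access pattern depends on a secret:
    touching one cache line brings it into the cache."""
    cache.add(secret % CACHE_LINES)

def attacker(cache):
    """The attacker flushed the cache beforehand, then probes every
    line; any line now present (i.e. 'fast' on real hardware)
    reveals which line the victim touched."""
    return [line for line in range(CACHE_LINES) if line in cache]

cache = set()          # empty: the attacker has flushed it
victim(secret=13, cache=cache)
print(attacker(cache))  # [13]: the secret leaks via cache state alone
```

Nemo's point is that this channel exists even with speculation disabled, and that the cache is only one of many pieces of implicitly managed microarchitectural state with the same property.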
“Amazon Won't Say If It Hands Your Echo Data to the Government”
Zack Whittaker, Zero Day, January 16, 2018
Amazon has been downright deceptive in how it presents the data, obfuscating the figures in its short, but contextless, twice-yearly reports. Not only does Amazon offer the barest minimum of information possible, the company has — and continues — to deliberately mislead its customers by actively refusing to clarify how many customers, and which customers, are affected by the data demands it receives.
Last Saturday morning, the Hawaii Emergency Management Agency sent out an alert warning residents of Hawaii that they were about to be struck by a ballistic missile and advising them to take immediate shelter. They reinforced this message with the flat statement “This is not a drill.” The alert was sent to radio and television stations to be broadcast and texted to cellphone users throughout the state.
The alert was false. A HEMA employee was supposed to be conducting an internal test of the missile alert system. The employee was supposed to bring up a drop-down menu and select the “test missile alert” option. Instead, the employee selected the immediately following option, “missile alert.”
At that point, a confirmation box appeared, asking whether the user wished to proceed with the “missile alert” option. The employee confirmed the operation and the alert went out.
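One standard mitigation for exactly this failure — hypothetical here, not a description of what HEMA's vendor actually built — is to make the live action structurally different from the test, for instance by demanding a typed confirmation phrase instead of a generic yes/no box that an operator can click through on autopilot:

```python
def send_alert(mode, confirmation_phrase=None):
    """Refuse to send a live alert unless the operator types an
    explicit phrase; a generic OK button invites habituation.
    (Illustrative design sketch, not HEMA's actual system.)"""
    if mode == "test":
        return "test alert sent"
    if mode == "live":
        if confirmation_phrase != "SEND LIVE ALERT":
            return "refused: live alerts require the typed phrase"
        return "LIVE ALERT SENT"
    raise ValueError(f"unknown mode: {mode!r}")

print(send_alert("test"))
print(send_alert("live"))                     # refused without the phrase
print(send_alert("live", "SEND LIVE ALERT"))  # LIVE ALERT SENT
```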
“Hawaii Missile Alert: How One Employee ‘Pushed the Wrong Button’ and Caused a Wave of Panic”
Amy B. Wang, The Washington Post, January 14, 2018
“How a Poor User Interface Design Caused the Hawaii Missile Scare”
Adam Shepherd, IT PRO, January 15, 2018
Hmmm. In my opinion, the error in user interface design was having a “missile alert” option on the drop-down menu in the first place.
Even purchasers of the Amazon Echo and its rivals have trouble finding a use for their new gizmo. Amazon equips the Echo with more than thirty thousand “skills.” Only about one Echo user in five has ever run even one of those add-on applications.
“Alexa, We're Still Trying to Figure Out What to Do with You”
Daisuke Wakabayashi and Nick Wingfield, The New York Times, January 15, 2018
For some reason, the story doesn't mention the most important application of these devices: surveillance.