What does “personal style” mean when the Echo Look gives you fashion advice and can tell you what you like but not why you like it?
“Style Is an Algorithm”
Kyle Chayka, Racked, April 17, 2018
If you know the source of the suggestion, then you might give it a chance and see if it meshes with your tastes. In contrast, we know the machine doesn't care about us, nor does it have a cultivated taste of its own; it only wants us to engage with something it calculates we might like. This is boring. …
We can decide to become a little more analog. I imagine a future in which our clothes, music, film, art, books come with stickers like organic farmstand produce: Algorithm Free. …
“Echo” is a good name for Amazon's device because it creates an algorithmic feedback loop in which nothing original emerges.
Alexa, how do I look?
You look derivative, Kyle.
In the absence of explicit guidance from the user, the black-box decider inside YouTube that chooses and queues up the videos it fancies you'll be most interested in seeing next tends to make recommendations that grow progressively more bizarre and disturbing. Perhaps it has learned something about human nature, but more likely its selections are the video-recommender analogue of the luridly colored fantasy images that a black-box classifier constructs when directed to synthesize the pixel pattern that maximizes its response to a given target label such as “octopus” or “mouth” or “waterfall”.
This writer suspects that the black-box decider's behavior reflects something sinister in its programming. Not too surprisingly, it turns out that many of the bizarre and disturbing videos the decider eventually queues up are pro-Trump ads and clips of right-wing loons promoting conspiracy theories.
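That pixel-pattern search goes by the name "activation maximization": gradient ascent on the input image itself, nudging pixels to excite a chosen unit of the classifier. A minimal sketch of the idea, with a toy random "classifier" standing in for a trained network (the weights, sizes, and step count here are invented for illustration, not drawn from any real system):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a trained classifier: one hidden layer plus a
# readout unit scoring how strongly a 64-"pixel" input x excites a
# hypothetical class. (W and v are random here; in a real network
# they would be learned weights.)
W = rng.normal(size=(16, 64))   # hidden-layer weights
v = rng.normal(size=16)         # readout weights for the class unit

def class_score(x):
    """Score s(x) = v . tanh(W x) for the chosen class unit."""
    return float(v @ np.tanh(W @ x))

def score_gradient(x):
    """Analytic gradient: ds/dx = W^T (v * (1 - tanh(Wx)^2))."""
    h = np.tanh(W @ x)
    return W.T @ (v * (1.0 - h * h))

# Activation maximization: ascend the score by editing the *input*,
# starting from near-blank noise. The result is not a natural image
# but whatever lurid pattern most excites the unit.
x = rng.normal(scale=0.01, size=64)
before = class_score(x)
for _ in range(200):
    x += 0.1 * score_gradient(x)
after = class_score(x)
assert after > before   # the synthesized pattern excites the unit more
```

The same recipe applied to a real convolutional network, with the readout unit set to the “octopus” or “waterfall” output, is what produces the hallucinatory images the commentary alludes to.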
“‘Fiction Is Outperforming Reality’: How YouTube's Algorithm Distorts Truth”
Paul Lewis, The Guardian, February 2, 2018