More #algorithmicEvil - The Day Yahoo Decided I Liked Reading About Child Murder

Excellent article from Sarah Kendzior at The Atlantic:

Algorithms are shaping how we see the world around us, with big consequences. What a machine thinks we need to know can become what we fear.

Filed under: "I told you so." But wait - there's more:

"If there is one unambiguous trend in how the Internet is developing today," writes Evgeny Morozov, "it's the drive toward the personalization of our online experience. Everything we click, read, search, and watch online is increasingly the result of some delicate optimization effort, whereby our previous clicks, searches, 'likes,' purchases, and interactions determine what appears in our browsers and apps."
Morozov was writing about algorithmic optimization, a concept outlined in Eli Pariser's "The Filter Bubble," which describes the way that websites like Yahoo and Google tailor what they show someone according to the previous online activity of that user. By capitalizing on what are assumed to be your pre-existing interests, it intends to make you more likely to read stories or click on ads.
Opponents of the practice, like Pariser, fear that filter bubbles prevent users from experiencing viewpoints other than their own. They strip online worlds of their serendipity, imprisoning users in an informational comfort zone. But I had the opposite experience: child murder was my presumed interest. Yahoo News had become my own personal Hunger Games, making me a spectator to violence I would never voluntarily seek out.
Filter bubbles are usually criticized on material or political grounds: They reinforce pre-existing tastes, manipulate consumers into buying products, and limit knowledge of opposing views. But what if the filter is wrong? What if it's not a true reflection, but a false mirror ...?

The operative phrase is "what if the filter is wrong?"
The problem with using algorithms to predict human interests or behavior is that they will be wrong. We are far too complex, fickle, bold, cowardly and contradictory to be approximated by an algorithm to any acceptable degree of certainty (acceptable to me, at least). So algorithms will always conjure up a false, inaccurate model of us -- one that, unfortunately, is accepted without question as a true reflection of who we are. And that false model directly or indirectly influences our real lives. That is evil.
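To see how easily a personalization model goes wrong, here is a minimal toy sketch (not Yahoo's actual system -- their ranking is proprietary; the function names and data here are invented for illustration). It ranks story topics by how often they appear in a user's click history. With a nearly empty history, a single stray click is enough to dominate every future ranking:

```python
from collections import Counter

def rank_topics(click_history, catalog):
    """Rank candidate story topics by how often each appears in the
    user's click history -- a toy stand-in for real personalization."""
    counts = Counter(click_history)
    # Topics the user has clicked before float to the top; with a
    # sparse history, one accidental click outweighs everything else.
    return sorted(catalog, key=lambda topic: counts[topic], reverse=True)

# One accidental click on a crime story...
history = ["crime"]
catalog = ["politics", "science", "crime", "sports"]
print(rank_topics(history, catalog))
# "crime" now leads the ranking, and every further click it attracts
# reinforces the model's mistaken picture of the reader.
```

The feedback loop is the point: the model's guess shapes what is shown, what is shown shapes the clicks, and the clicks confirm the guess.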