YouTube, the Great Radicalizer

Zeynep Tufekci – New York Times

This article has been getting extensive and well-deserved coverage over the last few days. Essentially, it demonstrates that the YouTube recommendation engine tends to lead viewers towards more extreme material, more or less whatever their starting point. In short, “YouTube leads viewers down a rabbit hole of extremism, while Google racks up the ad sales.”

The reason for including it here is not the specific algorithm or the specific behaviour it generates. It is that this is a very clear example of a wider phenomenon. It’s a pretty safe assumption that the observed behaviour is not the result of a cabal of fringe conspirators deep in the secret basements of Google laying a trail to recruit people into extremist groups or attitudes. The far more obvious motivation is that Google is simply trying to tempt people into spending as long as possible watching YouTube videos, because that’s how it can put the most advertising in front of the most eyeballs.

In other words, algorithmic tools can have radically unintended consequences. That’s made worse in this case because the unintended consequences are not a sign of the intended goal going unachieved; on the contrary, they are the very means by which that goal is being achieved. So not only does YouTube have strong incentives not to fix the problem; the problem may not be obvious to them in the first place.

This is a clear example. But we need to keep asking the same questions about other systems: what are the second-order effects, will we recognise them when we see them, and will we be ready – and able – to address them?
