
YouTube's crackdown on conspiracy theories has been (kind of) successful

Conspiratorial content has long thrived on YouTube, including Illuminati theories, UFO sightings, and Sandy Hook denial videos.


Back in early 2019, YouTube announced it would crack down on "borderline" content, including conspiracy videos that brush up against its platform guidelines without outright violating them. Now, a new Berkeley study shows that conspiracy theories are 40 percent less likely to appear as recommendations. It's a success, with some caveats attached.

Cherry-picking targets — Berkeley researchers studied 8 million recommended videos over a year and three months. The study found that recommendations of conspiracies about 9/11 being an inside job and of flat Earth content were down. However, conspiracies about climate change still flourished on the platform, something activists have repeatedly highlighted. Input has reached out to Google for more information on how it targets this type of content.

Why this matters — Almost one-third of the internet uses YouTube, Google says. That's more than two billion users on the platform. The company says viewers watch at least one billion hours of content on YouTube every day. The majority demographic falls between the ages of 18 and 34, and YouTube on mobile reaches more American viewers than any television network in the country. The platform operates in over 100 countries, in more than 80 languages, and is frequently used as a news source by hundreds of thousands of people.

In other words, YouTube has its finger on the pulse of a sizable swath of the world. Left unchecked, a platform with that much influence can radicalize its viewers, and researchers warn it already has.

Curbing content doesn't cure radicalization — While the study does report that YouTube is trying to curb conspiratorial content, it notes that reducing borderline content does not mean the platform is rid of its radicalization pipeline. The researchers observe that while repeatedly recommended conspiracy videos can create an echo chamber for a user, radicalization is a more complicated phenomenon. It's an interesting nugget worth reading:

In general, radicalization is a more complex problem than what an analysis of default recommendations can scope, for it involves the unique mindset and viewing patterns of a user interacting over time with an opaque multi-layer neural network tasked to pick personalized suggestions from a dynamic and virtually infinite pool of ideas.

If the study tells us anything, it's that YouTube has a long way to go before its kingdom of video content is problem-free.