
Call out harmful YouTube recommendations with Mozilla's RegretsReporter

If YouTube recommends a harmful or false video to you, Mozilla wants to know.


Mozilla wants the public to help it police YouTube's recommendation algorithm with a new browser extension called RegretsReporter. The company behind Firefox hopes to identify how frequently users are steered toward harmful videos, such as conspiracy theories or political disinformation. With that information, Mozilla thinks it can identify the specific patterns that lead to such content being recommended in the first place.

"YouTube is the second-most visited website in the world, and its AI-enabled recommendation engine drives 70% of total viewing time on the site," the company wrote in a blog post. "It’s no exaggeration to say that YouTube significantly shapes the public’s awareness and understanding of key issues across the globe."


Content cop — The extension is pretty simple. If your recommendations lead you to a questionable video, you press the frowning extension icon in your browser and report it. Mozilla will ask why you regret watching it, for instance because it contained false information. The company then receives your report in anonymized form, including the video in question and the specific recommendations that led you to it, so it can try to analyze what kinds of recommendations funnel users toward racist, violent, or conspiratorial content.
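To make the reporting flow concrete, here is a minimal sketch of what an anonymized report payload could look like. The field names and structure are purely illustrative assumptions, not Mozilla's actual schema; the point is that the report carries the regretted video and the recommendation trail, but nothing that identifies the user.

```typescript
// Hypothetical sketch only; these field names are assumptions, not Mozilla's real schema.
interface RegretReport {
  reportId: string;              // random identifier, not tied to any user account
  regrettedVideoId: string;      // the video being reported
  regretReason: string;          // e.g. "false information", "violent", "conspiratorial"
  recommendationTrail: string[]; // IDs of the recommended videos that led to it
  submittedAt: string;           // ISO timestamp
}

// Build a report containing no personally identifying information.
function buildReport(videoId: string, reason: string, trail: string[]): RegretReport {
  return {
    reportId: crypto.randomUUID(),
    regrettedVideoId: videoId,
    regretReason: reason,
    recommendationTrail: trail,
    submittedAt: new Date().toISOString(),
  };
}
```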

YouTube often responds to scrutiny of its recommendations by saying that it's constantly working to suppress bad videos and promote those from more reputable sources. Mozilla's argument is that we don't really have a good gauge of how well that effort is working, and that crowdsourcing data on recommendations could provide insight into the state of the platform that's independent of YouTube's own statements.

Algorithms are dumb — The algorithms that power YouTube's recommendations are based on various signals that the company tweaks to increase metrics like engagement and time on-site. Once a user watches one video on a topic, the algorithm tends to suggest more videos in a similar vein, potentially locking users into a filter bubble of harmful or false content. Identifying what causes bad videos to be recommended in the first place could help get ahead of the problem. For example, watching an edgy comedian like Bill Burr might trigger the algorithm to start recommending more right-wing videos; maybe YouTube has identified a demographic of "angry 30-somethings" and buckets users into it based on what other Bill Burr viewers tend to watch.
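A toy sketch can show why optimizing for engagement tends to snowball into a filter bubble. This is an assumption-laden illustration, not YouTube's actual ranking system (which is far more complex and not public): if candidates are scored by predicted engagement multiplied by the user's affinity for topics they already watch, similar content keeps winning.

```typescript
// Toy model only; not YouTube's real recommendation system.
interface Candidate {
  videoId: string;
  topic: string;
  expectedWatchSeconds: number; // predicted time-on-site contribution
  expectedClickRate: number;    // predicted engagement
}

// Score candidates higher when they match topics the user already watches,
// which is how engagement optimization can reinforce a filter bubble.
function scoreCandidate(c: Candidate, watchedTopics: Map<string, number>): number {
  const affinity = watchedTopics.get(c.topic) ?? 0; // prior watch count for this topic
  return c.expectedClickRate * c.expectedWatchSeconds * (1 + affinity);
}

function recommend(candidates: Candidate[], watchedTopics: Map<string, number>): Candidate[] {
  return [...candidates].sort(
    (a, b) => scoreCandidate(b, watchedTopics) - scoreCandidate(a, watchedTopics)
  );
}
```

In a model like this, every additional watch on a topic raises that topic's affinity, which raises the score of similar videos, which generates more watches on the topic, and so on.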

Computers don't have emotion or reasoning, so YouTube has to tune the algorithm in order to combat new trends like pandemic denialism. At a time when the world is dealing with a serious health crisis, Mozilla's new extension may be a real public health initiative.