Mozilla needs your help to expose YouTube’s recommendation algorithm


YouTube’s recommendation algorithm drives more than 70% of the videos we watch on the site. But its suggestions have attracted criticism from across the spectrum.

A developer who worked on the system said last year that it’s “toxic” and pushes users towards conspiracy videos, but a recent study found it favors mainstream media channels and actively discourages viewers from watching radical content.

Mine, of course, suggests videos on charitable causes, doctoral degrees, and ethical investment opportunities. But other users receive less noble recommendations.

If you’re one of them, a new browser extension from Mozilla could offer you some insights into the horrors lurking “Up next.”


After installing the RegretsReporter extension and playing a YouTube video, you can click the frowning face icon in your browser to report the video, the recommendations that led you to it, and any extra details on “your regret.” Mozilla researchers will then search for patterns in the recommendations that users report.

In a blog post, Ashley Boyd, Mozilla’s VP of Advocacy & Engagement, gave three examples of what the extension could uncover:

  • What type of recommended videos lead to racist, violent, or conspiratorial content?

  • Are there patterns in terms of frequency or severity of harmful content? 

  • Are there specific YouTube usage patterns that lead to harmful content being recommended? 

“We ask users not to modify their YouTube behavior when using this extension,” said Boyd. “Don’t seek out regrettable content. Instead, use YouTube as you normally do. That is the only way that we can collectively understand whether YouTube’s problem with recommending regrettable content is improving, and which areas they need to do better on.”
