Mozilla researchers analyzed seven months of YouTube activity from more than 20,000 participants to evaluate four ways in which YouTube says people can “tweak their recommendations”: hitting Dislike, “Not interested,” “Remove from history,” or “Don’t recommend this channel.” They wanted to see how effective these controls really are.
Each participant who installed the browser extension saw a “Stop suggesting” button added at the top of every YouTube video they watched, as well as on the videos in their sidebar. Tapping it triggered one of the four feedback actions each time.
Then, dozens of research assistants reviewed those disapproved videos to judge how similar they were to tens of thousands of subsequent recommendations YouTube served the same users. They found that YouTube’s controls had a “negligible” effect on the recommendations participants received. Over seven months, on average, one disapproved video generated about 115 bad recommendations: videos that closely resembled the ones participants had told YouTube they didn’t want to see.
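The study’s bookkeeping can be sketched roughly as follows. This is a hypothetical illustration, not Mozilla’s actual pipeline: the record layout and the `similar` flag stand in for the manual similarity judgments the research assistants made.

```python
from collections import defaultdict

# Hypothetical records: (participant, feedback_action, rec_was_similar),
# where rec_was_similar stands in for the assistants' manual judgment
# that a later recommendation closely resembled a disapproved video.
records = [
    ("p1", "dislike", True),
    ("p1", "dislike", True),
    ("p1", "dislike", False),
    ("p2", "not_interested", True),
    ("p2", "not_interested", False),
]

def bad_recs_per_participant(records):
    """Count subsequent recommendations judged similar to a disapproved video."""
    counts = defaultdict(int)
    for participant, _action, similar in records:
        if similar:
            counts[participant] += 1
    return dict(counts)

print(bad_recs_per_participant(records))  # {'p1': 2, 'p2': 1}
```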
Previous research has shown that YouTube’s practice of recommending videos you’re likely to agree with, and rewarding controversial content, can harden people’s views and steer them toward political radicalization. The platform has also been repeatedly criticized for promoting sexually explicit or sexualized videos of children, pushing content that violates its own policies toward virality. Under scrutiny, YouTube has committed to cracking down on hate speech, better enforcing its guidelines, and not using its recommendation algorithm to promote “borderline” content.
However, research shows that content that appears to violate YouTube’s policies is still actively recommended to users even after they have submitted negative feedback.
Hitting Dislike, the most visible way to provide negative feedback, stopped only 12% of bad suggestions; “Not interested” stopped only 11%. YouTube advertises both options as ways to tweak its algorithm.
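A “stopped” percentage like the 12% figure is most naturally read as a reduction relative to a baseline group that sent no feedback. A minimal sketch of that calculation, with entirely hypothetical rates chosen to reproduce a 12% reduction:

```python
def prevented_share(bad_rate_with_control, bad_rate_baseline):
    """Fraction of bad recommendations a control stopped,
    relative to a no-feedback baseline rate."""
    return 1 - bad_rate_with_control / bad_rate_baseline

# Hypothetical rates of similar ("bad") recommendations per arm:
baseline_rate = 0.50   # assumed rate for a control group sending no feedback
dislike_rate = 0.44    # assumed rate after hitting Dislike

print(round(prevented_share(dislike_rate, baseline_rate), 2))  # 0.12
```

The point of the ratio is that a control can “work” in absolute terms yet still leave the large majority of unwanted recommendations in place, which is what the study reports.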
Elena Hernandez, a spokesperson for YouTube, said: “Our controls don’t filter out entire topics or opinions, as this could have negative effects for viewers, such as creating echo chambers.” Hernandez also said Mozilla’s report doesn’t take into account how YouTube’s algorithm actually works. But that is something no one outside of YouTube really knows, given the billions of algorithmic inputs and the company’s limited transparency. Mozilla’s research attempts to peer into that black box to better understand its output.