YouTube’s ‘Dislike’ Button Doesn’t Do What You Think
YouTube creators often implore their viewers to ‘smash that Like button,’ believing its feedback to be vital to their future success on the algorithm-driven platform. But a new study from the Mozilla Foundation suggests that users who hit the Dislike button on videos to weed out content they don’t want to see are wasting their time.
The study drew on data from 22,722 users of Mozilla’s RegretsReporter browser extension, tracked between December 2021 and June 2022. Researchers analyzed more than half a billion YouTube recommendations made after users clicked one of YouTube’s negative feedback tools, such as the Dislike or Don’t Recommend Channel buttons. “These are the tools YouTube offers for people to control their recommendations, but how does that actually impact your recommended videos?” asks Becca Ricks, senior researcher at Mozilla, pointing to YouTube’s own support site on how to “manage your recommendations and search results.”
Different buttons had different effects on the likelihood of being recommended similar content going forward. Pressing Don’t Recommend Channel stopped only 43 percent of unwanted video recommendations, according to Mozilla, while the Dislike button stopped just 12 percent. “What we found was that YouTube’s control mechanisms do not really seem to be adequate for preventing unwanted recommendations,” says Ricks.
Mozilla’s investigation was prompted by YouTube’s increased public comments in recent years about its recommendation system. “They’ve been talking a lot about metrics like time well spent or user satisfaction as opposed to watch time,” says Ricks. “We were really curious to what degree some of those signals were being picked up by the algorithm, especially because in the previous YouTube report we worked on, we had heard from people that they didn’t feel like they were in control, or they didn’t really feel like taking actions on unwanted videos really translated well to the recommender system.”
For instance, one user in the Mozilla study responded negatively to a Tucker Carlson clip posted by Fox News on February 13. One month later, he was recommended another clip of Carlson’s TV show, again posted by Fox News’s official YouTube channel. A different user gave a negative response to a video showing webcams focused on Ukraine’s conflict zones in late February; within a month, they were shown another video, this time from the WarShock YouTube channel, detailing how dead Russian soldiers are removed from Ukraine. Ricks takes no issue with the content of the videos themselves, noting that neither breaches YouTube’s guidelines. “But if you as a user say you don’t want to see it, it’s kind of shocking that it continues to be recommended,” she says.
“I’m not really surprised,” says Guillaume Chaslot, a former YouTube employee and founder of AlgoTransparency, a site that sheds light on YouTube’s recommendation algorithm. “I feel, big picture, you should be able to choose and specify to the algorithm what you want, and YouTube absolutely doesn’t let you do that,” he adds.