YouTube’s algorithm doesn’t care if you give a video a ‘thumbs down’

[Image: a YouTube video page with the mouse cursor hovering over the dislike button.]

YouTube already stopped displaying the number of dislikes on videos. But apparently, giving a video a thumbs down doesn’t change how many similar videos the platform suggests to you.
Photo: Shutterstock

My YouTube recommendations are full of old reruns of Gordon Ramsay’s Kitchen Nightmares. It may be partly my fault for watching a full episode in one night while intoxicated. But let me tell you, if there’s one thing I don’t want in my feed anymore, it’s the famous British blowhard tearing down another chef while the world’s most obnoxious sound effects (bra-reeee) play in the background. I’ve disliked plenty of these videos, yet now I see Hell’s Kitchen on my page, and I increasingly feel like a “raw” steak that Ramsay is prodding and berating.

But apparently I’m not alone in my YouTube recommendation woes. A report from the Mozilla Foundation released on Monday, based on a survey and crowdsourced data, claims that the “Dislike” and “Don’t recommend channel” feedback tools don’t actually do much to change video recommendations.

The report makes two points. The first is that users consistently feel that the controls Google-owned YouTube provides don’t really make a difference. Second, based on data gathered from users, the controls have a “negligible” effect on recommendations, meaning “most unwanted videos still slip through.”

The foundation relied on data from its RegretsReporter browser plugin, a tool that lets users selectively block YouTube videos from appearing in their feeds. Its analysis is based on close to 2,757 survey respondents and 22,722 users of the plugin, which gave Mozilla access to more than 567 million video recommendations gathered from late 2021 to June 2022, the report said.

Though the researchers admit that survey respondents are not a representative sample of YouTube’s vast and diverse audience, a third of those surveyed said that using YouTube’s controls didn’t seem to change their video recommendations at all. One user told Mozilla they would report videos as misleading or spam, only to see them return to their feed later. Respondents often said that blocking one channel would only lead to recommendations from similar channels.

YouTube’s algorithm regularly recommends videos users have said they don’t want to see, and it’s often worse than old Ramsay cable shows. A 2021 report by Mozilla, likewise based on crowdsourced user data, claimed that people surfing the video platform are regularly recommended violent content, hate speech, and political misinformation.

In this latest report, Mozilla researchers examined pairs of videos in which users rejected one, like a Tucker Carlson screed, only to be recommended yet another video from the Fox News YouTube channel. Based on a review of 40,000 video pairs, they found that when a channel is blocked, the algorithm will often simply recommend very similar videos from similar channels. Using the “Dislike” or “Not interested” buttons prevented only 12% and 11% of unwanted recommendations, respectively, compared with a control group. The “Don’t recommend channel” and “Remove from watch history” buttons were more effective at correcting users’ feeds, but only by 43% and 29%, respectively.

“In our analysis of the data, we found that YouTube’s user control mechanisms are inadequate as a tool to prevent unwanted recommendations,” the Mozilla researchers wrote in their study.

YouTube spokeswoman Elena Hernandez told Gizmodo in an emailed statement that “our controls do not filter out entire topics or viewpoints, as this could have negative effects for viewers, like creating echo chambers.” The company said it doesn’t prevent all content related to a topic from being recommended, but it claims to promote “authoritative” content while suppressing “borderline” videos that come close to violating its content moderation policies.

In a 2021 blog post, Cristos Goodrow, YouTube’s VP of engineering, wrote that their system is “continuously evolving” but that providing transparency into the algorithm is “not as simple as listing a formula for recommendations,” since their systems take into account clicks, watch time, survey responses, sharing, likes, and dislikes.

Of course, like every social media platform, YouTube has struggled to create systems that can catch all of the bad or even violent content uploaded to the site. An upcoming book, shared exclusively with Gizmodo, said YouTube came close to forgoing billions of dollars in advertising revenue in order to deal with the strange and disturbing videos being recommended to children.

While Hernandez claimed the company has worked to expand access to its API data, the spokesperson said that “Mozilla’s report doesn’t take into account how our systems actually work, and therefore it’s difficult for us to glean many insights.”

But that’s a criticism Mozilla also lays at Google’s feet, saying the company doesn’t give researchers enough access to assess what affects YouTube’s secret sauce, a.k.a. its algorithms.
