71% of reported harmful YouTube content was recommended by algorithm, study says

YouTube’s recommendation algorithm still regularly suggests videos with COVID-19 misinformation, according to research published July 7 by the nonprofit Mozilla Foundation.

The nonprofit crowdsourced its investigation, with more than 37,000 YouTube users flagging content that contained misinformation, violence or hate. Users submitted reports through a browser extension, and researchers at the University of Exeter in England then analyzed them.

Seventy-one percent of all reported content came from videos YouTube’s algorithm recommended. Recommended videos were 40 percent more likely to be reported than videos participants found through search, the study said.

The study also found that reported videos drew 70 percent more views per day than other videos participants watched.

"When it's actively suggesting that people watch content that violates YouTube's policies, the algorithm seems to be working at odds with the platform's stated aims, their own community guidelines, and the goal of making the platform a safe place for people," Brandi Geurkink, Mozilla's senior manager of advocacy, told NBC News.

YouTube told NBC News it "constantly" improves its user experience and has launched more than 30 changes in the past year to reduce recommendations of harmful content.

"Thanks to this change, consumption of borderline content that comes from our recommendations is now significantly below 1 percent," the tech giant said.
