YouTube Algorithm Keeps Recommending ‘Regrettable’ Videos


YouTube users have reported potentially objectionable content in thousands of videos recommended to them by the platform’s algorithm, according to the nonprofit Mozilla Foundation.

The findings, released Wednesday, revealed many instances of YouTube recommending videos that users had marked as “regrettable” — a broad category including misinformation, violence and hate speech.

The 10-month-long investigation relied on crowdsourced data gathered through an extension for the foundation’s Firefox web browser, along with a version built for Chrome, that let users report potentially problematic content.

Mozilla gathered 3,362 reports from 1,622 unique contributors in 91 countries between July 2020 and June of this year.

The nonprofit then hired 41 researchers from the University of Exeter to review the submissions and determine whether each video should be on YouTube and, if not, which platform guidelines it might violate.

Researchers found that 71 percent of videos flagged by users as regrettable came from YouTube’s own recommendations. Those videos also tended to be much more popular than others viewed by volunteers, suggesting the company’s algorithm favored objectionable content.

Read more at The Hill.

{Matzav.com}

