
Most of what people watch on YouTube is recommended by YouTube’s algorithm. Finish one video on how to save a dying houseplant, and it might suggest more. But that system can also send users down rabbit holes that radicalize and misinform. For almost a year, the Mozilla Foundation has been tracking the viewing habits of more than 37,000 volunteers who installed a browser extension letting them identify videos they called “regrettable.” Mozilla found YouTube’s algorithm recommended 70% of those problematic videos. Marketplace’s Kimberly Adams speaks to Brandi Geurkink, senior manager of advocacy at Mozilla, who led the research effort.