Most of what people watch on YouTube is recommended by YouTube’s algorithm. Finish one video on how to save a dying houseplant, and it might suggest more. But that system can also send users down rabbit holes that radicalize and misinform. For almost a year, the Mozilla Foundation has been tracking the viewing habits of more than 37,000 volunteers who installed a browser extension letting them identify videos they called “regrettable.” Mozilla found YouTube’s algorithm recommended 70% of those problematic videos. Marketplace’s Kimberly Adams speaks to Brandi Geurkink, senior manager of advocacy at Mozilla, who led the research effort.