

Most of what people watch on YouTube is recommended by YouTube’s algorithm. Finish one video on how to save a dying houseplant, and it might suggest more. But that system can also send users down rabbit holes that radicalize and misinform. For almost a year, the Mozilla Foundation has been tracking the viewing habits of more than 37,000 volunteers who installed a browser extension letting them identify videos they called “regrettable.” Mozilla found YouTube’s algorithm recommended 70% of those problematic videos. Marketplace’s Kimberly Adams speaks to Brandi Geurkink, senior manager of advocacy at Mozilla, who led the research effort.
By Marketplace · 4.5 (1,247 ratings)