Most of what people watch on YouTube is recommended by YouTube’s algorithm. Finish one video on how to save a dying houseplant, and it might suggest more. But that system can also send users down rabbit holes that radicalize and misinform. For almost a year, the Mozilla Foundation has been tracking the viewing habits of more than 37,000 volunteers who installed a browser extension letting them identify videos they called “regrettable.” Mozilla found YouTube’s algorithm recommended 70% of those problematic videos. Marketplace’s Kimberly Adams speaks to Brandi Geurkink, senior manager of advocacy at Mozilla, who led the research effort.
By Marketplace · 4.5 (1,247 ratings)