
On this week’s If Then, Will Oremus and April Glaser discuss California’s landmark decision to eliminate cash bail for criminal defendants, and the controversial algorithmic “risk assessment” system that will partially replace it. They also hash out a fresh debate over who gets to fact-check the news in your Facebook feed, following a Tuesday outcry in media circles after Facebook flagged a story in the liberal outlet ThinkProgress as “false” because the conservative Weekly Standard had taken issue with its headline.
The hosts are then joined by Professor Safiya Umoja Noble, author of Algorithms of Oppression: How Search Engines Reinforce Racism. Lately, media coverage and congressional hearings have focused on potential anti-conservative bias at the big tech companies, but Professor Noble’s work suggests we may actually have a very different problem.
17:50 - Interview with Safiya Umoja Noble
36:36 - Don’t Close My Tabs
Don’t Close My Tabs:
Anatomy of an AI System by Kate Crawford and Vladan Joler
The New Yorker: Can Mark Zuckerberg Fix Facebook Before it Breaks Democracy?
Podcast production by Max Jacobs
If Then plugs:
You can get updates about what’s coming up next by following us on Twitter @ifthenpod. You can follow Will @WillOremus and April @Aprilaser. If you have a question or comment, you can email us at [email protected].
If Then is presented by Slate and Future Tense, a collaboration among Arizona State University, New America, and Slate. Future Tense explores the ways emerging technologies affect society, policy, and culture. To read more, follow us on Twitter and sign up for our weekly newsletter.
Listen to If Then via Apple Podcasts, Overcast, Spotify, Stitcher, or Google Play.
Learn more about your ad choices. Visit megaphone.fm/adchoices
By Slate Podcasts
4.4 • 231 ratings
