
On Christmas Eve, Elon Musk’s X rolled out an in-app tool that lets users alter other people’s photos and post the results directly in reply. With minimal safeguards, it quickly became a pipeline for sexualized, non-consensual deepfakes, including imagery involving minors, delivered straight into victims’ notifications.
Renée DiResta, Hany Farid, and Casey Newton join Kara to dig into the scale of the harm, the failure of app stores and regulators to act quickly, and why the “free speech” rhetoric used to defend the abuse is incoherent. Kara explores what accountability could look like — and what comes next as AI tools get more powerful.
Renée DiResta is the former technical research manager at Stanford's Internet Observatory. She researched online CSAM for years and is one of the world’s leading experts on online disinformation and propaganda. She’s also the author of Invisible Rulers: The People Who Turn Lies into Reality.
Hany Farid is a professor of computer science and engineering at the University of California, Berkeley. He’s been described as the father of digital image forensics and has spent years developing tools to combat CSAM.
Casey Newton is the founder of the tech newsletter Platformer and the co-host of The New York Times podcast Hard Fork.
This episode was recorded on Tuesday, January 20th.
When reached for comment, a spokesperson for X referred us to a statement posted on X, which reads in part:
We remain committed to making X a safe platform for everyone and continue to have zero tolerance for any forms of child sexual exploitation, non-consensual nudity, and unwanted sexual content.
Questions? Comments? Email us at [email protected] or find us on YouTube, Instagram, TikTok, Threads, and Bluesky @onwithkaraswisher.
Learn more about your ad choices. Visit podcastchoices.com/adchoices
By Vox Media · 4.3 (2,899 ratings)