
A new report from the Stanford Internet Observatory finds that the National Center for Missing and Exploited Children (NCMEC) is ill-prepared to combat child sexual abuse material (CSAM) generated by artificial intelligence. Criminals are using AI tools to create explicit images, making it harder for authorities to identify and rescue real victims. The CyberTipline, which collects CSAM reports, is overwhelmed by the sheer volume of tips, many of them incomplete or inaccurate. The report calls for updated technology and laws to address this crime, and lawmakers are already working to criminalize AI-generated explicit content. It also stresses the urgent need for increased funding and improved access to technology for NCMEC.
By Dr. Tony Hoang · 4.6 (99 ratings)
