Megan Garcia’s son Sewell died by suicide when he was just 14 years old. In the months leading up to his death he had been in a relationship with a chatbot on a platform called Character.ai. Megan was convinced it had something to do with his death, and set out to hold the company to account.
In the third episode of this season, Cristina Criddle speaks to Megan about her story, and to Karandeep Anand, chief executive of Character.ai. Why has this technology been released to children before we understand its effects? Can chatbots capable of forming emotional bonds with users ever be safe for children?
Check out some of Cristina’s reporting on this subject on FT.com:
Character.ai and Google agree to settle lawsuits over teen suicides
AI start-up Character.ai bans teens from talking to chatbots
US regulator launches inquiry into AI ‘companions’ used by teens
Artificial Intimacy is presented by Cristina Criddle and produced by Persis Love and Edwin Lane. The executive producer is Flo Phillips. Sound design is by Breen Turner and Sam Giovinco. The FT’s global head of audio is Cheryl Brumley.
The FT does not use generative AI to voice its podcasts.
If you have been affected by the issues raised in this episode, you can reach out to a mental health helpline, such as the 988 Suicide and Crisis Lifeline in the US or Samaritans in the UK. Help for many other countries can also be found at Befrienders Worldwide.
Read a transcript of this episode on FT.com
Hosted on Acast. See acast.com/privacy for more information.
By Financial Times