
When people think about how AI might 'go wrong', most of us probably picture malevolent computers intent on causing harm. But what if we should be more worried about them seeking pleasure? Thomas Moynihan and Anders Sandberg at the University of Oxford explain why AI experts are worried about wireheading, a phenomenon strangely akin to addiction in humans.
You can read the text version of this in-depth article here. The audio version is read by Peter Hanly in partnership with Noa, News Over Audio. You can listen to more articles from The Conversation, for free, on the Noa app.
The music in In Depth Out Loud is Night Caves by Lee Rosevere. In Depth Out Loud is produced by Gemma Ware.
This story came out of a project at The Conversation called Insights supported by Research England. You can read more stories in the series here.