
Listener Gordon is worried that, as AI content spreads across the web, there will be proportionally less and less human content for AIs to be trained on, with the result that their output will just get blander and blander.
He's right to be worried. Aleks and Kevin explore the phenomenon of 'model collapse': the inevitable breakdown of an AI's ability to give useful results if its training data is itself AI-produced. Speaking to NYU data scientist Professor Julia Kempe, the pair discover that training on AI-generated data also means hitting a brick wall in terms of improving AI performance.
There is hope, however. According to Shayne Longpre of the Data Provenance Initiative, the answer is to put humans back in the loop to curate the data for the AIs, teaching them to tell good data from bad.
Presenters: Aleks Krotoski & Kevin Fong
By BBC Radio 4