


By now a lot of us are familiar with chatbot “hallucinations” — the tendency of artificial intelligence language models to make stuff up. And lately we’ve been seeing reports of these tools getting creative with bibliography. For instance, last week The Washington Post reported on the case of a law professor whose name showed up in a list of legal scholars accused of sexual harassment. The list was generated by ChatGPT as part of a research project, and the chatbot cited as its source a March 2018 Washington Post article that doesn’t exist. People have taken to calling these fantasy references “hallucitations.” Marketplace’s Meghan McCarty Carino recently spoke with Bethany Edmunds, a teaching professor at Northeastern University, about why this is happening. Edmunds says this kind of result is to be expected.
By Marketplace
