
By now a lot of us are familiar with chatbot “hallucinations” — the tendency of artificial intelligence language models to make stuff up. And lately we’ve been seeing reports of these tools getting creative with bibliography. For instance, last week The Washington Post reported on the case of a law professor whose name showed up in a list of legal scholars accused of sexual harassment. The list was generated by ChatGPT as part of a research project, and the chatbot cited as its source a March 2018 Washington Post article that doesn’t exist. People have taken to calling these fantasy references “hallucitations.” Marketplace’s Meghan McCarty Carino recently spoke with Bethany Edmunds, a teaching professor at Northeastern University, about why this is happening. Edmunds says this kind of result is to be expected.