
By now a lot of us are familiar with chatbot “hallucinations” — the tendency of artificial intelligence language models to make stuff up. And lately we’ve been seeing reports of these tools getting creative with bibliography. For instance, last week The Washington Post reported on the case of a law professor whose name showed up in a list of legal scholars accused of sexual harassment. The list was generated by ChatGPT as part of a research project, and the chatbot cited as its source a March 2018 Washington Post article that doesn’t exist. People have taken to calling these fantasy references “hallucitations.” Marketplace’s Meghan McCarty Carino recently spoke with Bethany Edmunds, a teaching professor at Northeastern University, about why this is happening. Edmunds says this kind of result is to be expected.