By now a lot of us are familiar with chatbot “hallucinations” — the tendency of artificial intelligence language models to make stuff up. And lately we’ve been seeing reports of these tools getting creative with their bibliographies. For instance, last week The Washington Post reported on the case of a law professor whose name showed up in a list of legal scholars accused of sexual harassment. The list was generated by ChatGPT as part of a research project, and the chatbot cited as its source a March 2018 Washington Post article that doesn’t exist. People have taken to calling these fantasy references “hallucitations.” Marketplace’s Meghan McCarty Carino recently spoke with Bethany Edmunds, a teaching professor at Northeastern University, about why this is happening. Edmunds says this kind of result is to be expected.