

In this episode of Generative AI 101, we’re tackling the curious case of AI hallucinations—when AI creates content that’s completely off the mark. We’ll explore how these digital daydreams happen, why humans aren’t immune to similar slip-ups, and the impact these hallucinations can have. Whether amusing or alarming, AI hallucinations are a phenomenon you’ll want to understand.
Check out this Paper: WildHallucinations: Evaluating Long-form Factuality in LLMs with Real-World Entity Queries
Connect with Us: If you enjoyed this episode or have questions, reach out to Emily Laird on LinkedIn. Stay tuned for more insights into the evolving world of generative AI. And remember, you didn't hallucinate this episode; no, this actually happened.
Connect with Emily Laird on LinkedIn
By Emily Laird
