Welcome to Hippo Education's Practicing with AI, conversations about medicine, AI, and the people navigating both. This month, Rob and Vicky tackle a common pitfall of AI: hallucinations. What are hallucinations (and is that even the right term)? Why do these types of errors happen? And what can individuals do to reduce the hallucination rate? Plus, Rob and Vicky dive into OpenAI's most recent model release, GPT-5, and analyze its performance against older GPT models.
For those who want to dive deeper into OpenAI's HealthBench benchmark:
Visit speakpipe.com/hippoed to leave a voice message about anything related to AI and medicine: your excitement, your concerns, your own experiences with AI… anything. Your voice might even make it onto a future episode.
By Hippo Education