


Here's something that should make you think twice before asking ChatGPT about that headache.
Oxford University ran a study earlier this year. They gave the same medical scenarios to AI chatbots, twice. First with 100 doctors at the keyboard. Then with 1,300 ordinary people.
The doctors got 95% accuracy. The ordinary people got 34%.
Same AI. Same scenarios. Different humans. Why the gap? And what does it tell us about how the rest of us should be using AI at work?
That's what this week's ThrowForward Thursday is all about. I get into the Dunning-Kruger problem with large language models, and why AI in the hands of a non-expert can be a liability rather than a tool.
I also introduce the 5T AI Impact Model, which our team at TomorrowToday Global uses to help organisations get past the productivity hype and into the workflow redesign where AI starts to pay for itself.
More details about the 5T AI Impact Masterclass: https://graemecodrington.com/the-5t-ai-impact-masterclass/
And the links to the research:
By Graeme Codrington
