
"What if the first alien intelligence we truly understood... lived in our oceans?"
In this high-speed deep dive, we explore one of the most mind-bending frontiers in science and AI: decoding dolphin communication using artificial intelligence. And no, this isn’t science fiction—it’s happening right now.
For nearly 40 years, the Wild Dolphin Project has been documenting the lives and sounds of Atlantic spotted dolphins in extraordinary detail—capturing everything from signature whistles (think dolphin names) to aggressive “squawks” and flirtatious buzzes. Now, in partnership with Google and Georgia Tech, their decades of work are converging with cutting-edge tech: meet DolphinGemma, a new AI model trained to analyze—and even generate—dolphin vocalizations.
You’ll hear how researchers are using this model to spot hidden structures in dolphin “speech,” predict sound sequences, and possibly even talk back. We also explore the CHAT system, a wild experiment teaching dolphins a new, shared sound vocabulary—like a sci-fi phrasebook—for two-way interaction.
From the depths of the Bahamas to the processor of a Google Pixel 9, this episode unpacks the revolutionary tech making underwater communication with dolphins not just possible, but portable.
Could this be the beginning of interspecies dialogue? What might dolphins teach us about language, intelligence, and ourselves?
Listen now—and prepare to rethink what it means to be heard.
Read more: https://blog.google/technology/ai/dolphingemma/
"What if the first alien intelligence we truly understood... lived in our oceans?"
In this high-speed deep dive, we explore one of the most mind-bending frontiers in science and AI: decoding dolphin communication using artificial intelligence. And no, this isn’t science fiction—it’s happening right now.
For nearly 40 years, the Wild Dolphin Project has been documenting the lives and sounds of Atlantic spotted dolphins with extraordinary detail—capturing everything from signature whistles (think dolphin names) to aggressive “squawks” and flirtatious buzzes. Now, in partnership with Google and Georgia Tech, their decades of work is converging with cutting-edge tech: meet Dolphin Gemma, a new AI model trained to analyze—and even generate—dolphin vocalizations.
You’ll hear how researchers are using this model to spot hidden structures in dolphin “speech,” predict sound sequences, and possibly even talk back. We also explore the CHAT system, a wild experiment teaching dolphins a new, shared sound vocabulary—like a sci-fi phrasebook—for two-way interaction.
From the depths of the Bahamas to the processor of a Google Pixel 9, this episode unpacks the revolutionary tech making underwater communication with dolphins not just possible, but portable.
Could this be the beginning of interspecies dialogue? What might dolphins teach us about language, intelligence, and ourselves?
Listen now—and prepare to rethink what it means to be heard.
Read more: https://blog.google/technology/ai/dolphingemma/