

What if your voice could be stolen? In Part Two, Dr. Tanusree Sharma reveals the hidden risks behind voice AI: how the same recordings that powered tools like Siri and Alexa are now being cloned, weaponized, and monetized without consent.
She introduces PRAC3, a bold new framework that blends privacy, reputation, and accountability with traditional consent models, and calls on AI leaders to rethink how they handle voice data before trust is lost for good.
From creative rights to biometric identity, this conversation is a must-listen for anyone shaping the future of synthetic speech.
Join us and explore why voice governance can't wait.
By The ADNA