What can we expect from a world of deepfakes, where anything you see or hear might be synthetic, the output of AI? Toby Walsh, Scientia Professor of Artificial Intelligence at UNSW, unpacks untruths and warns of a future inundated with machine-generated content, predicting that soon 99% of what we read, see, and hear will be created by AI. Listen as Toby discusses the urgent need for digital watermarks to authenticate online content, proposing that this technology can help restore trust. However, he cautions that building this infrastructure will take time, leaving us in a precarious situation where truth is increasingly contested.
Presented as part of The Ethics Centre's Festival of Dangerous Ideas, supported by UNSW Sydney.
See omnystudio.com/listener for privacy information.