
AI won't kill us all — but that doesn't make it trustworthy. Instead of getting distracted by future existential risks, AI ethics researcher Sasha Luccioni thinks we need to focus on the technology's current negative impacts, like emitting carbon, infringing copyrights and spreading biased information. She offers practical solutions to regulate our AI-filled future — so it's inclusive and transparent.
For a chance to give your own TED Talk, fill out the Idea Search Application: ted.com/ideasearch.
Interested in learning more about upcoming TED events? Follow these links:
TEDNext: ted.com/futureyou
TEDAI Vienna: ted.com/ai-vienna
TEDAI San Francisco: ted.com/ai-sf
Hosted on Acast. See acast.com/privacy for more information.
4.2 · 387 ratings