Artificial intelligence is letting us build predictive algorithms that translate languages and spot diseases as well as or better than humans. But these systems are also being used to make decisions about hiring and criminal sentencing. Do computers trained on vast datasets of human experience learn human biases, like sexism and racism? Is it possible to create an algorithm that is fair for everyone? And should you have the right to know when these algorithms are being used and how they work?
For links to materials referenced in the episode, suggestions for further learning, and guest bios, visit bravenewplanet.org.
Learn more about your ad-choices at https://www.iheartpodcastnetwork.com
See omnystudio.com/listener for privacy information.