
Dr. Nachshon (Sean) Goltz and Dr. John Zeleznikow join me to discuss the ethics of artificial intelligence:
Why it is important to understand how AI makes life-and-death decisions (for example, an autonomous car navigating a busy street, or a supply chain AI allocating scarce medicines to patients)
What happens when the AI makes a life-and-death decision different from the one we would make personally?
We discuss the Trolley Car problem, which postulates that a trolley car is travelling uncontrollably and will kill one of two groups of people. How should the trolley (i.e., an autonomous car or a supply chain AI agent) decide which group lives and which dies? Would you make a different choice?
How does the Jewish tradition approach this question? How does the Holocaust inform how autonomous cars might make this decision?
Dr. Nachshon (Sean) Goltz is a Senior Lecturer in the School of Business and Law at Edith Cowan University in Perth, Australia. You can connect with Sean at his University (https://www.ecu.edu.au/schools/business-and-law/faculty/profiles/senior-lecturer/dr-nachshon-sean-goltz) or on LinkedIn (https://www.linkedin.com/in/nachshon-sean-goltz-b93744132/).
Dr. John Zeleznikow is a Professor at La Trobe University in Melbourne, Australia. You can connect with John via his University profile here: https://scholars.latrobe.edu.au/jzeleznikow.
This podcast was inspired by the article "From the Tree of Knowledge and the Golem of Prague to Kosher Autonomous Cars: The Ethics of Artificial Intelligence Through Jewish Eyes", which was authored by Dr. Sean Goltz, Dr. John Zeleznikow, and Dr. Tracey Dowdeswell, and can be found here (https://academic.oup.com/ojlr/article-abstract/9/1/132/5877403?redirectedFrom=fulltext). Please look for a separate podcast episode with Dr. Dowdeswell on "The 4 Legal Personas of AI".
*Any views and opinions expressed in this podcast are personal and belong solely to the podcast owner or guest and do not represent those of people, institutions or organizations that the owner or guest may or may not be associated with professionally.