In this episode of Deep Learning Dialogues, hosts Whitney McKinley and Katrina Gouett explore the complex intersection of artificial intelligence and human values with Father Philip Larrey. Under the title "The Alignment Problem: Bridging the Gap Between Tech and Tradition," the conversation delves into the philosophical distinction between a machine's ability to "select" and a human's capacity to "choose". Father Larrey shares insights from his extensive work with Silicon Valley leaders and the Vatican, addressing critical issues such as the "black box" of AI decision-making, the ethical implications of the European AI Act, and how ancient wisdom from thinkers like Aristotle can provide a necessary framework for navigating the future of generative AI. Read the summary here.
Fr. Philip Larrey, Ph.D., is a Catholic priest and professor of philosophy at Boston College, where his work focuses on the philosophy of knowledge and the impact of the digital era on society. He previously spent over 20 years in Rome, serving as the Chair of Logic and Epistemology and Dean of the Philosophy Department at the Pontifical Lateran University in the Vatican. As the chairman of Humanity 2.0—a non-profit collaborating with the Vatican to promote human flourishing—he is a leading voice in global discussions on AI ethics and has authored several influential books, including Connected World and Artificial Humanity. Based in Boston, Fr. Philip continues to engage with industry giants and international organizations like the United Nations to bridge the gap between technological advancement and ethical responsibility.
Find more information about Father Philip Larrey:
Want to know more?
You can check out our WCDSB GenAI Guidelines, infographics, and Innovation website: https://innovate.wcdsb.ca/
Feedback? You can ask your questions or give us feedback on the show here
Want to get in touch? Contact Katrina & Whitney by email at: [email protected] and [email protected] or on LinkedIn
Hosted on Acast. See acast.com/privacy for more information.
By Katrina Gouett and Whitney McKinley