In Conversation: An OUP Podcast

The AI Mirror: How to Reclaim Our Humanity in an Age of Machine Thinking


Listen Later

For many, technology offers hope for the future―that promise of shared human flourishing and liberation that always seems to elude our species. Artificial intelligence (AI) technologies spark this hope in a particular way. They promise a future in which human limits and frailties are finally overcome―not by us, but by our machines. Yet rather than open new futures, today's powerful AI technologies reproduce the past. Forged from oceans of our data into immensely powerful but flawed mirrors, they reflect the same errors, biases, and failures of wisdom that we strive to escape. Our new digital mirrors point backward. They show only where the data say that we have already been, never where we might venture together for the first time. To meet today's grave challenges to our species and our planet, we will need something new from AI, and from ourselves. 

In The AI Mirror: How to Reclaim Our Humanity in an Age of Machine Thinking (Oxford UP, 2024), Shannon Vallor makes a wide-ranging, prophetic, and philosophical case for what AI could be: a way to reclaim our human potential for moral and intellectual growth, rather than lose ourselves in mirrors of the past. Rejecting prophecies of doom, she encourages us to pursue technology that helps us recover our sense of the possible, and with it the confidence and courage to repair a broken world. Professor Vallor calls us to rethink what AI is and can be, and what we want to be with it.

Our guest is: Professor Shannon Vallor, who is the Baillie Gifford Professor in the Ethics of Data and AI at the University of Edinburgh, where she directs the Centre for Technomoral Futures in the Edinburgh Futures Institute. She is a standing member of Stanford's One Hundred Year Study of Artificial Intelligence (AI100) and a member of the Oversight Board of the Ada Lovelace Institute. Professor Vallor joined the Futures Institute in 2020 following a career in the United States as a leader in the ethics of emerging technologies, including a post as a visiting AI Ethicist at Google from 2018 to 2020. She is the author of The AI Mirror: How to Reclaim Our Humanity in an Age of Machine Thinking and Technology and the Virtues: A Philosophical Guide to a Future Worth Wanting, and the editor of The Oxford Handbook of Philosophy of Technology. She serves as an advisor to government and industry bodies on responsible AI and data ethics, and is Principal Investigator and Co-Director of the UKRI research programme BRAID (Bridging Responsible AI Divides), funded by the Arts and Humanities Research Council.

Our host is: Dr. Christina Gessler, who is the creator and producer of the Academic Life podcast.

Listeners may enjoy this playlist:

  • More Than A Glitch
  • Artificial Unintelligence: How Computers Misunderstand the World

Welcome to Academic Life, the podcast for your academic journey, and beyond! You can support the show by downloading and sharing episodes. Join us again to learn from more experts inside and outside the academy, and around the world. Missed any of the 250+ Academic Life episodes? Find them here. And thank you for listening!

In Conversation: An OUP Podcast, by New Books Network


