In this episode, we spoke with Cornelia C. Walther about her three books examining technology's role in society. Walther, who spent nearly two decades with UNICEF and the World Food Program before joining Wharton's AI & Analytics Initiative, brings field experience from West Africa, Asia, and the Caribbean to her analysis of how human choices shape technological outcomes.
The conversation covered her work on COVID-19's impact on digital inequality, her framework for understanding how values get embedded in AI systems, and her concept of "Aspirational Algorithms"—technology designed to enhance rather than exploit human capabilities. We discussed practical questions about AI governance, who participates in technology development, and how different communities approach technological change. Walther's "Values In, Values Out" framework provided a useful lens for examining how the data and assumptions we feed into AI systems shape their outputs.
The discussion examined the relationship between technology design, social structures, and human agency. We explored how pandemic technologies became normalized, whose voices are included in AI development, and what it means to create "prosocial" technology in practice.
Learn more about your ad choices. Visit megaphone.fm/adchoices
Support our show by becoming a premium member! https://newbooksnetwork.supportingcast.fm/science-technology-and-society
By New Books Network