In this episode, we spoke with Cornelia C. Walther about her three books examining technology's role in society. Walther, who spent nearly two decades with UNICEF and the World Food Programme before joining Wharton's AI & Analytics Initiative, brings field experience from West Africa, Asia, and the Caribbean to her analysis of how human choices shape technological outcomes.
The conversation covered her work on COVID-19's impact on digital inequality, her framework for understanding how values get embedded in AI systems, and her concept of "Aspirational Algorithms"—technology designed to enhance rather than exploit human capabilities. We discussed practical questions about AI governance, who participates in technology development, and how different communities approach technological change. Walther's "Values In, Values Out" framework provided a useful lens for examining how the data and assumptions we feed into AI systems shape their outputs.
The discussion examined the relationship between technology design, social structures, and human agency. We explored how pandemic technologies became normalized, whose voices are included in AI development, and what it means to create "prosocial" technology in practice.
Learn more about your ad choices. Visit megaphone.fm/adchoices
Support our show by becoming a premium member! https://newbooksnetwork.supportingcast.fm/science-technology-and-society
By New Books Network