In this episode, we spoke with Cornelia C. Walther about her three books examining technology's role in society. Walther, who spent nearly two decades with UNICEF and the World Food Program before joining Wharton's AI & Analytics Initiative, brings field experience from West Africa, Asia, and the Caribbean to her analysis of how human choices shape technological outcomes.
The conversation covered her work on COVID-19's impact on digital inequality, her framework for understanding how values get embedded in AI systems, and her concept of "Aspirational Algorithms"—technology designed to enhance rather than exploit human capabilities. We discussed practical questions about AI governance, who participates in technology development, and how different communities approach technological change. Walther's "Values In, Values Out" framework provided a useful lens for examining how the data and assumptions we feed into AI systems shape their outputs.
The discussion examined the relationship between technology design, social structures, and human agency. We explored how pandemic technologies became normalized, whose voices are included in AI development, and what it means to create "prosocial" technology in practice.
Learn more about your ad choices. Visit megaphone.fm/adchoices
Support our show by becoming a premium member! https://newbooksnetwork.supportingcast.fm/technology
By New Books Network