
Can artificial intelligence reliably act in ways that benefit humans? This week I sit down with Greg Buckner, co-founder of AE Studio, to explore the increasingly urgent field of AI safety and how his team is taking on the challenge of making powerful AI systems safer, more interpretable, and more aligned with humanity.
As one of the leading voices working on the alignment problem, Greg explains how AI systems can cheat, ignore instructions, or deceive users, and why these behaviors emerge in the first place. AE Studio’s research is laying the groundwork for a future where advanced AI strengthens human agency instead of undermining it.
About Greg Buckner:
Greg Buckner is the co-founder of AE Studio, an AI and software consulting firm focused on increasing human agency. At AE, Greg works on AI alignment research, ensuring advanced AI systems remain reliable and aligned with humanity as they become more capable, including collaborations with major universities, frontier labs, and DARPA. He also works closely with enterprise and startup clients to solve hard problems with AI, from building an AI-enabled school whose students rank in the top 1% nationally to generating millions in incremental revenue for major companies.
Follow Greg on LinkedIn @gbuckner
Related Reading:
AE Studio's AI Alignment Work:
https://www.ae.studio/alignment
WSJ: AI Is Learning to Escape Human Control:
https://www.wsj.com/opinion/ai-is-learning-to-escape-human-control-technology-model-code-programming-066b3ec5
WSJ: The Monster Inside ChatGPT:
https://www.wsj.com/opinion/the-monster-inside-chatgpt-safety-training-ai-alignment-796ac9d3
See Privacy Policy at https://art19.com/privacy and California Privacy Notice at https://art19.com/privacy#do-not-sell-my-info.
By Cory Corrine and Dear Media
