

Can artificial intelligence reliably act in ways that benefit humans? This week I sit down with Greg Buckner, co-founder of AE Studio, to explore the increasingly urgent world of AI safety. We discuss how Greg’s team is taking on the challenge of making powerful AI systems safer, more interpretable, and better aligned with humanity.
As one of the leading voices working on the alignment problem, Greg explains how AI systems can cheat, ignore instructions, or deceive users, and why these behaviors emerge in the first place. AE Studio’s research is laying the groundwork for a future where advanced AI strengthens human agency instead of undermining it.
About Greg Buckner:
Greg Buckner is the co-founder of AE Studio, an AI and software consulting firm focused on increasing human agency. At AE, Greg works on AI alignment research, ensuring advanced AI systems remain reliable and aligned with humanity as they become more capable, including collaborations with major universities, frontier labs, and DARPA. Greg also works closely with enterprise and startup clients to solve hard problems with AI, from building an AI-enabled school where students rank in the top 1% nationally to generating millions in incremental revenue for major companies.
Follow Greg on LinkedIn @gbuckner
Related Reading:
AE Studio's AI Alignment Work:
https://www.ae.studio/alignment
WSJ: AI Is Learning to Escape Human Control:
https://www.wsj.com/opinion/ai-is-learning-to-escape-human-control-technology-model-code-programming-066b3ec5
WSJ: The Monster Inside ChatGPT:
https://www.wsj.com/opinion/the-monster-inside-chatgpt-safety-training-ai-alignment-796ac9d3
See Privacy Policy at https://art19.com/privacy and California Privacy Notice at https://art19.com/privacy#do-not-sell-my-info.
By Cory Corrine and Dear Media
