
Varun Sivaram is Founder and CEO of Emerald AI, a company building software that makes AI data centers power-flexible. As AI data centers become one of the fastest-growing sources of electricity demand, grid constraints are emerging as a critical bottleneck for compute deployment.
In this episode, the conversation focuses on why power availability — not GPUs — is increasingly the limiting factor for AI. Data centers concentrate massive electrical loads in specific locations, creating grid stress, long interconnection delays, and rising electricity costs for surrounding communities. Traditional grid expansion alone is too slow to meet near-term AI demand.
Emerald AI’s response is to treat AI data centers as flexible loads rather than fixed ones. Its software coordinates compute with grid conditions by shifting workloads across time, geography, and on-site energy resources like batteries. The episode walks through real-world demonstrations, including a published field trial showing a 25% power reduction during grid stress without degrading compute performance. The discussion frames flexible load as one of the fastest ways to unlock power for AI while improving grid stability.
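To make the "flexible load" idea concrete, here is a minimal sketch (not Emerald AI's actual software) of a scheduler that defers flexible compute jobs when the grid reports stress, targeting roughly the 25% reduction discussed in the episode. All names here (Job, plan_curtailment, grid_stress) are hypothetical and exist only to illustrate the concept.

```python
# Toy illustration of flexible-load scheduling for an AI data center.
# Assumption: each job advertises its power draw and whether it can be deferred.
from dataclasses import dataclass

@dataclass
class Job:
    name: str
    power_kw: float   # estimated power draw while running
    deferrable: bool  # True if the job can be paused or shifted in time

def plan_curtailment(jobs: list[Job], grid_stress: bool, target_reduction: float = 0.25):
    """Return (jobs_to_run, jobs_to_defer) for the next scheduling interval."""
    if not grid_stress:
        return jobs, []

    total_kw = sum(j.power_kw for j in jobs)
    target_kw = total_kw * (1 - target_reduction)  # e.g. shed ~25% under stress

    running, deferred, load = [], [], 0.0
    # Keep non-deferrable jobs first, then fill remaining headroom with flexible ones.
    for job in sorted(jobs, key=lambda j: j.deferrable):
        if not job.deferrable or load + job.power_kw <= target_kw:
            running.append(job)
            load += job.power_kw
        else:
            deferred.append(job)  # shift in time, to another site, or onto batteries
    return running, deferred

# Example: inference serving stays up; one training run is deferred under stress.
jobs = [
    Job("inference-serving", 400, deferrable=False),
    Job("training-run-a", 300, deferrable=True),
    Job("training-run-b", 300, deferrable=True),
]
run, defer = plan_curtailment(jobs, grid_stress=True)
print([j.name for j in run], [j.name for j in defer])
```

A real system would also weigh workload priorities, battery state of charge, and cross-site capacity, but the core move is the same: treat some compute as schedulable rather than as a fixed draw.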
Episode recorded on Feb 2, 2026 (Published on Feb 10, 2026)
In this episode, we cover:
Links:
Enjoyed this episode? Please leave us a review! Share feedback or suggest future topics and guests at [email protected].
Connect with MCJ:
*Editing and post-production work for this episode was provided by The Podcast Consultant
By MCJ · 4.8 (165 ratings)