Commentary on the environmental impact of AI often swings wildly between doom-and-gloom catastrophism and blind techno-optimism. So where does the truth lie? On 24 July 2025, the symbolic date of Earth Overshoot Day, we sat down with Yves Grandmontagne, founder and editor-in-chief of DCMag (Data Centre Magazine*), to get his take on AI and its real environmental impact. Worth noting: Yves and I explored Silicon Valley's infrastructure innovators together on extensive press tours some years ago, which gave us firsthand insight into the tech industry's approach to these challenges. This annotated transcript summarises our thorough, nuanced and, let's admit it, quite lengthy discussion. Treat it as a starting point for diving deeper into an extremely complex topic.
Exploring the Real Environmental Impact of AI
What’s the real environmental impact of AI? An employee keeps watch over the cooling units at Orange’s data centre in Val de Rueil in Normandy, France — Photo antimuseum.com
* DCMag is only available in French
This post distils what turned out to be an incredibly rich hour-long conversation. The sheer complexity of the topic forced us to dig into multiple technical, economic, and environmental angles, making any truly comprehensive analysis close to impossible.
Drawing on his deep expertise in the data centre and AI sectors, Yves Grandmontagne gives us some much-needed factual perspective on a debate that’s often polarised between doomsday scenarios and over-the-top techno-optimism. To tackle this properly, we decided to take recent quotes—both positive and negative—and fact-check them with our expert.
Yves’s analysis helps us cut through the noise and understand what’s really at stake in this technological breakthrough.
Environmental Impact of AI: Reality Check Time
TLDR: Environmental Impact of AI
The electricity consumption issue is more nuanced than you think* – AI will represent 20-30% of data centre consumption (not double it), and only 2-4% of overall electricity consumption
Energy efficiency gains are actually remarkable – Over the last decade: number of data centres x2, floor space x4, but energy consumption up only 6%
Beware of dubious comparisons – Comparing a ChatGPT query to a Google search is methodologically flawed (completely different technologies and services)
Water consumption varies massively by geography – A huge issue in the US, but Europe has been using smarter closed-loop systems for ages
Tech innovations look promising – New technologies (direct liquid cooling, immersion cooling) are slashing water and energy consumption
AI might actually be part of the solution – It can optimise energy-mix management and electricity transport, which is currently our main bottleneck
Let's get some perspective here – Data centre impact remains pretty marginal compared to the chemical industry (32% of French industrial energy consumption) or agriculture
*All figures from Yves Grandmontagne at Data Centre Magazine
Bottom line: The impact is “real but massively overstated”—we need to put things in context and remain cool and collected.
So here are the quotes about the environmental impact of AI that we wanted to fact-check with Yves.
Rumours vs Reality: Decoding the Doomsday Predictions
Predictions of Booming Electricity Consumption
The first claim I put to Yves Grandmontagne was Synth Media’s prediction [Fr] that “AI’s growth could double data centre electricity consumption by 2026.” His response immediately throws cold water on this alarmist take:
“It’s absolutely true that AI is rolling out infrastructure at breakneck speed and eating up more space in data centres. Sure, it’s going to significantly bump up their energy consumption—no question about that. But will consumption actually double? I seriously doubt it. AI should account for somewhere between 20 and 30% of global data centre consumption.”
A data centre server rack — photo antimuseum.com
AI should account for somewhere between 20 and 30% of global data centre consumption
This reality check reveals something consistent throughout Yves Grandmontagne’s analysis: the absolute need to put numbers in context. The expert points out that this increase is just part of the natural progression tied to our ever-growing digital habits. “What’s driving this consumption increase is our daily usage, whether that’s for work or for personal reasons,” he reminds us, highlighting our collective responsibility in this evolution. It’s a topic we’ve tackled before with a broader focus on digital consumption.
The most striking part of his analysis is about the energy efficiency angle. Contrary to popular belief, data centres aren’t following a consumption curve that mirrors data growth. This improving efficiency is something that gets completely overlooked in public debates about the environmental impact of AI.
ChatGPT’s Carbon Footprint: More Context Needed
When it comes to the 10,113 tonnes of CO2 equivalent attributed to ChatGPT usage in January 2023 (Basta Media – Data for Good, AI has the potential to destroy the planet), Yves Grandmontagne takes a refreshingly pragmatic approach:
“I can’t verify that exact figure. Getting that precise—down to 10,113 tonnes—represents a massive methodological challenge, especially when you’re dealing with AI infrastructures that are distributed systems.”
We asked Yves Grandmontagne, editor-in-chief of DCMag (Data Centre Magazine), to give us the real story about the environmental impact of AI — He gave us facts and figures, which we’ve compiled in the infographic at the end of this post
This observation raises a crucial methodological point: just how tough it is to accurately measure the carbon footprint of distributed infrastructures. The expert does acknowledge this pollution is real, but puts it in perspective: “Those 10,113 tonnes of CO2 still represent volumes significantly smaller than what many other industries pump out.”
This contextualisation isn’t about downplaying the issue—it’s about keeping things proportional. Yves Grandmontagne reminds us of a basic truth that is often overlooked: “The moment we use our smartphones, we become CO2 producers.” This highlights the inconsistency in criticisms that single out AI from our overall digital consumption.
The Google vs ChatGPT Comparison: A Methodological Trap
Watch out for convenient shortcuts between pollution and digital tech—they’re everywhere and pretty handy when you want to hide overall industrial pollution — image created with Midjourney
MIT’s claim that “a ChatGPT query uses ten times more electricity than a Google search [p 9]” perfectly illustrates the danger of oversimplified comparisons, according to Yves Grandmontagne. His take is particularly eye-opening:
“This comparison just doesn’t work. I think we’re making a fundamental methodological error here because we’re comparing two completely different technologies. Google is a search engine that gives you results that you then have to sift through to find what you’re actually looking for. ChatGPT, on the other hand, serves up information that’s already structured and ready to use.”
“The more we use computing, and the more we use AI, the more we consume.” User responsibility matters — images antimuseum.com Orange data centre in Val de Rueil
This analysis shows just how sophisticated you need to be when evaluating the environmental impact of AI. A ChatGPT query might actually replace multiple Google searches plus visits to various websites. That makes direct comparison pretty meaningless. This distinction will probably become moot anyway with the rollout of Google AI Overviews, which will integrate similar functionality to ChatGPT.
The Water Issue: Geography Matters More Than You Think
New data centre cooling techniques work in closed loops. Image antimuseum.com Orange data centre in Val de Rueil
Massive Geographical Differences
One of the most revealing parts of our interview dealt with data centre water consumption. Yves Grandmontagne draws a crucial distinction between American and European practices:
In the United States, when you install a data centre, you’re a private company reaching out to other private companies for water on one side, electricity on the other. And the utility companies that manage energy or water are thrilled to have a massive client that’ll absorb a chunk of their production.
This geopolitical insight explains those 5.4 million litres of fresh water attributed to the training of GPT-3. The issue isn’t the technology itself—it’s local practices and regulations. In Europe, our expert reminds us, “we’ve been developing infrastructure cooling systems that work in closed circuits or are air-based rather than water-based for quite some time”.
Take Orange’s data centre in Val de Rueil, for instance—it’s cooled by the crisp Normandy air. The only exceptions are during heat waves, which are naturally time-limited.
How Cooling Actually Works
Yves Grandmontagne’s technical explanation demystifies the cooling process: “Water acts as a conductor to capture heat.” He then breaks down the dual-circuit system that protects infrastructure while managing thermal exchanges. This approach reveals that warm water discharge from data centres (~20-25°C) stays well below that of nuclear plants (27-35°C for the Gravelines nuclear plant in the North of France). That gives us a useful comparison point for our debate.
Warm water discharge from data centres (around 20-25 degrees) stays well below that of nuclear plants (27-35°C for Gravelines)
We should note that these figures vary with location, season, and technological choices. A 12°C rise in the North Sea, even if reports show nothing alarming at the macro level, is probably not neutral at the micro level over the long term, and warm water discharges from data centres probably aren't completely neutral either. It's another point worth nuancing, though cautiously.
The expert also highlights emerging technologies like Direct Liquid Cooling (DLC) and immersion, which operate in closed circuits and drastically cut water consumption. This technological evolution shows the sector’s ability to adapt to environmental challenges.
More importantly, the water-consumption claim above relates only to the training of the model, an operation that happens once. Everyday usage, multiplied across hundreds of millions of users worldwide, is what adds up to staggering consumption. Once again, this highlights our individual responsibility for the energy and water consumed by data centres and AI.
Moratoriums and Regulations: When Saturation Forces Limits
A power generator at the Orange Data Centre in Normandy (Val de Rueil) — image antimuseum.com
The Amsterdam, Frankfurt, and Dublin Cases
Looking at data centre moratoriums reveals some pretty contrasting situations. Yves Grandmontagne particularly highlights the Irish case. “Remember, Ireland was Europe’s poorest country 30-40 years ago [Editor’s note: the podcast figure is off]. To build an industry and attract capital, they found this solution.”
And it worked—Ireland’s GDP per capita now exceeds $100,000, putting it amongst the world’s best in 2025. [Editor’s note: there’s an infographic with sources at the end of this article]
Today, 30% of Irish energy production goes to data centres. Yves’s reaction? “That’s clearly way too much!” This extreme situation shows the risks of an unregulated approach, but also how governments can respond intelligently. “The Irish government didn’t say ‘no more data centres’—they decided to make new construction conditional on implementing solutions.”
The pragmatic approach adopted by the Celtic Tiger contrasts sharply with outright bans. It offers a middle ground between economic development and environmental constraints.
The Real Problem: Moving Energy Around
One of the biggest revelations from our interview concerns identifying the real bottleneck. “The real problem isn’t production—it’s the grid. The energy grid, meaning getting energy to where it’s actually consumed.”
This analysis puts debates about the environmental impact of AI in a whole new light. France, Yves Grandmontagne reminds us, “is completely energy self-sufficient in production.” But it faces transport infrastructure challenges. This technical perspective shows that solutions don’t necessarily lie in cutting consumption, but in optimising distribution.
Power generators kick in when there are supply issues — Val de Rueil data centre image antimuseum.com
Putting Numbers in Context: AI in the Global Energy Picture
That Telling 2%
On global statistics, Yves Grandmontagne sets the record straight: “Currently, we’re more like 2%. And we’re heading pretty fast towards 4%.” These figures, relative to global electricity consumption, give us some useful perspective.
The expert insists on keeping these percentages in context: “When you pull out a graph showing energy consumption across all industries, if you put industry and data centres side by side, you realise that the latter don’t weigh much.”
The Real Energy Consumers
To provide further context, Yves Grandmontagne lists the truly energy-intensive sectors: “There’s transport, industries, steelworks, agriculture… The latter consumes enormous amounts of energy. We often forget to mention this.”
Our own research backs up this perspective: the chemical industry alone represents 32% of French industrial energy consumption, a striking comparison with the 2-4% of global electricity consumption attributable to data centres.
Techno-solutionism: Promises and Realities
Let’s now turn to techno-solutionists and examine their claims clinically as well.
Sam Altman’s Declarations: Between Ambition and Pragmatism
Sam Altman’s declaration that “1% of global electricity to train powerful AI would be a massive victory” also finds nuanced resonance with Yves Grandmontagne: “It’s true that the powerful AI called ChatGPT will consume perhaps 70% of global AI consumption.”
This analysis reveals the economic reality behind the declarations: Microsoft invested $10 billion in OpenAI, creating a temporary quasi-monopoly. The expert underlines the mechanical nature of this consumption: “You need computing. Computing consumes energy. And since we use more and more of it, it consumes more and more energy. That’s normal.”
AI as Solution to Energy Problems
Satya Nadella’s (Microsoft) vision that “AI can be a powerful accelerator for addressing the climate crisis” finds technical justification with Yves Grandmontagne: “The real problem isn’t production, it’s energy transport.”
The expert explains that managing a complex energy mix (nuclear, renewables, fossil fuels) requires sophisticated piloting tools: “We can only do this efficiently, we know today, by using AI tools.”
This technical perspective reveals that AI isn’t just an energy consumer, but potentially an optimiser of the global energy system. However, Yves Grandmontagne tempers ambitions: “It will be a combination. It’s not one element alone that can help.”
It also seems highly doubtful to us to claim that AI, however much of a “pharmakon” (both remedy and poison) it may be, can serve as a universal cure for a problem that far exceeds it, as we’ve written previously.
Technological Innovations: Towards More Efficient Data Centres
Microsoft’s “Zero Water” Plan: Revolution or Evolution?
Microsoft’s announcement regarding its “zero water” cooling plan illustrates the sector’s technological evolution. Yves Grandmontagne confirms: “This isn’t greenwashing. It’s not really an initiative either—this stuff already exists.”
The expert details the new technologies needed to cope with the dramatic increase in power requirements:
With conventional technologies, until now, we had between 5 and 10 kilowatts coming into a rack. When you add AI, you’re consuming 80 kilowatts at minimum, up to 150 kilowatts.
This progression forces the adoption of Direct Liquid Cooling or immersion, technologies that operate in closed circuits. “That’s the direction of technological evolution for cooling AI data centres,” he concludes.
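To get a feel for the scale of that jump, here is a back-of-envelope calculation (our own arithmetic, not the interview's) using the rack power figures Yves quotes above:

```python
# Rough illustration of the power-density jump that forces new cooling methods.
# Figures from the interview: conventional racks draw 5-10 kW, AI racks 80-150 kW.
conventional_kw = (5, 10)
ai_kw = (80, 150)

# Best case: the lightest AI rack vs the heaviest conventional rack.
low_ratio = ai_kw[0] / conventional_kw[1]
# Worst case: the heaviest AI rack vs the lightest conventional rack.
high_ratio = ai_kw[1] / conventional_kw[0]

print(f"AI racks draw roughly {low_ratio:.0f}x to {high_ratio:.0f}x more power per rack")
```

An 8x to 30x jump in heat produced per rack is well beyond what air circulation alone can evacuate, which is why closed-circuit liquid cooling becomes the default rather than an option.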
Energy Efficiency: Remarkable Progress
A particularly revealing statistic emerges from the interview:
Over the last decade, the number of data centres has doubled, their floor space has quadrupled, and yet their energy consumption has increased by only 6%.

This illustrates considerable progress in energy efficiency, often overlooked in debates about the environmental impact of AI, and it reveals the sector's capacity to reconcile growth with energy optimisation.
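What those three growth factors imply for efficiency per facility and per square metre can be checked with simple arithmetic (our own back-of-envelope calculation, normalising the decade-ago baseline to 1.0):

```python
# Back-of-envelope check of the decade's efficiency figures quoted above.
# Baseline values a decade ago are normalised to 1.0; only growth factors matter.
centres_growth = 2.0   # number of data centres doubled
floor_growth = 4.0     # floor space quadrupled
energy_growth = 1.06   # total energy consumption up only 6%

# Relative change in energy use per facility and per unit of floor space.
energy_per_centre = energy_growth / centres_growth
energy_per_m2 = energy_growth / floor_growth

print(f"Energy per data centre: {energy_per_centre:.2f}x the old level")
print(f"Energy per unit of floor space: {energy_per_m2:.3f}x the old level")
```

In other words, if the quoted figures hold, the average facility consumes roughly half what it did, and each square metre of floor space consumes roughly a quarter: a substantial efficiency gain hiding behind a modest-looking 6% total increase.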
AI Geopolitics: American vs European Stakes
The Geographic Concentration of Infrastructure
An often-overlooked aspect of the debate lies in AI geography. Yves Grandmontagne reminds us: “For now, all AI production happens in the United States, not Europe.” This geographic concentration puts European criticism in perspective whilst highlighting our technological dependence.
The expert mentions the €109 billion in AI investments announced by President Macron, of which “a third, or even half will concern data centres.” This strategy aims to reduce our dependence whilst developing a more efficient European sector.
American Challenges: Infrastructure and Regulation
Yves Grandmontagne’s analysis reveals the weaknesses of the American model: “In the United States, they have major weaknesses” regarding energy transport. This infrastructure failure partly explains the criticised overconsumption and justifies the different approaches adopted in Europe.
Meta’s project in Texas (5 gigawatts across an area equivalent to Manhattan) illustrates this problem: “Meta doesn’t have the energy, so it’s looking for it.” This situation contrasts with Europe’s more integrated and regulated approach.
AI: A Revolution Comparable to Fire? (Pichai)
Sundar Pichai’s declaration (Google) that “AI is more important than fire or electricity” finds nuanced resonance with Yves Grandmontagne: “It’s a real revolution that will go much further than previous ones, because those were more industrial. This one is a revolution with an impact on our daily lives.”
This analysis highlights the specificity of the AI revolution: its speed and penetration into daily life. “In barely two years, it’s been a real wave that’s engulfed us,” observes the expert, contrasting with the hundreds of thousands of years it took to adopt fire.
Employment and Usage Challenges
Beyond environmental questions, Yves Grandmontagne expresses his social concerns:
I’m amongst those who still worry that it will massively impact jobs, and I think we need to be realistic about this.
This social dimension of the environment reveals the complexity of trade-offs to come. The expert nevertheless encourages adoption: “We mustn’t cut ourselves off from this potential either, because extraordinary things will come to fruition soon.” It’s not easy to form an opinion on this employment issue given how contradictory the viewpoints are.
Recommendations and Critical Vigilance
Yves Grandmontagne’s final message to students (since this post was created in preparation for our course on AI and content creation) particularly resonates:
Be careful about the discourse you listen to, who’s behind it, etc. It’s easy to twist information
This warning against disinformation highlights the political instrumentalisation of the environmental debate.
The Digital Economy: A Forgotten Dimension
An often-overlooked perspective emerges: “When a data centre sets up in a city, in its wake, an entire digital economy develops. But we forget to mention that.” The expert quantifies: “One job created in a data centre means 100 jobs behind it that will be developed in the region.”
This economic dimension reveals the complexity of territorial trade-offs and the need for global rather than sectoral approaches.
In Conclusion: Keep Calm and Take a Step Back
Yves Grandmontagne’s expertise reveals a landscape far more nuanced than polarised environmental debates suggest. His main conclusions are worth holding on to.
The environmental impact of AI is “real, but largely overestimated,” in his words. This reality can be explained by several factors: the remarkable progress in energy efficiency, the relativity of consumption compared to other industrial sectors, and the constant technological evolution towards greater efficiency.
Geography matters enormously in this equation. American practices, often criticised, don’t reflect the more regulated and efficient European approaches. This geographical distinction invites us to contextualise criticism and encourage good practices.
The technological future is heading towards more efficient solutions: closed-circuit cooling, edge AI, energy optimisation by AI itself. These developments suggest that current problems are transitional and technical rather than structural.
Finally, the systemic dimension revealed by this analysis invites us to move beyond sectoral approaches. AI isn’t just an energy consumer, but potentially an optimiser of the global energy system. This systemic perspective may be the key to truly sustainable artificial intelligence development.
Yves Grandmontagne’s message resonates as a call for nuance and critical vigilance: “Let’s remain cautious and serene.” In an often emotional debate, this technical wisdom offers precious guidance for navigating between the pitfalls of denial and catastrophism. This more nuanced discourse is undoubtedly less marketable than extreme opinions, which fuel coffee shop chatter and social media.
This interview ultimately shows that the question of AI’s environmental impact doesn’t have a simple answer, but requires a systemic approach that’s geographically situated and technically informed. It makes up an essential starting point for all those who wish to move beyond preconceived ideas and contribute to a genuinely constructive debate about the future of our digital society.
Infographic on AI’s Environmental Impact
The post The Truth About the Environmental Impact of AI appeared first on Marketing and Innovation.