Sam Altman says ChatGPT water use claims are 'completely untrue' — but admits AI energy use is a concern

Sam Altman
(Image credit: Getty Images/Bloomberg)

  • Sam Altman dismisses claims about ChatGPT’s water usage as “totally fake”
  • Experts warn that scaling AI infrastructure is driving huge costs and increasing pressure on power, cooling, and resources
  • The real issue isn’t efficiency — it’s whether AI can grow at this scale without serious environmental impact

Speaking at an event hosted by The Indian Express, OpenAI CEO Sam Altman dismissed claims that AI’s water usage is high as “totally fake”, but he did acknowledge that it had been an issue in the past when “we used to do evaporative cooling in data centers.”

“Now that we don’t do that, you see these things on the internet like, ‘Don’t use ChatGPT, it’s 17 gallons of water for each query’ or whatever,” Altman said. “This is completely untrue, totally insane, no connection to reality.”

You can find this segment at around 27 minutes in the video of the event:

Sam Altman Unfiltered: ChatGPT, AI Risks & What’s Coming Next, 40 Questions in 60 Minutes - YouTube

Altman did concede that concerns around AI’s overall energy consumption are “fair”, noting that “the world is now using so much AI” and that “we need to move towards nuclear or wind and solar very quickly”.

AI-specific data centers already leave a larger and more complex footprint than traditional facilities, and several groups have raised concerns about their environmental impact — particularly around rising electricity demand, water usage, and the construction of new infrastructure. That build-out is also having knock-on effects, including increased demand for components like RAM, which is pushing up prices across the industry.

IBM CEO Arvind Krishna has previously raised doubts about whether the current pace and scale of AI data center expansion is financially sustainable. He estimates that equipping a single 1GW site with compute hardware now costs close to $80 billion — and with plans for nearly 100GW of capacity dedicated to advanced AI training, the total potential spend could approach a staggering $8 trillion.

Meanwhile, AI’s new wave of ultra-powerful accelerators is pushing data centers to breaking point, forcing a rethink of power, cooling, and connectivity. Hardware that felt cutting-edge just a few years ago can’t keep up, as modern AI workloads demand a complete overhaul of everything from rack design to thermal strategy.

Newsflash: humans require a lot of energy too

As well as dismissing claims about ChatGPT’s water usage, Altman also offered a more unusual defense of OpenAI’s overall energy use. He argued that discussions around AI’s energy consumption were “unfair” because they don’t account for how much energy it takes to train humans to perform similar tasks.

It also takes a lot of energy to train a human.

Sam Altman, CEO OpenAI

“But it also takes a lot of energy to train a human,” Altman said. “It takes like 20 years of life and all of the food you eat during that time before you get smart. And not only that, it took the very widespread evolution of the 100 billion people that have ever lived and learned not to get eaten by predators and learned how to figure out science and whatever, to produce you.”

He continued: “If you ask ChatGPT a question, how much energy does it take once its model is trained to answer that question versus a human? And probably, AI has already caught up on an energy efficiency basis, measured that way.”

I can see the argument Altman is making — that human intelligence also comes with an energy cost — but it feels reductive, and faintly cynical, to reduce the value of a human life to its energy consumption. More importantly, it sidesteps the real issue. The question isn’t whether humans also use energy (of course they do!) but whether scaling AI to billions of daily queries introduces entirely new levels of demand that we haven’t had to account for before. Comparing the lifetime energy cost of a human to the marginal cost of an AI response might be provocative, but it’s not especially useful.
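To see why the comparison is provocative but not especially useful, it helps to put rough numbers on it. The figures below are illustrative assumptions only (a commonly cited ballpark of around 0.3 Wh per ChatGPT query, and a typical 2,000 kcal daily food intake over Altman's "20 years of life"), not measured values from OpenAI:

```python
# Back-of-envelope sketch of the energy framing Altman invokes.
# All input figures are illustrative assumptions, not measured data.

WH_PER_QUERY = 0.3      # assumed inference energy per ChatGPT query (Wh)
KCAL_PER_DAY = 2000     # assumed human daily food-energy intake (kcal)
YEARS = 20              # Altman's "20 years of life" figure
WH_PER_KCAL = 1.163     # exact unit conversion: 1 kcal = 1.163 Wh

# Total food energy consumed over 20 years of human "training"
human_training_wh = KCAL_PER_DAY * 365 * YEARS * WH_PER_KCAL

# How many AI queries that same energy budget would cover
queries_equivalent = human_training_wh / WH_PER_QUERY

print(f"Human 'training' energy: ~{human_training_wh / 1e6:.1f} MWh")
print(f"Equivalent ChatGPT queries: ~{queries_equivalent / 1e6:.0f} million")
```

Under these assumptions, one human lifetime of food energy covers on the order of tens of millions of queries, which is exactly why the framing breaks down: a service fielding billions of queries a day burns through that budget many times over, daily, before counting training runs or cooling.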

What Altman’s comments highlight is a growing tension at the heart of the AI boom. The technology may be getting smarter and more efficient, but the scale at which it’s being deployed is growing even faster, raising fresh concerns about its long-term environmental impact, including pressure on global water supplies. The UN has already warned that the world has entered an “era of global water bankruptcy,” underlining just how fragile those resources have become.

Those questions aren’t going away. As AI adoption accelerates, the real challenge won’t just be how efficient the technology becomes, but whether it can scale sustainably at all.



Graham Barlow
Senior Editor, AI

Graham is the Senior Editor for AI at TechRadar. With over 25 years of experience in both online and print journalism, Graham has worked for various market-leading tech brands including Computeractive, PC Pro, iMore, MacFormat, Mac|Life, Maximum PC, and more. He specializes in reporting on everything to do with AI and has appeared on BBC TV shows like BBC One Breakfast and on Radio 4 commenting on the latest trends in tech. Graham has an honors degree in Computer Science and spends his spare time podcasting and blogging.
