Sam Altman doesn’t think you should be worried about ChatGPT’s energy usage - reveals exactly how much power each prompt uses
How much water is billions of teaspoons?

- Sam Altman says a ChatGPT prompt uses "0.34 watt-hours" of electricity, roughly what an oven uses in just over one second
- He also says a single ChatGPT prompt uses "0.000085 gallons of water; roughly one-fifteenth of a teaspoon"
- While that's not a lot in isolation, ChatGPT has over 400 million weekly users, with multiple prompts per day
OpenAI CEO Sam Altman has revealed ChatGPT's energy usage for a single prompt, and while it's lower than you might expect, at global scale it could still have a significant impact on the planet.
Writing on his blog, Altman said, "The average query uses about 0.34 watt-hours, about what an oven would use in a little over one second, or a high-efficiency lightbulb would use in a couple of minutes. It also uses about 0.000085 gallons of water; roughly one-fifteenth of a teaspoon."
While that might not sound like a lot for an isolated prompt, ChatGPT has approximately 400 million weekly active users, and that number is growing at a rapid rate. Bear in mind there's a growing number of AI tools and chatbots on the market, including Google Gemini and Anthropic's Claude, so overall AI energy usage will be even higher.
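To put that scale in perspective, here's a rough back-of-envelope sketch in Python. The per-prompt electricity and water figures are the ones Altman quotes; the number of prompts per user per week is a hypothetical assumption for illustration only, not a reported statistic.

```python
# Back-of-envelope scaling of Altman's per-prompt figures to global usage.
# Per-prompt numbers come from Altman's blog post; the prompts-per-user
# figure below is a purely illustrative assumption.

WH_PER_PROMPT = 0.34             # watt-hours per prompt (Altman's figure)
GALLONS_PER_PROMPT = 0.000085    # gallons of water per prompt (Altman's figure)

WEEKLY_USERS = 400_000_000       # approximate weekly active users
PROMPTS_PER_USER_PER_WEEK = 20   # hypothetical assumption for illustration

weekly_prompts = WEEKLY_USERS * PROMPTS_PER_USER_PER_WEEK

weekly_mwh = weekly_prompts * WH_PER_PROMPT / 1_000_000   # convert Wh to MWh
weekly_gallons = weekly_prompts * GALLONS_PER_PROMPT

print(f"Prompts per week:     {weekly_prompts:,.0f}")
print(f"Electricity per week: {weekly_mwh:,.0f} MWh")
print(f"Water per week:       {weekly_gallons:,.0f} gallons")
```

Under those assumptions, the totals come out to roughly 2,700 MWh of electricity and around 680,000 gallons of water per week; the point isn't the exact number, but how quickly a tiny per-prompt figure multiplies across hundreds of millions of users.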
Last month, we reported on a study from MIT Technology Review which found that a five-second AI video uses as much energy as a microwave running for an hour or more. While the per-prompt figure Altman reveals is nowhere near that high, there are still concerns given how much people interact with AI.
We rely on AI, so is this energy consumption a concern?
Concern about ChatGPT's energy consumption is constant, and it is growing louder as AI usage continues to rise. While Altman's blog post will put some minds at ease, given the relatively low energy and water usage in isolation, it could also spark further uproar.
Earlier this week, a mass ChatGPT outage left millions of people unable to interact with the chatbot. Over that 10-plus-hour period, I received emails from thousands of readers who gave me a new perspective on AI.
While I'd be lying if I said AI's energy consumption doesn't concern me, it would be unfair to overlook the positives of the technology and how it is improving the lives of millions.
The climate crisis affects all of us, but unfortunately it's the working class that ultimately pays the price. ChatGPT's energy consumption at mass scale may become a severe problem in the future, but then again, so are the private jets flying 10-minute flights.
The AI climate concerns are not black and white, and those who criticise the impact of the technology on the planet are equally vocal about the impact of other technologies. That said, we're only at the beginning of the AI revolution, and energy consumption will continue to rise. At what point should we be worried?
John-Anthony Disotto is TechRadar's Senior Writer, AI, bringing you the latest news on, and comprehensive coverage of, tech's biggest buzzword. An expert on all things Apple, he was previously iMore's How To Editor, and has a monthly column in MacFormat. John-Anthony has used the Apple ecosystem for over a decade, and is an award-winning journalist with years of experience in editorial.