You'll be as annoyed as me when you learn how much energy a few seconds of AI video costs

Data servers with colourful wires inside a meshed metal cupboard
(Image credit: Unsplash/Taylor Vick)
  • AI chatbots and videos use up a huge amount of energy and water
  • A five-second AI video uses as much energy as a microwave running for an hour or more
  • Data center energy use has doubled since 2017, and AI will account for half of it by 2028

It only takes a few minutes in a microwave to explode a potato you haven't ventilated, but it takes as much energy as running that microwave for over an hour and more than a dozen potato explosions for an AI model to make a five-second video of a potato explosion.

A new study from MIT Technology Review has laid out just how hungry AI models are for energy. A basic chatbot reply might use as little as 114 joules or as much as 6,700 joules, the equivalent of running a standard microwave for between half a second and eight seconds. But it's when things get multimodal that the energy costs skyrocket to 3.4 million joules, an hour or more of microwave time.

It's not a new revelation that AI is energy-intensive, but MIT's work lays out the math in stark terms. The researchers devised what might be a typical session with an AI chatbot, where you ask 15 questions, request 10 AI-generated images, and throw in requests for three different five-second videos.

You can see a realistic fantasy movie scene that appears to be filmed in your backyard a minute after you ask for it, but you won't notice the enormous amount of electricity you've demanded to produce it. You've requested roughly 2.9 kilowatt-hours, or three and a half hours of microwave time.
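The arithmetic behind those comparisons is simple energy-over-power division. Here's a minimal sketch of the conversion, assuming a typical microwave draws about 800 watts (my assumption for illustration; the article doesn't state which wattage it used):

```python
# Back-of-the-envelope conversion of the article's energy figures into
# equivalent microwave running time. Energy (joules) / power (watts) = seconds.

MICROWAVE_WATTS = 800  # assumed typical microwave power draw, not from the article

def joules_to_microwave_seconds(joules: float, watts: float = MICROWAVE_WATTS) -> float:
    """Seconds a microwave drawing `watts` would run to consume `joules`."""
    return joules / watts

# A five-second AI video: about 3.4 million joules
video_minutes = joules_to_microwave_seconds(3.4e6) / 60
print(f"Five-second video: ~{video_minutes:.0f} minutes of microwave time")

# The full "typical session": 2.9 kWh, where 1 kWh = 3.6 million joules
session_joules = 2.9 * 3.6e6
session_hours = joules_to_microwave_seconds(session_joules) / 3600
print(f"Typical session: ~{session_hours:.1f} hours of microwave time")
```

At 800 W, the five-second video works out to roughly 70 minutes and the full session to roughly 3.6 hours, consistent with the article's "hour plus" and "three and a half hours" figures.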

What makes the AI costs stand out is how painless it feels from the user's perspective. You're not budgeting AI messages like we all did with our text messages 20 years ago.

AI energy rethink

Sure, you're not mining bitcoin, and your video at least has some real-world value, but that's a really low bar to step over when it comes to ethical energy use. The rise in energy demands from data centers is also happening at a ridiculous pace.

Data centers had plateaued in their energy use before the recent AI explosion, thanks to efficiency gains. However, the energy consumed by data centers has doubled since 2017, and around half of it will be for AI by 2028, according to the report.

This isn’t a guilt trip, by the way. I can claim professional demands for some of my AI use, but I've employed it for all kinds of recreational fun and to help with personal tasks, too. I'd write an apology note to the people working at the data centers, but I would need AI to translate it for the language spoken in some of the data center locations. And I don't want to sound heated, or at least not as heated as those same servers get. Some of the largest data centers use millions of gallons of water daily to stay frosty.

The developers behind the AI infrastructure understand what's happening. Some are trying to source cleaner energy options; Microsoft, for one, is looking to make deals with nuclear power plants. AI may or may not be integral to our future, but I'd like it if that future weren't full of extension cords and boiling rivers.

On an individual level, your use or avoidance of AI won't make much of a difference, but encouraging better energy solutions from the data center owners could. The most optimistic outcome is developing more energy-efficient chips, better cooling systems, and greener energy sources. And maybe AI's carbon footprint should be discussed like any other energy infrastructure, like transportation or food systems. If we’re willing to debate the sustainability of almond milk, surely we can spare a thought for the 3.4 million joules it takes to make a five-second video of a dancing cartoon almond.

As tools like ChatGPT, Gemini, and Claude get smarter, faster, and more embedded in our lives, the pressure on energy infrastructure will only grow. If that growth happens without planning, we’ll be left trying to cool a supercomputer with a paper fan while we chew on a raw potato.

Eric Hal Schwartz
Contributor

Eric Hal Schwartz is a freelance writer for TechRadar with more than 15 years of experience covering the intersection of the world and technology. For the last five years, he served as head writer for Voicebot.ai and was on the leading edge of reporting on generative AI and large language models. He's since become an expert on the products of generative AI models, such as OpenAI’s ChatGPT, Anthropic’s Claude, Google Gemini, and every other synthetic media tool. His experience runs the gamut of media, including print, digital, broadcast, and live events. Now, he's continuing to tell the stories people want and need to hear about the rapidly evolving AI space and its impact on their lives. Eric is based in New York City.
