If you know anything about renewable energy, you have probably heard about its intermittency problem: the sun doesn't always shine and the wind doesn't always blow, so renewables can't provide the kind of consistent power generation that fossil fuel or nuclear plants do.
There is also the problem of overproduction: too much renewable generation at once can cause power surges in the electrical grid, so excess energy must essentially be dumped, wasting its potential usefulness.
While battery technology has made strides in recent years, large-scale energy storage still isn't practical. Until it is, renewables' intermittency problem stands in the way of widespread adoption.
That is the challenge that researchers at the University of Southern California are trying to solve with a novel solution: information batteries.
The idea of an information battery isn't that strange if you think about it. The problem we are trying to solve is to make renewables more consistently productive. The whole reason we produce energy is to convert it into some kind of practical work, whether that's driving a car's motor, running your home's air conditioning system, or powering a Google data center.
While the long-term solution to providing all of these with renewable power is traditional battery technology, there are steps we can take to speed things along. That's where information batteries come into play.
"The way things are going, in five years, the amount of renewable power wasted in California each year will be equivalent to the amount of power L.A. uses each year," said Barath Raghavan, an assistant professor in computer science at USC's Viterbi School of Engineering.
Finding a productive use for the excess energy could go a long way toward balancing the power demands put on renewables during periods of low power generation. The idea is to effectively move that energy use from periods of underproduction forward into periods of overproduction and store the result of that work for later use.
You can't really do that with a car or an air conditioner, but you can do that with data processing, which is where the information battery concept comes in.
Storing energy from renewables is hard, but storing data is incredibly easy. So rather than store the energy from a solar panel for a data center to use at night, have the data center perform predictable computations during periods of overproduction and store those results until they are needed later, which is a much less energy-intensive operation.
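In software terms, the idea resembles a precompute-and-cache pattern. Here is a minimal sketch of that pattern; all names are illustrative and not from the USC paper:

```python
# Hypothetical sketch of an "information battery" cache: precompute results
# while surplus renewable power is available, then serve them from storage
# later instead of recomputing them on demand.
cache = {}

def expensive_computation(x):
    # Stand-in for an energy-hungry data-center task.
    return sum(i * i for i in range(x))

def charge(inputs):
    """During overproduction, 'charge' the battery by precomputing results."""
    for x in inputs:
        cache[x] = expensive_computation(x)

def query(x):
    """During underproduction, serve from storage whenever possible."""
    if x in cache:
        return cache[x]              # cheap: just a storage read
    return expensive_computation(x)  # fallback: compute in real time

charge([10, 100, 1000])  # run while solar/wind output is high
result = query(100)      # later, served from the cache with no recompute
```

The energy "stored" here is simply the work that no longer has to be redone when power is scarce.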
"We had the observation that if we can predict possible computations that might occur in the future, we can do those computations now, while there is energy available, and store the results, which now have embodied energy," Raghavan said.
How could you predict computing work ahead of time?
One of the key features of the information battery idea, which Raghavan and Jennifer Switzer, a Ph.D. student at the University of California, San Diego, describe in a recent paper published in the ACM Energy Informatics Review, is that a lot of computational work is known in advance.
"Imagine a large computational task is like a big jigsaw puzzle, where each piece is a chunk of computation," Raghavan told TechRadar this week. "You could do it all at once – a one-piece puzzle – if you know what all the computation will be in advance. But often you don't know 100% of what a future task will be. So instead you could imagine fragmenting that large computation into many smaller puzzle pieces.
"While not all can be done in advance, many can be. So only a small amount needs to be done in real time (the few small pieces that weren't pre-computable), with the rest taking advantage of speculatively executed computed pieces."
What's more, many of those computations are likely to be reusable across different applications, so the energy saved by avoiding repetitive, real-time computing work can really start to add up.
Doesn't it cost energy to store all that data?
Yes, but not nearly on the same scale.
To read, write, or otherwise interact with the information battery, you would obviously need to expend energy. But with long-term storage, once data is written to the battery and indexed to make it easily accessible, the energy cost to use it is minuscule compared to the energy required to recompute that same data in real time.
"It depends on the storage medium and the type of computation, but we're talking far, far more efficient in general," Raghavan told us.
"As a very rough calculation (very much back of the envelope), a high-end server hard disk has an embodied energy of about 2 GJ, which is a little more than a smartphone and would work out to about 1 kJ/MB in steady state, and the MB here is the output data (which will have the embodied energy of the computation).
"Its operational power use is small – about 4W."
Would those energy savings add up enough to bridge renewable energy's intermittency gap? Maybe not on their own, given that the amount of energy required to run the world's computers is growing at a rapidly accelerating rate. But before we can really talk about producing more energy, making sure we fully utilize the energy we already produce is a major step in the right direction.
John (He/Him) is the Components Editor here at TechRadar and he is also a programmer, gamer, activist, and Brooklyn College alum currently living in Brooklyn, NY.
Named by the CTA as a CES 2020 Media Trailblazer for his science and technology reporting, John specializes in all areas of computer science, including industry news, hardware reviews, PC gaming, as well as general science writing and the social impact of the tech industry.
You can find him online on Threads @johnloeffler.
Currently playing: Baldur's Gate 3 (just like everyone else).