Data centers don’t need to apologize for energy use


Well before 2022's energy shortage hit Europe in earnest, data centers were squarely in the firing line for their energy consumption.

Much of this concern has centered on data centers' contribution to global emissions, but we've also seen their energy requirements singled out as the source of some major domestic woes. One example was when west London's data centers were blamed for a freeze in the region's housebuilding, and some markets have even seen calls for moratoriums on data center construction.

About the author

David Friend is the co-founder and CEO of Wasabi.

It's certainly true that data centers' energy footprint has grown rapidly, with the relatively young sector already consuming up to 1.5% of the world's electricity.

However, this electricity consumption isn't due to the sector being wasteful. Data centers are the best possible solution for enabling the digital economy in a sustainable and energy-efficient way. Ultimately, this comes down to a simple reason: they enable very high utilization of computing and storage capacity.

Data centers and utilization

Because of the modern data center, we’ve been able to dramatically improve the utilization of computing and storage resources – that is, increasing the amount of work we can do with a given amount of computing and storage power.

To understand why this is important, let's quickly look at the underlying economics of data centers. If an individual user installs a new hard drive in their PC, that hard drive costs the same regardless of whether they fully use its storage capacity.

By contrast, imagine that same person renting storage from a cloud provider running that hard drive in a data center. While their data will still end up stored on that hard drive, they’ll pay based on how much of that disk they’re utilizing - not the underlying value of the hardware itself.

For the data center operator, any unused space on their hard drives represents space that could be rented out to a customer. That means they have a clear incentive to eliminate any wasted storage capacity and to utilize their storage space as fully as possible. And the same rule applies to computing - data center operators want to ensure that processors waste the least amount of energy and time possible to maximize their utilization.
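The utilization economics above can be sketched with a quick calculation. All of the figures below (drive price, capacity, fill level, and per-GB rate) are hypothetical, chosen only to illustrate how the cost per *used* gigabyte diverges between owning a drive and renting utilized capacity:

```python
# Hypothetical figures to illustrate the utilization argument - not real pricing.
DRIVE_PRICE = 100.0        # buying a drive outright: a fixed cost
DRIVE_CAPACITY_GB = 4000   # assumed 4 TB drive
USED_GB = 1200             # the owner fills only ~30% of it

# Owning: the fixed cost is spread over only the gigabytes actually used,
# so the effective price per used GB rises as utilization falls.
own_cost_per_used_gb = DRIVE_PRICE / USED_GB          # ~$0.083 per used GB
best_case_per_gb = DRIVE_PRICE / DRIVE_CAPACITY_GB    # $0.025 if fully used

# Renting from a cloud provider: you pay per GB stored, so your cost
# tracks usage regardless of how full the underlying hardware is.
CLOUD_RATE_PER_GB = 0.006                             # hypothetical monthly $/GB
rent_monthly = USED_GB * CLOUD_RATE_PER_GB            # scales with actual use

print(f"Own: ${own_cost_per_used_gb:.3f}/used GB (vs ${best_case_per_gb:.3f} "
      f"at full utilization); rent: ${rent_monthly:.2f}/month")
```

The gap between the owner's per-used-GB cost and the full-utilization cost is exactly the waste the data center operator is incentivized to eliminate.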

High utilization and sustainability

There are several reasons why high utilization means more sustainable operations. First, there’s a simple improvement in energy efficiency. The average hard drive uses roughly 7 watts of power, regardless of whether the disk is full or empty. Across the world’s roughly two billion PCs, that’s 14 billion watts - 14 gigawatts - of electricity.

Given that the average hard disk sitting in a personal computer is around two-thirds empty, we can deduce that globally over nine gigawatts of power are wasted on unused storage. For reference, that's equivalent to around three Sizewell C nuclear plants running at full capacity just to power the empty storage space on the world's PCs.
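The arithmetic behind this estimate can be checked in a few lines. The inputs are the article's own figures; the Sizewell C capacity of roughly 3.2 GW is an assumption added here for the comparison:

```python
# Back-of-the-envelope check of the article's estimate - inputs are the
# article's own rough figures, not measured data.
HDD_POWER_W = 7            # average hard drive draw, full or empty
PC_COUNT = 2_000_000_000   # roughly two billion PCs worldwide
EMPTY_FRACTION = 2 / 3     # average share of a PC disk sitting empty

total_gw = HDD_POWER_W * PC_COUNT / 1e9    # 14 GW across all PC drives
wasted_gw = total_gw * EMPTY_FRACTION      # ~9.3 GW powering empty space

SIZEWELL_C_GW = 3.2        # assumed nameplate capacity of Sizewell C
plants = wasted_gw / SIZEWELL_C_GW         # ~3 plants' worth of output

print(f"{total_gw:.1f} GW total, {wasted_gw:.1f} GW wasted, "
      f"~{plants:.1f} Sizewell C plants")
```

The result lands just under three plants, consistent with the "around three" comparison above.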

The modern data center dramatically cuts this energy waste, since high utilization minimizes the number of watts spent supporting empty storage space. And remember, we've been speaking so far only of storage. Storage is far less energy-intensive than processing and computing, which make up the bulk of the energy requirements of PCs and data centers alike. With the cloud, high utilization of both computing and storage means a dramatic reduction in the electricity needed to perform the same work as an array of individual PCs.

Some of the most energy-intensive computing tasks don't run in data centers or the cloud at all. One of the most outsized energy footprints comes from PC gaming, with personal gaming computers responsible for 75 terawatt-hours of electricity use annually. To put that in context, despite gaming computers making up only 2.5% of the global PC installed base, their electricity consumption exceeds that of all of Bangladesh.

Second, there's the e-waste issue. High utilization means that, all else being equal, data centers require fewer processors and hard drives than an equivalent fleet of PCs to produce the same outputs. As a result, when equipment reaches its end-of-life phase, data centers have far less to dispose of. At the same time, the economies of scale implicit in modern data center operations mean they can also handle e-waste far more efficiently, maximizing both recycling and responsible disposal.

Finally, there's the innovation potential to further improve energy performance. Because data centers consolidate and standardize the storage and computing resources powering digital services, innovations that reduce waste and energy consumption can be rolled out far more rapidly. Take liquid cooling, for example: liquid-cooled CPUs can reduce energy consumption by as much as 56%, with data centers providing an ideal standardized environment to deploy the technology at scale.

Data centers: victims of their own success?

If data centers are so effective at reducing costs and waste, doesn't that risk driving up the total amount of energy and resources used, precisely because digital services become so inexpensive?

Because they've proved such an efficient and inexpensive way of delivering services, data centers have indeed enabled massive use cases that are incredibly intensive in storage and computing. Whether it's live document collaboration, tens of thousands of hours of entertainment at the push of a button, or intensive data science workloads, data centers have enabled many use cases that would otherwise have been unthinkable.

Data centers are the best means available to deliver digital services. If we’re concerned about energy consumption, sustainability, or e-waste, they shouldn’t be in the crosshairs.

Instead, we should focus on the applications we want to run in data centers and ask what is a justifiable use of the electricity and resources underlying the storage and processing there. One example is crypto mining, with annual electricity consumption estimated at 127 terawatt-hours: more than all of Norway uses. Data centers may facilitate much of this mining, but that's because they're central pieces of modern infrastructure. We should be collectively asking what is and isn't an appropriate use of that infrastructure, rather than demanding the infrastructure be shut down.

Whether or not these workloads are worth the energy consumption and waste that comes with building more compute and storage capacity is not for the data center industry to answer. But so long as consumers demand these services, data centers are the best possible way to meet that demand - and the most sustainable and economical one.

