Cooling data centres when water is scarce
Innovation is needed to keep data centres cool
More than 2.7 billion people around the globe currently suffer from water scarcity, according to the World Wildlife Fund. Unfortunately, due to changes in consumption and climate, over two-thirds of the world's population may face water shortages by 2025.
Water scarcity is a humanitarian issue, as well as a cause of social and political unrest. Soon, more than 5 billion people won’t reliably have enough water to meet their daily needs, yet the world is also striving to invite these same families into the global digital economy.
This puts the data centre industry on the horns of a dilemma: we are experiencing a rapid expansion in demand alongside a pressing need to curb our water use as we grow.
The water status of the industry
Data centres are notoriously thirsty enterprises. When extreme drought hit California, for example, the industry came under attack for gulping 158,000 Olympic-sized swimming pools’ worth of already limited water supplies every year.
Sadly, the quest for energy efficiency has often tipped the balance toward more water use. Adiabatic cooling, which evaporates water to remove heat, lowers overall power consumption but at a high water cost. Massive cooling towers are particularly prevalent in hot climates and desert regions already experiencing water pressures. This is why, in places like Maharashtra, India, the government can be forced to import drinking water for inhabitants even as large colocation facilities proliferate.
Cloud service providers, colocation vendors, and enterprise data centre managers are bowing to the environmental directive to “reduce, reuse, and recycle.” From redesigning HVAC systems for greater efficiency, to employing municipal wastewater in place of freshwater supplies, the industry is working to shrink the impact of its cooling needs. But we still must do more to stave off an accelerating water crisis. Fortunately, solutions are emerging.
Siting strategies
One of the best ways to avoid excessive cooling requirements, of course, is to avoid being where it’s hot. This means siting decisions play an important role in determining water consumption. Iceland, for one, has grown popular with data centre providers because of its climate, and Canada, Finland, and other chillier locales are also attracting interest.
Because it would be impossible to serve the entire globe's data demands from the Arctic, if for no other reason than latency, the data centre industry is also exploring other options. These include building underground. Facilities located underground or within caverns, such as the Lefdal Mine Data Centre in Norway and the Guian Seven Stars Data Centre in China, benefit from the consistently lower temperatures found a few metres down. Other abandoned industrial sites and geologically interesting spots worldwide are being examined for potential build-outs.
Then there is Microsoft, which made headlines by going deep underwater. The company recently sank a submarine-like pod with 864 servers capable of storing 27.6 petabytes of data off Scotland’s Orkney Islands. As much of the ocean sits at around 0° C, should this “Project Natick” continue to show promise, submerged data centres could offer a means for the industry to alleviate significant cooling-related water-use issues.
And there is another direction: one can go up to secure cooling advantages as well. Temperatures drop approximately 6.4° C for every kilometre of altitude. This is driving growth in the data centre market in places like the Rocky Mountains of North America. The world’s loftiest data centre currently stands above 5 kilometres in the Chilean Andes, although the challenges of building and operating at those heights made it an expensive gambit, albeit a necessary one for its astronomy research purposes.
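For a rough sense of what that lapse rate buys, the back-of-envelope sketch below uses only the figure cited above to estimate how much cooler the ambient air is at typical mountain and high-Andes elevations. It is an illustration of the cited rate, not a site-planning tool.

```python
# A rough, purely illustrative estimate of the altitude cooling advantage,
# using the ~6.4 °C-per-kilometre lapse rate cited above.
LAPSE_RATE_C_PER_KM = 6.4  # approximate drop in ambient temperature per km of altitude


def ambient_drop_c(altitude_km: float) -> float:
    """Approximate ambient temperature reduction relative to sea level."""
    return LAPSE_RATE_C_PER_KM * altitude_km


# Roughly Rocky Mountain elevations vs. the high Chilean Andes site mentioned above
for altitude_km in (2.0, 5.0):
    print(f"{altitude_km:.0f} km: ~{ambient_drop_c(altitude_km):.0f} °C cooler than sea level")
```

At 5 kilometres that works out to roughly 32° C below sea-level ambient, which helps explain both the appeal of the site and the difficulty of operating there.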
And we could go farther up, all the way into space. Progress is being made toward commercial data centres in orbit. Price reductions driven by privatised space flight, technology’s ongoing miniaturisation, and advancements in “lights out” data centres are transforming a sci-fi dream into a real possibility. Several companies are now testing software-based equipment hardening, distributed satellite backup clusters, and other concepts to cost-efficiently protect vulnerable equipment and data from solar flares and radiation – another key step toward full-scale data centres in the final frontier.
Chasing efficiency
Adventurous siting strategies notwithstanding, there will be an ongoing need to locate data centres near user populations in order to deliver expected service levels. Edge computing will demand that high-density micro-facilities be placed essentially everywhere. This means additional water-efficient cooling solutions are urgently needed.
Fortunately, it’s not necessary to bury or sink an entire data centre. Geothermal heat pumps also leverage natural cooling. Closed systems continually recycle their coolant, and open-loop pump-and-injection systems now allow groundwater used for cooling to be returned to the aquifer it came from. This is much better than older “pump and dump” approaches that left valuable freshwater in aboveground ponds to evaporate.
Taking advantage of local water in various ways has become quite prevalent. PlusServer in Germany is using the 12° to 14° C groundwater there for cooling. Google’s data centre in Hamina, Finland, pumps in frigid seawater. And an Aruba data centre near Milan, Italy, which claims to be 100% green, relies in part on river water.
These options won’t fit every location, however, so the industry is also reworking traditional cooling systems. Ironically, liquid cooling, the technology used for mainframes of yore, is generally far easier on water supplies than HVAC systems that blow cold air. Liquids not only offer 50 to 1,000 times the heat-removal capacity of air, but the systems typically use engineered coolants, not water.
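One contributing factor is easy to see with a small, purely illustrative calculation: water can absorb vastly more heat per unit volume than air before warming by a single degree. The sketch below uses textbook densities and specific heats; practical heat-removal multiples also depend on flow rates and heat-transfer coefficients, which is why real-world figures span the wide 50-to-1,000-times range.

```python
# Illustrative comparison of volumetric heat capacity: energy absorbed per
# cubic metre of fluid for each degree of temperature rise. Values are
# textbook approximations at room temperature.
water = {"density_kg_m3": 997.0, "specific_heat_J_per_kgK": 4186.0}
air = {"density_kg_m3": 1.2, "specific_heat_J_per_kgK": 1005.0}


def volumetric_heat_capacity(fluid: dict) -> float:
    """Return heat absorbed per cubic metre per kelvin, in J/(m^3*K)."""
    return fluid["density_kg_m3"] * fluid["specific_heat_J_per_kgK"]


ratio = volumetric_heat_capacity(water) / volumetric_heat_capacity(air)
print(f"Water absorbs roughly {ratio:,.0f}x more heat per unit volume than air")
```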
Facebook published the results it achieved by using outside air to cool water, which in turn provides direct evaporative cooling to its data centre equipment. With such systems in place, the company uses about half the water, on average, of a typical data centre, and up to 90 percent less in cooler climates.
SSDs and sealed storage hardware are good candidates for full immersion cooling, but there are many variations on the liquid theme appropriate to different use cases. Some experts estimate 20 percent of edge data centres could use liquid cooling, partly answering the question of how we will remain water-efficient in the IoT era.
Predictive analytics and another type of offset
A final area of rapid progress is machine learning and predictive analytics. Today, companies such as Ecolab and Romonet are using predictive analytics specifically to enhance transparency of water usage. The granular data improves decision-making and enables automation. In a similar vein, California company Vigilent is building machine learning software to determine relationships between variables like rack temperature, cooler unit settings, and power use, so managers can act on potential savings opportunities with full knowledge of the expected impacts. Other solutions are also taking into account such factors as weather conditions within their predictive models.
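Vigilent’s models are proprietary, so the minimal sketch below only illustrates the general approach: fit a simple model of cooling power draw against rack temperature and cooler setpoint, then use it to estimate the impact of a proposed setting change before making it. All variable names and figures are invented for illustration.

```python
# A minimal sketch of learning the relationship between rack temperature,
# cooler setpoint, and cooling power draw, then estimating the effect of a
# proposed setting change. Telemetry here is synthetic and purely illustrative.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)

# Illustrative telemetry: columns are [rack inlet temp (°C), cooler setpoint (°C)]
X = rng.uniform([18, 16], [30, 24], size=(500, 2))
# Toy ground truth: cooling power rises with rack temp and falls as the setpoint rises
power_kw = 40 + 2.5 * X[:, 0] - 1.8 * X[:, 1] + rng.normal(0, 1.5, 500)

model = LinearRegression().fit(X, power_kw)

# Estimate the saving from raising the cooler setpoint by 2 °C at a 26 °C rack temperature
current, proposed = model.predict([[26, 18], [26, 20]])
print(f"Estimated cooling power saving: {current - proposed:.1f} kW")
```

Production systems work from far richer models and live sensor data, but the workflow of learning the relationships first and estimating impacts before acting is the core idea described above.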
This fast-evolving specialty is headed toward end-to-end cooling systems supercharged with environmental sensors and power meters to optimise performance and sustainability across numerous metrics.
There can be no doubt that more capable DCIM solutions will enable better water management within the data centre, but data centres will also contribute beyond their premises. This, in fact, may result in our greatest impact on water scarcity.
While data centres use 0.7 percent of California’s water, for example, agriculture uses 60.7 percent, and the world average for farming is actually higher at 69 percent. Additionally, every manufactured product uses water, and industrial consumption can comprise 50 percent of water demand in many regions. This is an especially important consideration, as manufacturing is expected to rise 400 percent by 2050.
These pressures are putting the world on a trajectory of extreme water scarcity, but analytics projects hosted in our data centres can highlight where and how water use can be reduced. With AI, agriculture will more precisely apply water to maximise food outputs, and manufacturers will reduce water waste throughout their processes.
It’s here that announcements such as the one from the Ecolab-Microsoft partnership are most encouraging. The partnership employs 24/7 monitoring, machine learning analytics, and integrated engineering support. With these capabilities, the partners collect, analyse, and act on insights from 28 billion data points across 36,000 customer systems per year, saving 582 billion litres of water in 2017 alone. That’s equivalent to the drinking needs of over 530 million people. They’ve set even more ambitious targets for the future.
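The equivalence quoted above checks out with simple arithmetic, assuming a drinking-water need of roughly three litres per person per day; that per-person figure is an assumption of this sketch, not something stated in the announcement.

```python
# Sanity check of the quoted equivalence: 582 billion litres vs. the drinking
# needs of 530 million people. The ~3 litres/person/day need is an assumed figure.
litres_saved = 582e9   # reported water savings in 2017
people = 530e6         # population cited in the announcement
litres_per_person_per_day = litres_saved / people / 365
print(f"~{litres_per_person_per_day:.1f} litres per person per day")  # prints ~3.0
```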
Although such achievements are no free pass for data centre water waste, which we’ll continue to combat, they do demonstrate how the technologies we support will help welcome more of the world into the digital future without overstepping the bounds of Earth’s limited water supplies.
Paul Mercina, Vice President of Innovation at Park Place Technologies
- Interested in cooling? Check out how cold computing is helping to take compute performance to the next level