Developing a nuclear strategy to power data centers
How data centers can run on nuclear power
Microsoft is looking to recruit a “Principal Program Manager Nuclear Technology”, with a view to this person developing a strategy for operating Microsoft’s own cloud and AI data centers with small nuclear reactors. Other public and private operators of large cloud infrastructure face the same question, as data-hungry AIs add to an already exponentially growing data mountain, and with it the need for digital computing power, storage and yet more electricity. What can companies and individuals do to slow down, or even reduce, the appetite for more resources?
Microsoft is the first cloud giant to publicly state that it will pursue a strategy featuring its own nuclear energy, both to become more independent of fossil fuels and to provide the kind of focused power it expects future cloud and AI workloads to require. The job advertisement specifies that the manager should focus on so-called Small Modular Reactors (SMRs) and develop a microreactor energy strategy. SMRs are cheaper, more mobile and less risky than conventional reactors, emit no CO2, and can generate up to 35 MW each. Four such reactors would likely be enough to power a data center.
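To put those numbers in perspective, here is a back-of-the-envelope sizing sketch in Python. The 130 MW facility load is an illustrative assumption for a large cloud/AI campus, not a figure from Microsoft’s advertisement:

```python
# Rough sizing: how many 35 MW SMRs would a data center need?
# FACILITY_LOAD_MW is an assumed figure for illustration only.
import math

SMR_OUTPUT_MW = 35      # upper output of one small modular reactor (see above)
FACILITY_LOAD_MW = 130  # assumed peak load of a large cloud/AI data center

reactors = math.ceil(FACILITY_LOAD_MW / SMR_OUTPUT_MW)
headroom_mw = reactors * SMR_OUTPUT_MW - FACILITY_LOAD_MW
print(f"{reactors} reactors cover {FACILITY_LOAD_MW} MW "
      f"with {headroom_mw} MW of headroom")
# -> 4 reactors cover 130 MW with 10 MW of headroom
```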
This intermediate step seems necessary because two effects reinforce each other: the global energy crisis, triggered by the war in Ukraine, has exposed dependencies and driven energy prices up. At the same time, the rapid success of ChatGPT and other large language model-based AIs has fueled the appetite for data and more computing resources. Nvidia’s sales of hardware specialized for data centers and AI rose to $10.32 billion, up 171 percent on the previous year.
Now, Microsoft is the first major software provider to look for solutions to power its growing infrastructure without compromising its own CO2 targets. Every major provider, whether Apple, Alibaba, AWS, Google or IBM, will have to ask itself the same question. No one will want to miss the AI trend, and all of them have publicly stated sustainability targets, often driven by their own governments, to hit.
But private companies with their own large cloud infrastructure face the same dilemma. Their AI is trained on their own data in order to offer customers intelligent services or software-driven products. Tesla’s autonomous-driving AI may be the best known of these projects. “It’s like ChatGPT, but for cars,” said Dhaval Shroff, a member of the manufacturer’s Autopilot team, describing the approach. For numerous reasons, with the protection of intellectual property being the biggest, these projects run on in-house resources, so that the learning AI, and with it the essence of the company, remains within the company’s own walls.
The exponentially growing energy requirement of this new infrastructure runs counter to the political goals of numerous global and European initiatives such as COP26 and the “European Green Deal” of 2020, which aims to make Europe climate-neutral by 2050. The Green Deal is being driven forward by the “European Digital Strategy”, which looks to ensure that data centers are climate-neutral by 2030. As Margrethe Vestager, Executive Vice President of the European Commission, put it: “We cannot let our electricity consumption run uncontrolled.” The International Energy Agency says that emissions from data centers worldwide need to be at least halved by 2030, and that was before the sudden expansion of AI started to push compute and data volumes. Data center owners need to take both challenges seriously.
Address a cause
The growth of data is picking up speed again, as an AI simply learns faster the more information it can evaluate. Today, data volumes are already growing by an average of 50 percent per year in more than half of all companies. Yet most companies’ IT infrastructure is crammed with data whose content is, on average, 70 percent unknown to them. This unstructured dark data holds everything from cat videos and the menu from the last Christmas party to aged copies of databases and research results, all mixed in with data that must be retained for regulatory and commercial purposes.
This data needs to be cleaned up, and not only to reduce the risk of litigation. Anyone who cleans up and disposes of data waste will be able to feed their AI with high-quality content and free up space for new data. To do this, the data must be indexed and classified according to its content and its value to the company. AI plays a key role here too, classifying content accurately and at pace.
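As a minimal sketch of what such triage could look like before an AI classifier takes over, the Python below walks a directory tree and tags each file by simple rules. The /data root, the extension lists and the three-year staleness threshold are all illustrative assumptions, not a description of any vendor’s product:

```python
# Rule-based first pass at classifying files for cleanup.
# All paths, thresholds, and extension lists here are illustrative assumptions.
import time
from pathlib import Path

REGULATED = {".pdf", ".xml", ".csv"}   # assumed: formats under retention rules
DISPOSABLE = {".mp4", ".tmp", ".bak"}  # assumed: typical dark-data candidates
STALE_AFTER_DAYS = 3 * 365             # assumed staleness threshold

def classify(path: Path) -> str:
    """Tag a file as keep, dispose-candidate, or review."""
    age_days = (time.time() - path.stat().st_mtime) / 86400
    if path.suffix.lower() in REGULATED:
        return "keep"               # retention rules override age
    if path.suffix.lower() in DISPOSABLE and age_days > STALE_AFTER_DAYS:
        return "dispose-candidate"  # old and in a throwaway format
    return "review"                 # unknown value: needs human or AI review

for p in Path("/data").rglob("*"):
    if p.is_file():
        print(classify(p), p)
```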
Companies should consolidate their data on a common platform instead of continuing to operate dozens or even hundreds of separate silos. There, the data can be further reduced using standard techniques such as deduplication and compression; reduction rates of 96 percent are achievable in everyday practice.
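The sketch below illustrates the principle behind those two steps: deduplicate identical chunks, compress what remains, and report the overall reduction rate. The fixed 4 KiB chunking and the synthetic sample data are simplifying assumptions; production platforms typically use variable, content-defined chunking:

```python
# Deduplication followed by compression on highly redundant sample data.
# Fixed-size chunking is a simplification of what real platforms do.
import hashlib
import zlib

def reduced_size(data: bytes, chunk_size: int = 4096) -> int:
    seen = set()
    unique = bytearray()
    for i in range(0, len(data), chunk_size):
        chunk = data[i:i + chunk_size]
        digest = hashlib.sha256(chunk).digest()
        if digest not in seen:  # store each identical chunk only once
            seen.add(digest)
            unique.extend(chunk)
    return len(zlib.compress(bytes(unique), level=9))

data = b"quarterly report, unchanged copy " * 50_000  # redundant sample input
after = reduced_size(data)
print(f"reduction: {100 * (1 - after / len(data)):.1f}%")
```

On input like this, where whole copies repeat, the combined rate lands well above 90 percent; real-world figures depend on how redundant the data actually is.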
What each individual can do
Every user can help reduce overall power consumption and slow down data growth, because everyone can search through their data in the cloud and delete what is useless. This could be umpteen versions of the same photo taken from slightly different angles, or videos you once found funny and haven’t watched since. That cat video, perhaps. Every bit we save by reducing our stored data will reduce energy consumption. So let’s start cleaning up.
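For readers who want to start locally, here is a minimal sketch that groups byte-identical JPEGs in a photo folder by hash, so surplus copies can be reviewed before deletion. The ~/Pictures path is an assumption, and exact hashing will not catch near-duplicates shot from a slightly different angle; those would need perceptual hashing instead:

```python
# Group byte-identical photos so surplus copies can be reviewed and deleted.
import hashlib
from collections import defaultdict
from pathlib import Path

groups: dict[str, list[Path]] = defaultdict(list)
for p in Path("~/Pictures").expanduser().rglob("*.jpg"):
    groups[hashlib.sha256(p.read_bytes()).hexdigest()].append(p)

for digest, paths in groups.items():
    if len(paths) > 1:  # more than one copy of the same bytes
        print(f"{len(paths)} identical copies:", *paths, sep="\n  ")
```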
Mark Molyneux is EMEA CTO at Cohesity.