Developing a nuclear strategy to power data centers


Microsoft is looking to recruit a “Principal Program Manager Nuclear Technology”, with a view to this person developing a strategy for operating Microsoft’s own cloud and AI data centers on small nuclear reactors. Other public and private operators of large cloud infrastructure face the same questions: data-hungry AIs are adding to an already exponentially growing data mountain, and with it the need for more digital computing power, more storage, and yet more electricity. What can companies and individuals do to slow down, or even reduce, this appetite for resources?

Microsoft is the first cloud giant to publicly state that it will pursue a strategy featuring its own nuclear energy, both to make itself less dependent on fossil fuels and to provide the kind of dedicated power it expects future cloud and AI workloads to require. The job advertisement specifies that the manager should focus on so-called Small Modular Reactors (SMRs) and develop a microreactor energy strategy. SMRs are cheaper, more mobile and less risky than conventional reactors, emit no CO2, and can each generate up to 35 MW. Four such reactors would likely be enough to power a data center.

The exponentially growing energy requirement of this new infrastructure runs counter to the goals of numerous global and European initiatives such as COP26 and the “European Green Deal” of 2020. The aim of the Green Deal is to make Europe climate neutral by 2050. This initiative is being driven forward by the “European Digital Strategy”, which looks to ensure that data centers are climate neutral by 2030. The Executive Vice President of the European Commission, Margrethe Vestager, said: “We cannot let our electricity consumption run uncontrolled.” The International Energy Agency says that emissions from data centers worldwide need to be at least halved by 2030. That was before the sudden expansion of AI started to push compute and data volumes upward. Data center owners need to take both challenges seriously.

Mark Molyneux

CTO for EMEA at Cohesity.

Address a cause

The growth of data is picking up speed again, as an AI simply learns faster the more information it can evaluate. Already, data volumes are growing by an average of 50 percent per year in more than half of all companies. Yet most companies have IT infrastructure crammed with data whose content they largely do not know: on average, 70 percent of it is unclassified. In this unstructured dark data, cat videos sit alongside the menu from the last Christmas party, aged copies of databases, and research results, all mixed with data that must be retained for regulatory and commercial purposes.
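To see how quickly 50 percent annual growth compounds, here is a minimal calculation; the growth rate is the figure cited above, while the 100 TB starting volume is purely illustrative:

```python
def projected_volume(start_tb: float, annual_growth: float, years: int) -> float:
    """Return the data volume after compounding annual growth."""
    return start_tb * (1 + annual_growth) ** years

# Hypothetical estate of 100 TB growing at the article's 50 percent per year:
start = 100.0
for year in (1, 3, 5):
    print(f"After {year} year(s): {projected_volume(start, 0.50, year):.0f} TB")
```

At that rate, a data estate grows more than sevenfold in five years, which is why cleaning up cannot wait.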

This data needs to be cleaned up, and not only to reduce the risk of litigation. Anyone who disposes of data waste can feed their AI with high-quality content and free up space for new data. To do this, the data must first be indexed and classified according to its content and its value to the company. AI also plays a key role here, classifying content accurately and at pace.

Companies should consolidate their data on a common platform instead of continuing to operate dozens or even hundreds of separate silos. There, the data can be further reduced using standard techniques such as deduplication and compression; reduction rates of 96 percent are achievable in practice.
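The two standard techniques mentioned above can be sketched together: deduplicate fixed-size chunks by content hash, then compress only the unique chunks. The chunk size and the sample data are illustrative assumptions, not any vendor's actual implementation:

```python
import hashlib
import zlib

CHUNK = 4096  # assumed fixed chunk size

def reduced_size(data: bytes) -> int:
    """Deduplicate fixed-size chunks by hash, then compress the unique ones."""
    unique: dict[str, bytes] = {}
    for i in range(0, len(data), CHUNK):
        chunk = data[i:i + CHUNK]
        unique[hashlib.sha256(chunk).hexdigest()] = chunk
    return sum(len(zlib.compress(c)) for c in unique.values())

def reduction_rate(data: bytes) -> float:
    """Percentage saved relative to the raw size."""
    return 100 * (1 - reduced_size(data) / len(data))

# Highly redundant data (think aged database copies) reduces dramatically:
sample = b"customer-record-0001\n" * 50_000   # ~1 MB of repetitive content
print(f"reduction: {reduction_rate(sample):.0f}%")
```

Redundant data is exactly what dark-data estates are full of, which is why high reduction rates are realistic once silos are consolidated.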

What each individual can do

Every user can help reduce overall power consumption and slow data growth, because everyone can search through their data in the cloud and delete what is useless: the umpteenth version of the same photo from a slightly different angle, or videos you once found funny and haven't watched since. That cat video, perhaps. Every bit we save by reducing our stored data reduces energy consumption. So let's start cleaning up.
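For a personal clean-up, a small script can at least find byte-for-byte duplicate files by content hash. (Near-duplicates, such as the same photo at a slightly different angle, would need perceptual hashing, which is beyond this sketch.)

```python
import hashlib
from pathlib import Path

def find_duplicates(root: str) -> dict[str, list[str]]:
    """Map a content hash to every file path that shares it (duplicate groups only)."""
    seen: dict[str, list[str]] = {}
    for p in Path(root).rglob("*"):
        if p.is_file():
            digest = hashlib.sha256(p.read_bytes()).hexdigest()
            seen.setdefault(digest, []).append(str(p))
    return {h: paths for h, paths in seen.items() if len(paths) > 1}
```

Each group returned contains identical copies, so all but one can be deleted safely.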

