Sustainability is now seen as a strategic business imperative, so much so that 74% of companies consider Environmental, Social and Governance (ESG) factors to be very important to the value of their company. We know that almost three in four organizations have set a net-zero goal with an average target date of 2044, and that 50% of organizations are seeking more energy-efficient products and services.
Stumbling blocks for achieving net-zero
Worldwide, it's estimated that data centers consume about 3% of the global electricity supply and account for about 2% of total greenhouse gas emissions. Yet only 63% of the world's population is online so far, and the global data center construction market is expected to hit $369.6 billion by 2030, with a CAGR of 6.7% from 2022 to 2030. With such growth, the IT industry has a major problem on its hands: its energy consumption is set to increase, not decrease.
Improving sustainability while improving IT efficiency
Data centers already use a huge amount of energy, but as businesses grapple with the transformative potential of AI tools, the extra energy required poses a significant environmental problem. According to TechTarget, the total consumption of one AI model over nine days was no less than 27,648 kilowatt-hours (kWh). This is the same amount of energy that three households use in an entire year, all from a single program in little more than a week.
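As a rough sanity check on those figures, a quick calculation shows what they imply; the per-household consumption derived below is an illustrative implication of the quote, not an official statistic:

```python
# Sanity-check the quoted TechTarget figures: 27,648 kWh over nine days.
total_kwh = 27_648
days = 9

# Average continuous power draw over the training run.
avg_power_kw = total_kwh / (days * 24)

# If that equals three households' annual use, each household uses:
household_kwh_per_year = total_kwh / 3

print(f"Average draw: {avg_power_kw:.0f} kW")                      # 128 kW
print(f"Implied per-household use: {household_kwh_per_year:.0f} kWh/year")  # 9216 kWh
```

An implied figure of roughly 9,200 kWh per household per year is broadly in line with typical household electricity consumption in many developed countries, so the comparison holds up.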
While there have previously been concerns about the energy consumption of new technologies, their computational demands pale in comparison to the resources AI requires. The trouble is that AI is being adopted by many different sectors, each with its own maintenance and IT infrastructure requirements. And AI training, a highly repetitive process that requires substantial processing power, is essential to the use of this technology.
With many businesses striving to optimize their energy consumption to meet sustainability goals, AI represents a challenge. The technology to support it must be adequate to support this elevated level of data processing, yet the associated carbon footprint can’t be an afterthought. With few businesses looking to be left behind in the AI race, how can these extensive energy needs be met in a sustainable way?
Data centers provide the computational power for AI technologies
AI tasks are often described as "transactions" between a memory storage unit and a processor. With data protection concerns still preventing many organizations from entrusting their data to the cloud, data centers are increasingly relied upon as the trusted source. Data centers have become much more energy efficient over the last decade, but the additional processing power demanded by AI will have a significant impact.
As AI capabilities advance into ever more areas of our lives, it's unclear how much energy the data centers supporting them will require, but the number is likely staggering. So how can businesses help?
Energy efficient servers
Getting the right server and storage solution can deliver an energy-efficient, high-performance outcome for most AI tasks. IBM Power Systems servers, which have long been recognized for their reliability and performance, have also emerged as an ideal platform for achieving sustainability goals. Offering the performance to meet today's workloads, they have been designed with energy efficiency in mind. For example, IBM states that the Power E1080 provides 54% more performance and uses 15% less energy at maximum input power than a comparable x86-based server.
By consolidating your network environment onto the latest IBM servers, fewer physical servers are needed to run multiple workloads simultaneously, reducing power and cooling requirements. Not only does this consolidation save space, it also reduces the carbon footprint of the data center. Consolidation also allows businesses to increase uptime and capacity, making way for more bandwidth-hungry workloads.
The ability to match compute power to workload demands is a critical factor in ensuring that your IT estate is environmentally sustainable, delivering energy efficiencies without sacrificing performance during peak periods. Adjusting server performance to workloads minimizes power consumption and means that servers are used more efficiently.
The latest energy monitoring solutions help identify which workloads have lower energy demands during peak times and reallocate resources accordingly. This allows the organization to spot where energy is being wasted and reduce consumption. Critical workloads receive the resources they need, while overprovisioning is reduced elsewhere.
By gaining insight into power consumption patterns, the organization gets a much better grip on future capacity planning with sustainability goals in mind. This consumption data can be used to track seasonal patterns of variation and to estimate growth projections, so an IT estate can be scaled without overbuilding or underestimating.
Responsible disposal and recycling of old technology
Of course, it’s not only energy consumption that should be factored into any technology investment, it also relies on effective end-of-life management. A responsible company will understand the volume and associated impact of sending end-of-life products to landfill, which has its own carbon emission implications. The goal should be to increase the amount that can be resold, reused or recycled and reduce the amount going to landfill or being incinerated. For example, in 2021, IBM only sent 0.3% of more than 18,000 metric tonnes of end-of-life products to landfill. This far exceeded its goal of sending 3% or less.
By switching to servers that optimize energy consumption, you reduce hardware requirements, which not only lowers the total cost of ownership but also reduces the amount going to landfill.
Reducing a business's environmental impact
Many businesses concerned about their environmental impact can follow some practical steps as they advance towards AI capabilities.
1. Think about the use cases of AI and the specific outcome you want them to achieve. Different types of AI will have different energy consumption costs.
2. Don’t get swept up by the hype of advanced deep learning systems that can do it all. These are expensive and data-hungry, meaning they have a high carbon footprint. Instead, take a focused approach with models trained on a much smaller quantity of data.
3. Focus on creating and analyzing high-quality rather than high-quantity data, and delete data that is no longer in use or needed in the future.
Optimizing energy use and consolidating servers and storage infrastructure form a strong basis for shaping a more environmentally friendly and efficient IT estate, meaning it no longer needs to be the Achilles' heel of an ESG policy.
Andy Dunn is CRO at CSI Ltd.