With the Performance Indicator (PI), The Green Grid has extended its PUE metric to include other critical measurements, turning its data center efficiency tool into a business risk tool that relies on simulation to look into the future.
PI has transformed the way people look at how data centers operate, taking a one-dimensional approach to efficiency and making it three-dimensional. It is no longer simply a technical tool, but a critical business tool.
For many years, data centers have demonstrated their efficiency using The Green Grid’s PUE measurement. While PUE is good at highlighting how effectively a data center uses power, focusing on energy efficiency alone may have inadvertently put IT at risk of thermal failure.
For example, raising temperatures in the data center improves PUE but can put critical IT at higher risk of downtime.
The reverse is also true when hot spots arise, limiting the capacity that can be deployed. Traditionally, Facility Managers would reduce temperatures and overcool the space, to the detriment of PUE, to maximize usable capacity.
This balance of energy, capacity and risk has been a constant battle for Facility Managers, and using PUE in isolation can exacerbate the challenge.
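PUE itself is a simple ratio: total facility power divided by the power delivered to IT equipment. A minimal sketch of the calculation (illustrative Python only, not an official Green Grid implementation; the figures are invented):

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power Usage Effectiveness: total facility power over IT power.

    1.0 is the theoretical ideal, where every watt drawn by the
    facility reaches the IT equipment; real facilities are higher
    because cooling, power distribution and lighting add overhead.
    """
    if it_equipment_kw <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kw / it_equipment_kw

# Hypothetical example: a facility drawing 1,500 kW in total while
# delivering 1,000 kW to the IT racks has a PUE of 1.5.
print(pue(1500.0, 1000.0))  # 1.5
```

The trade-off described above follows directly: cutting cooling power shrinks the numerator and improves PUE, while saying nothing about whether the IT equipment is now running too hot.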
Building on PUE’s success in driving down energy use, PI looks at all of these elements together, allowing a business for the first time to align its KPIs with data center performance.
Whether the business requires unparalleled levels of resilience or needs to meet strict green targets, PI allows it to choose how to balance risk against waste according to its operational priorities.
Alongside PUE, the second dimension of PI, IT Thermal Conformance, identifies how much of the data center’s equipment is operating within the recommended air temperatures during normal operation.
IT Thermal Resilience, the third and final dimension of PI, measures the risk of thermal shutdown during failure or planned maintenance of the cooling infrastructure. This allows managers to understand how their facility will respond under the stresses of reduced cooling.
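IT Thermal Conformance can be pictured as a simple proportion: of all the monitored IT inlets, how many are breathing air within the recommended temperature range. A hedged sketch, assuming the widely used ASHRAE recommended envelope of 18–27 °C and invented sensor readings (neither the band choice nor the data comes from The Green Grid’s specification):

```python
# Assumed recommended inlet-air envelope (ASHRAE-style, in Celsius).
RECOMMENDED_C = (18.0, 27.0)

def thermal_conformance(inlet_temps_c: list[float],
                        band: tuple[float, float] = RECOMMENDED_C) -> float:
    """Fraction of monitored IT inlets whose temperature lies within `band`.

    A value of 1.0 means every sensor reads inside the recommended
    range during normal operation; lower values flag hot (or cold)
    spots that PUE alone would never reveal.
    """
    low, high = band
    in_band = sum(1 for t in inlet_temps_c if low <= t <= high)
    return in_band / len(inlet_temps_c)

# Hypothetical rack-inlet sensor readings: one of the five is above 27 C.
readings = [21.5, 24.0, 26.8, 28.3, 22.1]
print(f"{thermal_conformance(readings):.0%} conformance")  # 80% conformance
```

IT Thermal Resilience asks the same conformance question, but against temperatures predicted for a failure or maintenance scenario rather than measured in normal operation, which is where simulation becomes essential.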
The use of simulation
As well as utilizing live performance data for current conditions, one important additional element makes PI potentially four dimensional – the use of simulation.
By introducing simulation, data center managers can not only understand their current performance but also predict how the data center will respond to changes in the future.
PI delivers insight at four levels. The most basic requires only real-time monitoring of the current state of the data center; at its most advanced, PI evaluates events that cannot be tested without introducing unacceptable risk, and predicting the consequences of such events requires reliable engineering simulation.
How will the reliability of a data center be impacted by increased demand? By the introduction of new hardware technology? By equipment failure beyond the parameters most managers are willing to test live? These are challenges that data centers face every day, and challenges that PI from The Green Grid now addresses.
The wider value of PI as a critical business tool stems from its versatility: it fits any data center application and helps determine the parameters within which a facility should operate. Every data center faces different challenges depending on its role, environment and age.
For example, large investment banks and high-frequency traders need their data centers operating reliably so they can execute financial transactions milliseconds ahead of the competition and gain a business advantage.
Their focus will be on conformance and resilience, with less emphasis on energy efficiency: the economics of these businesses mean the value of a trade far outweighs energy costs.
Hyperscale data centers serving web pages, however, will have different motivations. There is no real financial consequence of a webpage loading a few seconds slower, but reliability and efficiency will be more important to reduce costs and establish consistency. The application is different, so the way PI is used is adapted accordingly.
With the recent implementation of the new Investigatory Powers Act, which demands that data centers keep far more customer data for longer than was previously required, new pressures on data centers continue to emerge.
Take this alongside the increasing use of IoT, and the certainty that the smart cities and vehicles of the future will rely on local data center infrastructure, and a further question of capacity appears on the horizon.
Can we build capacity fast enough? Or can we be more efficient in our use of existing space and computing power?
Whatever the pressures, challenges or environment, PI and engineering simulation will play a critical role. The new performance indicator is no longer simply a badge declaring that your data center operates at an award-winning standard.
It is a tool that lets you decide where your priorities lie and, with engineering simulation, reliably predict how your facility will cope with the new challenges that arise in the future.
- Mark Seymour is the CTO at Future Facilities