Deciding if Flash Storage is Right for your Data Center

(Image: servers; credit: https://pixabay.com/en/sever-digitization-3100049/)

Flash storage is one of the most promising new technologies to affect data centers in decades. Like virtualization before it, flash storage will be deployed in nearly every data center over the next decade: its performance, footprint, power and reliability benefits are simply too compelling.

But every data center must be architected to meet its own specific application, user-access and response-time requirements, and no single storage vendor can produce a product that is best for every application workload.

Although storage systems that incorporate flash promise to relieve all storage performance problems, determining which applications actually justify flash, and exactly how much of it to deploy, remain fundamental questions.

If flash is not provisioned correctly and tested against the actual applications that run on your infrastructure, you can end up paying 3x to 10x the per-GB price of traditional spinning media (HDDs) for little benefit.

Before deploying any flash storage system, IT architects need a way to proactively identify when performance ceilings will be breached, and to evaluate which technology options best meet the application workload requirements of their own networked storage.

Workload analytics

Workload analytics is a process whereby intelligence is gathered about the unique characteristics of application workloads in a given environment. By capturing all of the attributes of real-time production workloads, highly accurate workload models can be generated, enabling application and storage infrastructure managers to stress test storage product offerings using their own specific workloads.

The concept is to extract statistics on production workloads from the storage environment to establish an I/O baseline and simulate I/O growth trends.
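As a rough illustration of what that baseline-and-projection step looks like, the Python sketch below computes summary statistics from a handful of captured I/O samples and compounds an assumed growth rate forward. The field names, sample values and 10%-per-quarter growth rate are all assumptions for illustration, not the output of any particular tool.

```python
# Illustrative sketch: derive an I/O baseline from a captured trace and
# project demand growth. Sample values and the growth rate are assumptions.
import statistics

# Each sample: (timestamp_s, iops, read_fraction, avg_block_size_kb)
trace = [
    (0, 12_000, 0.70, 8),
    (60, 18_500, 0.65, 8),
    (120, 31_000, 0.60, 4),   # burst: smaller blocks, more writes
    (180, 15_200, 0.72, 8),
]

iops_samples = sorted(s[1] for s in trace)
baseline = {
    "mean_iops": statistics.mean(iops_samples),
    # Crude p95 by index; a real tool would use far more samples.
    "p95_iops": iops_samples[int(0.95 * (len(iops_samples) - 1))],
    "read_fraction": statistics.mean(s[2] for s in trace),
}

# Simple compound-growth projection of peak demand (assumed 10%/quarter).
growth_per_quarter = 0.10
for quarter in range(1, 9):
    projected = baseline["p95_iops"] * (1 + growth_per_quarter) ** quarter
    print(f"Q{quarter}: projected p95 demand ~ {projected:,.0f} IOPS")
```

Even this toy version makes the point: the baseline you test against should come from your own captured peaks, not from a vendor's synthetic averages.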

Truly understanding the performance characteristics of flash storage is very different from understanding traditional hard-drive-based storage. Flash vendors claim their arrays are extremely fast, with some claiming over a million IOPS, but the assumptions and configurations behind such results vary greatly and can be very misleading.
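A back-of-the-envelope model shows why those headline IOPS numbers are so configuration-dependent. By Little's Law, sustained IOPS is roughly the number of outstanding I/Os divided by average latency, so a vendor can reach "a million IOPS" simply by stacking up concurrency with small reads. The latencies and queue depths below are illustrative assumptions, not measurements of any product.

```python
# Little's Law: achievable IOPS ~= outstanding I/Os / average latency.
# The same array "does" wildly different IOPS depending on benchmark
# configuration; all numbers here are illustrative assumptions.

def iops(outstanding_ios: int, avg_latency_ms: float) -> float:
    """Steady-state IOPS for a given concurrency and per-I/O latency."""
    return outstanding_ios / (avg_latency_ms / 1000.0)

configs = [
    ("4K reads, QD=1 (latency test)",   1,    0.20),
    ("4K reads, QD=32 x 8 workers",     256,  0.25),
    ("4K reads, QD=64 x 16 workers",    1024, 0.90),  # queuing inflates latency
]

for name, outstanding, lat_ms in configs:
    print(f"{name}: ~{iops(outstanding, lat_ms):,.0f} IOPS")
```

The middle configuration already exceeds a million IOPS on paper, while the single-threaded case manages a few thousand; neither tells you what your application, with its own concurrency and block sizes, will actually see.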

To make flash storage affordable, inline deduplication and compression (i.e. don’t store zeroes or repeated patterns) are essential. Unfortunately, enabling such features can have a dramatic impact on performance. This makes workload modeling of such traffic even more complicated. 
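To see why data reduction matters to the economics, here is some illustrative arithmetic; the prices and ratios below are assumptions, not quotes. Even a steep raw price premium for flash shrinks quickly once deduplication and compression are applied, but the achievable ratio depends entirely on your data.

```python
# Illustrative arithmetic only: prices and reduction ratios are assumptions.
# Data reduction shrinks the effective $/GB gap between flash and HDDs,
# which is why inline dedupe/compression are essential to flash economics.

hdd_price_per_gb = 0.05           # assumed $/GB, raw
flash_price_per_gb = 0.05 * 6     # assumed 6x raw premium (within 3x-10x)

for reduction_ratio in (1.0, 2.0, 4.0, 6.0):  # 4.0 means 4:1 dedupe+compression
    effective = flash_price_per_gb / reduction_ratio
    print(f"{reduction_ratio:>3.0f}:1 reduction -> effective flash "
          f"${effective:.3f}/GB vs HDD ${hdd_price_per_gb:.3f}/GB "
          f"({effective / hdd_price_per_gb:.1f}x premium)")
```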

Workload models must accurately capture these attributes and incorporate the ability to model data compression and inline deduplication. Accurate workload modeling for flash needs to emulate your workload, control the duplicability and compressibility of the data content, and generate millions of IOPS. A minimal sketch of the content-control idea follows.
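The sketch below, assuming 4 KiB blocks, generates a stream whose duplicate fraction and per-block compressibility are tunable parameters, then verifies the resulting ratios with hashing and zlib. Real load generators do this at line rate in optimized code; the point here is only to make the knobs concrete.

```python
# Minimal sketch: synthetic blocks with tunable dedupability and
# compressibility, verified after the fact. All parameters are illustrative.
import hashlib
import os
import random
import zlib

BLOCK = 4096

def make_block(compress_fraction: float) -> bytes:
    """A block that is `compress_fraction` zeroes and the rest random bytes."""
    zeros = int(BLOCK * compress_fraction)
    return b"\x00" * zeros + os.urandom(BLOCK - zeros)

def make_stream(n_blocks: int, dupe_fraction: float, compress_fraction: float):
    """Yield blocks where ~dupe_fraction of them repeat one template block."""
    template = make_block(compress_fraction)
    for _ in range(n_blocks):
        if random.random() < dupe_fraction:
            yield template                      # duplicate: dedupes away
        else:
            yield make_block(compress_fraction)  # unique content

blocks = list(make_stream(1000, dupe_fraction=0.5, compress_fraction=0.5))

unique = len({hashlib.sha256(b).digest() for b in blocks})
compressed = sum(len(zlib.compress(b)) for b in blocks)
print(f"dedupe: {len(blocks)} blocks -> {unique} unique "
      f"({len(blocks) / unique:.1f}:1)")
print(f"compression: {len(blocks) * BLOCK} -> {compressed} bytes "
      f"({len(blocks) * BLOCK / compressed:.1f}:1)")
```

Turning those two knobs lets a test reproduce both the friendly case (highly reducible data) and the hostile one (unique, incompressible data) against the same array.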

Deciding if and when flash or hybrid storage systems are right for your data center is a complex task. Vendor-provided benchmarks are usually irrelevant, because they can't tell you how flash storage will benefit your specific applications.

Workload modeling, combined with load generating appliances, is the most cost-effective way to make intelligent flash storage decisions and to align deployment decisions with your specific performance requirements.

There is a new breed of storage performance validation tools available on the market today. Tools such as Load DynamiX allow you to create realistic workload profiles of production application environments and generate workload analytics that give insight into how workloads interact with the infrastructure. 

To help you determine the right mix of SSD and HDD for your environment, these innovative new storage validation tools can help you produce configuration and investment scenarios, both now and into the future.

  • Gavin Tweedie is the Operations Director for the EMEA region at Load DynamiX. He has over 20 years' experience growing the EMEA operations of successful enterprise software and cloud-enabled startups, and has held sales and systems engineering management positions with IBM, HP, and Sun Microsystems.