Software-defined everything, or SDx, is currently one of the big trends taking hold of the enterprise technology space, and any prediction list for 2015 worth its salt includes it as something to watch out for.
The idea is that focusing on the software running the data centre instead of the hardware is a far more efficient way to run things, and that savings will start to come from a range of different areas.
The fact that it is a buzzword means "software-defined" is being applied almost willy-nilly by companies trying to piggyback on a trend that could play a big part in the future of almost every organisation the world over. Here are five things you should already know about the technology industry's next booming sector.
What is it?
Before getting down to the nitty-gritty of what it can do for your organisation, it's worth working out what software-defined actually means. The term has been around for some time and describes a vision where all resources are pooled together and managed by complex pieces of software that have automation and virtualisation at their heart.
Even though the hardware will still be there, the idea of SDx is that it becomes "invisible" to the IT department thanks to the virtualisation applied to everything inside the data centre and anything connected to it. All parts of the system are treated like utilities, and it's predicted companies will be able to make significant savings by using only exactly what they need.
Data centres are the part of the organisation that will be most affected by the shift to SDx, and it's worth looking at why this is the case.
SDx or software-defined everything covers a wide range of different software-defined elements and to get a better understanding of the whole concept it's worth defining exactly what the major parts actually do.
At the heart of the entire conversation is the idea of a software-defined data centre (SDDC), which ties together various related systems such as networking, servers, storage and compute power.
SDDC was a term originally coined by Steve Herrod, former CTO of VMware, to describe a future in which the traditional data centre has become "a history museum". In an SDDC the entire operation of the data centre is handed over to software, which manages infrastructure consumption and processes down to the component level without having to involve a human being.
The second major example of SDx in action is software-defined networking and servers. HP unveiled its Moonshot system back in April 2013, which was dubbed one of the first software-defined servers on the market. The high-density server uses far less energy and space, and is less complex, yet can handle huge data workloads. In practice this means a system supporting two network switches, 45 servers and the other components fits into a 4.3U chassis. The result is a very small amount of hardware, with software running the show in its place.
The last major component to all this is software-defined storage (SDS), which once again separates the hardware and software layers, allowing enterprises to provision storage that more closely matches their requirements. Gone are the days of inflexible pieces of hardware that are often under-utilised by a firm; in their place, other companies handle the hardware side of things.
Marrying all of these together inside the data centre means far less hardware, and explains what software-defined has come to mean.
Look beyond 2015
Everyone's getting very excited about software-defined everything right now, but 2015 might not even be the year it starts to pick up real pace. IDC has predicted that the software-defined networking sector alone will grow from under $1 billion (around £675 million, AU$1.3 billion) in 2014 to $3.7 billion (around £2.5 billion, AU$4.85 billion) in 2016, before shooting on to $8 billion (around £5.4 billion, AU$10.5 billion) by 2017.
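As a quick sanity check on those IDC numbers, the implied compound annual growth rate can be worked out directly. The dollar figures below come from the forecast above; the helper function itself is purely illustrative:

```python
def cagr(start, end, years):
    """Compound annual growth rate between two values."""
    return (end / start) ** (1 / years) - 1

# IDC forecast for SDN: ~$1bn in 2014 growing to $8bn in 2017
growth = cagr(1.0, 8.0, 2017 - 2014)
print(f"Implied annual growth: {growth:.0%}")  # prints "Implied annual growth: 100%"
```

In other words, the forecast has the SDN market roughly doubling every year for three years running.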
Gartner, meanwhile, thinks that by 2017 over half of all enterprises will have adopted a software-defined architectural approach to their data centre and networking. This isn't a view shared by all analysts.
"Our research indicates that actual deployments are still very low – especially at mission-critical enterprise level – and are likely to pick up only gradually during 2015. Nonetheless, the number intending to deploy software-defined architectures is rising rapidly despite there also being a substantial group which, for now, has no strategic plans in that direction," Andy Lawrence, VP of research for data centre technology at 451 Research, told Computer Weekly.
There's no doubt that SDx is here and will grow at a very fast rate. Just possibly not in 2015.
What are the savings?
Research conducted by Computer Economics and reported by Deloitte shows that data centre operations and infrastructure account for 18% of IT spending on average, and reducing this figure is one of the biggest advantages of the software-defined data centre (SDDC).
A separate Deloitte analysis of normalised data from SDDC business cases among Fortune 50 companies showed that this spending can be reduced by 20%, and that's using technology currently available. Once new technology emerges, those savings will only increase.
Using the SDDC as an example, the savings will ultimately come from decommissioning old equipment such as servers, racks, disk and tape, routers and switches, and from the smaller data centre footprint, including lower power use. This will in turn bring down maintenance costs and make system operation an altogether slicker proposition.
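Putting the two figures above together gives a rough sense of scale: if data centre operations account for around 18% of IT spend and an SDDC trims that portion by about 20%, the saving on the overall budget works out as below. This is a back-of-the-envelope sketch using the percentages quoted above; the £10m budget is a made-up figure for illustration:

```python
DC_SHARE = 0.18        # data centre ops as a share of IT spend (Computer Economics figure)
SDDC_REDUCTION = 0.20  # reduction in data centre spend from an SDDC (Deloitte figure)

def sddc_saving(it_budget):
    """Estimated saving on the total IT budget from moving to an SDDC."""
    return it_budget * DC_SHARE * SDDC_REDUCTION

# Hypothetical organisation with a £10m annual IT budget
print(f"Estimated saving: £{sddc_saving(10_000_000):,.0f}")  # roughly £360,000 a year
```

That equates to about 3.6% of the total IT budget, which is modest as a percentage but substantial in absolute terms for a large enterprise.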
Flexibility is essential
Moving over to a completely software-defined system is not as simple as you may think, and if your organisation is overly reliant on existing legacy infrastructure it likely won't deliver the kind of savings and efficiencies you're looking for. When implementing the plan, it's crucial that firms pay meticulous attention to their legacy hardware and plan a move away from it.
Dave Husak, founder of software-defined networking firm Plexxi, has been working on virtualising computing infrastructure, big data and the Internet of Things since the early days, and explained that lumping SDN solutions on top of legacy hardware fails to address the real problems.
What needs to happen is for SDN software to be combined with new hardware that is streamlined and ready to handle huge data sets. Running it on older hardware simply isn't an option, as the older kit can't deliver the savings; companies shouldn't worry, though, as the long-run savings make replacing old hardware with newer solutions a no-brainer.