With rare exceptions, there's little question about whether you will adopt cloud computing in some form. You will. Rather, the questions you are probably asking revolve around how best to get started or, perhaps, how to start managing the cloud use that is already happening within your organisation.
For many, one good answer is to start with a cloud for application development and test (dev/test) because it:
- Can deliver benefits even at relatively modest scale points, offering the opportunity to start small
- Provides an internal alternative to current public cloud use and thus can streamline and standardise development workflows
- Can be independent of production systems and processes – although the two can be integrated over time, whether by adopting DevOps methodologies or through other approaches
Perhaps most importantly though, a dev/test cloud directly addresses one of the most pressing needs at many organisations: improving developer productivity and accelerating application development. In a world rapidly going digital, delivering new applications faster and more efficiently isn't just a nice-to-have; it's a key component of your business agility and, ultimately, your ability to win against the competition.
Building sandcastles in the sky
When an organisation decides to implement a cloud, its ultimate ambitions probably go beyond application development and test. But dev/test is often the starting point – and for good reason.
Dev/test tends to be a dynamic activity within an organisation. New environments are stood up and torn down all the time – certainly much more frequently than is normally the case with production applications. Therefore, the agility and flexibility that a cloud can provide are especially valuable here.
At the same time, dev/test also tends to be, within some constraints, something of a sandbox for new technologies and approaches – the subject of ongoing experimentation. What better place to try out new infrastructure approaches before putting them into production?
Of course, we've seen this story play out before. When virtualisation went mainstream in the late 1990s, it was pitched as a tool to let you develop and test on a variety of operating system versions and types without needing a physical system under your desk for each unique instance. Furthermore, if your buggy code corrupted an environment, you could just wipe the slate clean by firing up a new image rather than rebuilding an entire system.
Over time, virtualisation has gone on to be widely used in production. Cloud environments go beyond virtualisation by being more dynamic. By being more scalable. By being hybrid. Most fundamentally, clouds introduce concepts such as catalogues of standardised services – defined as part of a software-defined infrastructure – and offer them to consumers, such as developers, through a low-touch, self-service interface.
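The catalogue-plus-self-service idea can be sketched in a few lines of Python. This is a hypothetical illustration only – the `Catalogue` class, the "small-dev" service name and the "linux-base" image label are invented for the example and don't correspond to any particular cloud product's API:

```python
# Minimal sketch of a cloud service catalogue: operators publish
# standardised environment definitions once, and developers then
# provision them on demand through a self-service call, with no
# ticket or manual hand-off in between.
import itertools


class Catalogue:
    def __init__(self):
        self._templates = {}            # service name -> standard definition
        self._ids = itertools.count(1)  # simple unique environment IDs

    def publish(self, name, cpus, memory_gb, image):
        """Operator side: add a standardised service to the catalogue."""
        self._templates[name] = {"cpus": cpus, "memory_gb": memory_gb,
                                 "image": image}

    def provision(self, name):
        """Developer side: self-service instantiation of a catalogue entry."""
        template = self._templates[name]  # only published services can be used
        return {"id": next(self._ids), **template}


catalogue = Catalogue()
catalogue.publish("small-dev", cpus=2, memory_gb=4, image="linux-base")
env = catalogue.provision("small-dev")
print(env)  # {'id': 1, 'cpus': 2, 'memory_gb': 4, 'image': 'linux-base'}
```

The point of the standardised template is that every environment provisioned from "small-dev" is identical, which is exactly what makes tearing one down and standing another up cheap and repeatable.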
Cloud adoption has parallels to the introduction of virtualisation. Frequently, it first takes place within organisations where delivering new IT services rapidly is both a strategic concern and directly connected to revenue. And it will first be adopted within those organisations at the point where those new IT services are created – in the application development groups.
Why dev/test matters today
There have long been companies whose business depends on custom software. For such companies, software development has necessarily been something of a core competency. What's different today is that differentiating based on applications and other aspects of IT isn't an outlier; it's increasingly the norm.
The fact that routine functions – think customer relationship management, for example – are now often outsourced in the form of Software-as-a-Service (SaaS) has only intensified the trend. Companies can now focus their resources on delivering services that can make a difference rather than on the mundane tasks that nonetheless somehow have to get done.
This reflects how we are seeing a great reimagining of business processes, manufacturing, the use of data, and the connections between businesses and consumers, consumers and businesses, and among consumers themselves.
Whether it's the widespread automation of factories, 3D printing, mobile applications, predictive analytics, or even the latest development, the Internet of Things, the most successful businesses are doing innovative things with IT. This means the application is ever more central to the business. And in a world where the pace of change is seemingly on an ever-upward trajectory, the difference between success and failure may come down to how many applications can be put into service and how quickly.