How to derive value from a cloud investment faster

Cloud computing
Image credit: Pixabay

I watched my son play football at the weekend and, yet again, noticed the behaviour of several parents on the sidelines. Not content with simple encouragement, they yell advice to their child to do things that, frankly, the world’s greatest players might struggle with. Their expectations are wildly misaligned with where their child’s skills actually lie.

I see something similar when I talk to organisations about their move to the cloud. So enamoured are they with the potential the cloud offers that they conveniently ignore the reality of where their current infrastructure really is. Their desire to get to the cloud immediately is understandable, but in most cases -- like the parent who wants their child to play like Ronaldo -- it's totally unrealistic.

Moving to the cloud overnight simply isn’t going to happen. A combination of available skills, investment and ingrained processes means that, in some cases, a complete migration to the cloud is not realistic, practical or even desired. A more practical solution may be to work in a hybrid environment for the short, medium or even long term in order to better achieve the ultimate goal of a fully cloud-based environment. For me, success is determined by one key measure of how quickly the business benefits: time to value, or TTV.

How can you improve the all-important TTV as you start your transition to the cloud? The answer lies in retooling your processes to take advantage of new technologies while also adopting the agile data warehousing best practices that your existing environment and historical approach have prevented you from implementing. That’s the theory, but how do you turn it into practice? Let me take a step back and explain.

Image credit: Pixabay

Benefits of cloud computing

By paying only for what you need, cloud-based infrastructure enables you to do a special project for a few months or maybe a proof-of-concept trial and simply fire up the necessary capacity for the duration of the project. Once it is complete, you can just shut down the machine(s). Not having to invest in new hardware that is only going to be used for a short period of time for a particular project means it is far easier to make a compelling business case.
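The business case described above comes down to simple arithmetic. The sketch below uses entirely hypothetical figures (real prices vary by provider, region and instance type) to show how renting capacity only for the life of a project compares with an up-front hardware purchase:

```python
# Hypothetical figures for illustration only -- not real vendor pricing.
HW_PURCHASE_COST = 24_000      # assumed up-front cost of equivalent on-prem servers
CLOUD_RATE_PER_HOUR = 4.00     # assumed hourly rate for the rented capacity

def cloud_cost(hours_per_day: float, days: int,
               rate: float = CLOUD_RATE_PER_HOUR) -> float:
    """Cost of capacity that runs only while the project needs it."""
    return hours_per_day * days * rate

# A three-month proof of concept, firing up machines ten hours a day:
poc = cloud_cost(hours_per_day=10, days=90)
print(f"Cloud PoC: ${poc:,.0f} vs ${HW_PURCHASE_COST:,} hardware purchase")
```

Because the machines are shut down when the project completes, the spend stops with them -- there is no half-used server depreciating in a rack.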

With increased security and trust, we are shifting more critical workloads onto the cloud, but we still often spend more than we need to. While storage is cheap, compute power is expensive, and virtual machines often sit unused, draining money that could instead be spent growing the business elsewhere. People often talk of the cloud expanding to grow with a company's needs, but it's the ability to contract -- to spend only what we need to spend -- that is equally, if not more, important.

As you look at the advantages the cloud offers, elastic compute is a key area to focus on in particular: the ability to scale up and down as workload demands. No longer do you need to purchase hardware based on peak workload requirements (e.g., end-of-month processing, nightly batch ETL jobs); rather, you can increase and decrease the compute as needed and for the time it is needed. Overnight batch loads and month-end processing can be accommodated by adding more compute capacity to perform the processing, and then, once completed, you shut down the unneeded capacity.
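The gap between peak provisioning and elastic scaling is easy to quantify. This sketch assumes a made-up daily profile -- two hours of heavy batch processing against a quiet baseline -- and counts the compute unit-hours each approach consumes:

```python
# Illustrative only: a daily demand profile in arbitrary "compute units".
# 22 quiet hours need 2 units each; 2 batch hours need 16 units each.
hourly_demand = [2] * 22 + [16] * 2

# Peak provisioning: buy enough for the busiest hour and run it 24/7.
peak_provisioned = max(hourly_demand) * len(hourly_demand)

# Elastic scaling: match capacity to demand hour by hour.
elastic = sum(hourly_demand)

print(f"peak: {peak_provisioned} unit-hours/day, elastic: {elastic} unit-hours/day")
```

Under these assumed numbers the always-on peak configuration burns roughly five times the compute of the elastic one, which is exactly the spend the shut-it-down-when-done model recovers.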

Image credit: Shutterstock

Dealing with data warehouses

Despite the long list of advantages of cloud platforms from a sysadmin standpoint, however, you still have to build and manage that data warehouse. Cloud platforms are not going to help you there. Without automating the development and DevOps aspects of your data warehouse, you are only solving half of the TTV problem. 

Data warehouse automation software allows you to maximise your development resources by reducing the time it takes to design, develop, deploy and operate the data warehouse while also reducing the cost and risk associated with that development. Your resources can then spend their valuable time working with the business users to deliver new business value in a much shorter timeframe while letting the automation software do the tedious and time-consuming work of generating the code, using built-in best practices and industry standards to ensure that the code is optimised for the platform in use.
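To make the idea of "letting the software generate the code" concrete, here is a deliberately toy sketch of metadata-driven DDL generation. Real automation products (WhereScape's included) are far more sophisticated; the table name, column names and conventions below are invented for illustration:

```python
# Toy illustration of metadata-driven code generation: the developer supplies
# metadata, and boilerplate conventions (surrogate key, audit column) are
# applied automatically rather than hand-written each time.
def generate_dimension_ddl(table: str, columns: dict[str, str]) -> str:
    """Emit CREATE TABLE DDL for a dimension table from column metadata."""
    cols = [f"  {table}_key BIGINT IDENTITY(1,1)"]  # generated surrogate key
    cols += [f"  {name} {dtype}" for name, dtype in columns.items()]
    cols += ["  load_date TIMESTAMP DEFAULT CURRENT_TIMESTAMP"]  # audit column
    body = ",\n".join(cols)
    return f"CREATE TABLE dim_{table} (\n{body}\n);"

print(generate_dimension_ddl("customer", {
    "customer_code": "VARCHAR(20)",
    "customer_name": "VARCHAR(100)",
}))
```

The point is not the template itself but the division of labour: developers describe *what* the warehouse should contain, and the tooling emits consistent, platform-optimised code for *how* it is built.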

Even once you have found data warehouse automation software that works with cloud-native platforms such as Amazon Redshift, Azure SQL Data Warehouse or Snowflake, many companies will find that a hybrid environment is going to be the norm for a period of time and, in some cases, forever. In this scenario, it is critical to have software that can manage these hybrid environments seamlessly and treat them as a single logical data warehouse. This allows you to manage your data warehouse ecosystem more effectively, manage the transition over time, control workloads and provide a consistent view of the system regardless of where the data resides. For hybrid environments, this will be a significant factor in improving time to value.

Neil Barton, Chief Technology Officer at WhereScape

Neil Barton
Neil Barton is the Chief Technology Officer of WhereScape, a provider of data infrastructure automation software, where he leads the long-term architecture and technology vision for the company's software products.