IT leaders troubled by security vulnerability lag in cloud

Cloud adoption has delivered a raft of benefits for organizations, not least during the pandemic, when companies needed the flexibility and agility of public, private, and hybrid clouds to support their operations. Cloud services enabled businesses to maintain remote operations during Covid lockdowns, and are now driving the continuing shift to hybrid working.

About the author

Ian Wood is the Senior Director and Head of Technology for UK&I at Veritas.

However, the demands of the past two years have resulted in a hyper-speed rollout of these cloud systems, often far beyond any plans that businesses might have had. Implementations that were expected to be part of five-year transformation programs were sometimes undertaken in mere weeks. Circumstances necessitated rapid mass adoption of new applications, systems, and workloads, many of which operate in increasingly complex multi-cloud environments.

In fact, a recently released research report, surveying a global panel of senior IT decision-makers, found that 80% had implemented new cloud capabilities or expanded elements of their cloud infrastructure beyond their original plans as a result of the pandemic. 

Almost half (47%) of those surveyed for Veritas' Vulnerability Lag Report said that dealing with the knock-on effects of cloud adoption is one of their top three priorities. A fifth cite the fallout from increased use of cloud services as the single most important challenge facing their organization.

Vulnerability lag leaving gates wide open

One key challenge of this expanded and accelerated implementation is that many organizations now suffer from a 'vulnerability lag'. As described in Veritas' report, cybersecurity measures have failed to keep pace with the rapid rate of transformation during the pandemic - particularly within cloud infrastructures. 

There is a gap between the additional cloud technologies implemented during the pandemic and the surrounding security policies, practices, and procedures required to protect them. Cloud technology (56%) and security (51%) are the two most commonly reported gaps in respondents' IT strategies, leaving their organizations open to attack.

Of course, it is not only organizations that have leveled up the sophistication of their systems. Cybercriminals have been innovating too. As Canalys summarized, there has been a 'surge' in ransomware attacks, which will only grow: 'cloud infrastructure and software services will be targeted as more data is stored across multiple cloud services, beyond the traditional IT perimeter.'

This increased focus on cloud targets puts more pressure on organizations with growing multi-cloud operations to implement robust data-protection strategies designed for such sophisticated environments.

Lack of insight

Veritas' 2020 Ransomware Resiliency Report also found that the more cloud services enterprises had in their infrastructure, the less capable they were of restoring mission-critical data in the event of a ransomware attack. It seems that the more complex the makeup of a business's cloud infrastructure becomes, the harder it is to consistently manage data protection across it.

The issue is compounded by a significant lack of clarity around what technology has been introduced into organizations. In this latest research, just 58% of IT leaders said that they could confidently and accurately state the exact number of cloud services currently in use in their organization. How can businesses be confident that everything is secure, when they don't even know how many environments they need to protect?
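
To illustrate how teams often begin rebuilding that visibility, here is a minimal sketch in Python that mines egress proxy logs for known SaaS domains to estimate which cloud services are actually in use. The log format and the domain-to-service mapping are hypothetical placeholders, not drawn from Veritas' research or any specific product; real inventories typically come from CASB or firewall exports.

```python
# Minimal sketch: estimate which cloud services are in use by mining
# egress proxy logs for known SaaS domains. The log format and the
# domain-to-service mapping below are hypothetical placeholders.
from urllib.parse import urlparse

KNOWN_CLOUD_DOMAINS = {
    "sharepoint.com": "Microsoft 365",
    "salesforce.com": "Salesforce",
    "amazonaws.com": "AWS",
    "dropbox.com": "Dropbox",
}

def services_in_use(log_lines):
    """Return the set of recognized cloud services seen in the logs."""
    seen = set()
    for line in log_lines:
        host = urlparse(line.split()[-1]).hostname or ""
        for domain, service in KNOWN_CLOUD_DOMAINS.items():
            if host == domain or host.endswith("." + domain):
                seen.add(service)
    return seen

sample = [
    "2022-01-10T09:12:03 alice https://acme.sharepoint.com/sites/hr",
    "2022-01-10T09:13:44 bob https://na1.salesforce.com/home",
    "2022-01-10T09:15:21 carol https://files.dropbox.com/s/abc123",
]
print(sorted(services_in_use(sample)))  # ['Dropbox', 'Microsoft 365', 'Salesforce']
```

Even this crude count tends to surprise IT leaders, which is exactly the gap the research points to.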

Not only do IT departments lack the information they need about which locations to protect, they are also blind to which data needs protecting. On average, respondents believe that only 65% of their organization's stored data is classified or tagged, leaving 35% 'dark': data on their network that they know nothing about.

Half of their data is redundant, obsolete, or trivial (ROT), and only 16% is business-critical. 

This means that despite the average spend on data risk management initiatives – such as security, data protection, and resiliency – increasing by 6.72% in 2020 compared to the previous year, IT decision-makers are unaware of the value of more than a third of their data.

This lack of insight could prove fatal in a landscape where cybercriminals relentlessly pursue weaknesses to initiate sophisticated attacks. Data protection relies on a thorough understanding of the value and location of the data that needs to be protected. Before cloud data sets can be properly protected from threats like ransomware, IT teams need to know exactly what data has been sent to which cloud service. 

Such murkiness lays a poor foundation for delivering on heightened customer expectations and changing working models. While organizations might have rapidly adjusted to a 'new normal', there is no slowing down: businesses continue to rely on sophisticated cloud infrastructure to answer an array of competing priorities. In the midst of this, mapping organizational data in detail could seem like a heavy burden to place on the shoulders of already-stretched IT teams. It doesn't have to be.

Advancements in AI and machine learning can quickly clear the fog, shining a light on what data is stored, where it lives, and its value and use. Not only does this allow for robust security implementations, it also enables greater insight, powering data-driven decision-making.
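
As a concrete illustration of the underlying technique, the sketch below trains a simple supervised text classifier, assuming scikit-learn is installed, to tag documents as 'business-critical' or 'ROT'. The training snippets and labels are invented for illustration; production classification tools draw on far richer signals, such as access patterns, ownership, and retention policy, than raw text alone.

```python
# Illustrative sketch of ML-assisted data classification, assuming
# scikit-learn is installed. The training snippets and labels are
# invented; real tools use richer signals than document text alone.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

train_docs = [
    "Q3 board financials and revenue forecast",
    "customer contract renewal terms and SLA",
    "office party photos from the 2015 summer event",
    "draft of a superseded and obsolete travel policy",
]
train_labels = ["business-critical", "business-critical", "ROT", "ROT"]

# TF-IDF features feeding a logistic regression classifier.
clf = make_pipeline(TfidfVectorizer(), LogisticRegression())
clf.fit(train_docs, train_labels)

for doc in ["signed supplier contract and pricing schedule",
            "photos from the old cafeteria opening"]:
    print(doc, "->", clf.predict([doc])[0])
```

The design choice that matters here is supervision: once even a small labeled sample exists, a model can propose tags for the 'dark' remainder far faster than manual review ever could.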

With a holistic approach to data protection, these new workloads and data sets can be captured simply. Working with an experienced partner to implement such technologies ensures alignment with business objectives while alleviating the burden on IT staff.

Whose responsibility is cloud data protection anyway?

What has become clearer throughout the pandemic is that cloud security comes with an added complication - what should organizations be expecting from Cloud Service Providers (CSPs)? 

Worryingly, Veritas' Truth in Cloud study found that 69% of enterprises believed their cloud service provider was responsible for data protection and data privacy. However, the majority of CSPs' end-user license agreements contain clauses that make the customer responsible for most data protection. This confusion over whose job it is to secure data means that, often, no one is doing it – allowing gaps in cloud-based data protection to widen. The problem is amplified with each additional CSP that is brought into the equation.

The question is, how can organizations start to rectify the situation when they don't know which CSPs they've engaged and what data is held where? Mapping data to identify its location and value is the first step in closing the vulnerability gap left in the wake of rapid transformation. 
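
For teams that know at least one of their providers, even a first-pass inventory pays off. The sketch below, assuming an AWS environment with boto3 installed and read-only credentials configured, walks an account's S3 buckets and tallies objects that carry a classification tag against those that are effectively 'dark'. The classification tag key is a hypothetical in-house convention, not an AWS or Veritas standard.

```python
# First-pass data mapping sketch for AWS S3, assuming boto3 is installed
# and read-only credentials are configured. The "classification" tag key
# is a hypothetical in-house convention, not an AWS or Veritas standard.
import boto3

s3 = boto3.client("s3")
tagged, dark = 0, 0

for bucket in s3.list_buckets()["Buckets"]:
    paginator = s3.get_paginator("list_objects_v2")
    for page in paginator.paginate(Bucket=bucket["Name"]):
        for obj in page.get("Contents", []):
            tags = s3.get_object_tagging(
                Bucket=bucket["Name"], Key=obj["Key"]
            )["TagSet"]
            if any(t["Key"] == "classification" for t in tags):
                tagged += 1
            else:
                dark += 1  # no classification tag: effectively dark data

total = tagged + dark
if total:
    print(f"{tagged}/{total} objects classified, {100 * dark / total:.0f}% dark")
```

At any real scale, per-object tagging calls like these would give way to bulk exports such as S3 Inventory, but the principle is the same: establish location first, then value.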

Aligning security measures with cloud deployments will undoubtedly be a challenge for organizations continuing to evolve, but help is available.

The important thing is for businesses to be pragmatic about how they address the issues. Many companies now have petabytes of data spread across dozens of hosted applications and cloud services. Getting that under control is unlikely to be achievable as a manual process. It's important to lean into autonomous data management solutions to bear the brunt of identifying, classifying, managing, and protecting data.

Closing this vulnerability gap need not drain resources. Instead, security can be assured while IT teams are free to focus on the innovation and transformation that will ensure recovery, resurgence, and growth, even in the face of continued external uncertainty.

Ian Wood

Ian Wood is the Senior Director and Head of Technology at Veritas, a global leader in data management. He has more than 29 years of industry experience and is passionate about technology.