Consider this situation: your organization was just hit by a cyberattack and all the data you rely on to function is suddenly inaccessible. Your files are locked up, encrypted by the hacker, who is now demanding a hefty payment to restore them. Do you pay up? Or do you have a backup?
Unfortunately for IT professionals, this is becoming an all-too-common scenario. Last year, ransomware once again ranked as the most common form of cyberattack globally, costing victims an average of $4 million per breach. These attacks—which employ malware to lock an organization’s data until a ransom is paid—are rapidly growing in prominence: according to some analysts, ransomware is on pace to top $250 billion in cumulative damages by 2031.
In response, organizations have been devoting increasing security resources to the prevention of ransomware and other data breaches; however, many fail to plan properly for the recovery of their data after an attack has occurred. This is a critical oversight, as recovery from a breach can stretch into months, crippling business operations along the way.
Minimizing the impact of ransomware attacks demands a shift in mindset: instead of simply shoring up defenses with security solutions and services and hoping a breach never happens, organizations must begin treating these attacks as an inevitability. This requires a data resilience plan—specifically, one where files are protected in a way that makes them invulnerable to cyber criminals. Thanks to modern technology, including artificial intelligence (AI), such a defense is easier to establish and manage than you might think.
For decades, the common practice for data resilience has been snapshot-based backup. Data is copied to an archive at set intervals, and when something goes wrong with the working copy, administrators can simply reach back and pull the most recent clean snapshot to replace any problematic files.
But, over time, ransomware attacks have become more sophisticated, and can infect not only the working data but the snapshots, as well. This leaves compromised organizations with a difficult decision: spend countless hours poring through backups in search of uninfected data, or simply pay the ransom.
For organizations like hospitals where downtime is particularly problematic, the most prudent choice is often to just pay. Even in sectors that operate under less stress, the cost of carrying out a protracted recovery effort can exceed the price of simply ponying up to the attackers. And even when successful, there’s bound to be a significant amount of data lost between the start of the breach and the end of recovery.
Assistive remediation and intelligent recovery
Fortunately, new technologies are emerging to help limit the damage attackers can inflict on data backups, such as the use of AI to detect cyberattacks on data as they happen. Data written to a storage device exhibits a measurable degree of randomness, known as entropy. When a ransomware infection begins encrypting files, that pattern shifts: encrypted data is far more random than typical working data. By applying AI and entropy algorithms in real time as the data is being written (a technique known as inline data corruption detection), it is possible to spot these changes in write patterns at the disk drive level, allowing administrators to immediately seal off the backups and prevent further damage.
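To make the idea concrete, here is a minimal sketch of entropy-based detection. The Shannon entropy of a block of bytes ranges from 0 to 8 bits per byte; encrypted or compressed data approaches the maximum, while ordinary documents sit well below it. The threshold value is an illustrative assumption, not a figure from any specific product; real inline detectors combine entropy with other signals.

```python
import math
from collections import Counter

def shannon_entropy(data: bytes) -> float:
    """Shannon entropy of a byte block, in bits per byte (0.0 to 8.0)."""
    if not data:
        return 0.0
    n = len(data)
    counts = Counter(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Hypothetical cutoff: well-encrypted data approaches 8 bits/byte,
# while text, databases, and office documents typically fall far lower.
ENTROPY_THRESHOLD = 7.5

def looks_encrypted(block: bytes) -> bool:
    """Flag a write whose randomness suggests ransomware encryption."""
    return shannon_entropy(block) > ENTROPY_THRESHOLD

plaintext = b"quarterly report: revenue up 4% year over year\n" * 100
uniform = bytes(range(256)) * 20  # every byte value equally likely: max entropy

print(looks_encrypted(plaintext))  # False: repetitive text, low entropy
print(looks_encrypted(uniform))    # True: 8.0 bits/byte, above threshold
```

In production, a check like this runs inline on each write and triggers an alert or fences the backup target when a sustained spike appears, rather than acting on a single block.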
The best backup systems, however, employ what are known as immutable snapshots to preserve data in a way that is impervious to cyberattack. This method creates a permanent, unalterable copy within the storage array that no machine or individual can directly modify. The snapshot itself can never be changed; any alterations are made to a copy of it, leaving the original intact. Machine learning also comes into play, checking copies of the immutable snapshot to validate that no corruption is present. In the event of a breach, administrators can simply pull a clean copy of the immutable snapshot to restore their data set. Immutable backups are also extremely valuable when data is altered accidentally, whether through file corruption or human error.
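The write-once, restore-from-copy discipline described above can be sketched in a few lines. This is a hypothetical in-memory model, not any vendor's implementation: each snapshot is stored behind a read-only view, and a restore always hands back a fresh copy, so the original is never exposed to modification.

```python
from types import MappingProxyType

class SnapshotStore:
    """Toy model of an immutable snapshot store (illustrative only)."""

    def __init__(self):
        self._snapshots = []

    def take_snapshot(self, live_data: dict) -> int:
        # Freeze a point-in-time copy; MappingProxyType rejects in-place writes,
        # so nothing holding this reference can alter the snapshot.
        self._snapshots.append(MappingProxyType(dict(live_data)))
        return len(self._snapshots) - 1

    def restore(self, snapshot_id: int) -> dict:
        # Restores return a mutable *copy*; the stored original stays intact.
        return dict(self._snapshots[snapshot_id])

store = SnapshotStore()
live = {"ledger.db": "clean contents"}
snap_id = store.take_snapshot(live)

live["ledger.db"] = "ENCRYPTED BY RANSOMWARE"  # attack corrupts the working copy

recovered = store.restore(snap_id)
print(recovered["ledger.db"])  # prints "clean contents"
```

Real storage arrays enforce immutability in hardware and firmware rather than in application code, but the contract is the same: reads and copies are allowed, writes to the snapshot are not.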
Naturally, this process can consume a significant amount of storage space; to keep it feasible, immutable snapshots are captured only at set intervals, such as once an hour. The ideal data resilience strategy therefore combines entropy algorithms with immutable snapshots: detection allows recovery efforts to begin the moment a breach occurs, minimizing the loss of data created in the interval since the last immutable backup.
Covering your bases
As data continues to grow in importance to organizations of all kinds, the stakes are far too high to rely on proactive security methods alone. The first step toward covering your bases should be a thorough assessment of your data backup infrastructure, followed by investment in modern, AI-enabled solutions that employ safeguards such as immutable snapshots, entropy-based detection, and automated, validated data restores. Then, you can begin to develop a comprehensive strategy for getting your organization back on track.
Denis Kennelly is General Manager for IBM Storage, where he is responsible for delivering industry-leading Storage offerings and leading the IBM Storage business.