The amount of data in existence grows by the second. By one widely cited estimate, 90% of the world's data was created in just the last two years. It's not just the amount that's increasing, but the value too. Businesses of all sizes have found data to be amongst their most important assets, which is why a loss event can be a truly frightening proposition.
For a company like Amazon, data loss and its associated downtime can be especially pricey (a 2013 estimate put the cost at more than $65M per minute), but you don't have to be the biggest retailer in the world to feel the pain.
A recent study conducted by my company, in conjunction with research firm IDC, found that downtime can be a killer for SMBs as well. Of the 400 companies surveyed, nearly 80% estimated that downtime costs them at least $20,000 per hour, and more than 20% estimated it costs them at least $100,000 per hour.
Loss events can be felt by all areas of the business — from sales and marketing to HR and R&D — and can have long-term negative consequences on productivity, revenues and, in some instances, even public perception.
A few recent high-profile incidents have made the potential impacts of data loss abundantly clear. A computer crash caused the loss of thousands of emails sent by former IRS official Lois Lerner, who is currently the target of a congressional probe, resulting in a serious setback for investigators.
Similarly, code hosting company CodeSpaces recently fell victim to an attack and extortion attempt that resulted in the partial or complete deletion of most of its data and forced the company to shut down. The reason? All of its data and backups were stored on Amazon Web Services, and both were wiped out in the attack.
The 3-2-1 rule
These incidents prove that today, more than ever, all companies need a comprehensive data protection plan, so that at least one copy of data will always survive. A tried and true method to follow is the 3-2-1 rule: Make three copies of every piece of important data, store that data in two different formats and keep one copy offsite.
This ensures complete protection in the event of a data disaster or breach. The 3-2-1 rule guarantees that no matter what — whether it's human error, hardware failure or natural disaster — your company is protected and able to focus on the bottom line.
So how exactly does the 3-2-1 rule work?
3 copies of data

In addition to the original copy, you should always have two additional copies of your data, whether it's on a server, network attached storage, a hard drive, the cloud or somewhere else. This ensures that no single event will wipe out all of your important data.
2 different formats

The second part of the 3-2-1 rule states that you should keep copies of your data on at least two different media or storage types. This typically means an internal drive plus an external medium such as disk, tape, flash, or network or cloud storage.
Ideally, one of your local backups should be created using image technology, which captures the entire operating system, applications and files, so you don't have to reinstall or reconfigure your systems and preferences. In essence, an image backup lets you restore an entire system to the way it was at a specific point in time. Because each medium has different failure modes, spreading copies across media ensures there is no single common failure mode.
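To make the rule concrete, a backup inventory can be checked against all three conditions programmatically. The sketch below is a minimal illustration in Python; the `BackupCopy` record and the location and media names are assumptions invented for this example, not part of any particular backup product.

```python
from dataclasses import dataclass

@dataclass
class BackupCopy:
    # Hypothetical record describing one stored copy of a data set.
    location: str   # e.g. "office-nas", "aws-s3"
    media: str      # e.g. "internal-disk", "tape", "cloud"
    offsite: bool   # stored away from the primary site?

def satisfies_3_2_1(copies: list[BackupCopy]) -> bool:
    """Check the 3-2-1 rule: >=3 copies, >=2 media types, >=1 offsite."""
    enough_copies = len(copies) >= 3
    enough_media = len({c.media for c in copies}) >= 2
    has_offsite = any(c.offsite for c in copies)
    return enough_copies and enough_media and has_offsite

copies = [
    BackupCopy("workstation", "internal-disk", offsite=False),  # original
    BackupCopy("office-nas", "nas", offsite=False),             # local backup
    BackupCopy("aws-s3", "cloud", offsite=True),                # offsite backup
]
print(satisfies_3_2_1(copies))  # True: 3 copies, 3 media types, 1 offsite
```

Dropping either the cloud copy or one of the local copies makes the check fail, which is exactly the point of the rule: no single condition is enough on its own.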
1 copy offsite
Storing at least one copy off premises is essential to protect against physical disasters like fire, flood or theft. When creating multiple copies, preserving the integrity of the initial copy is paramount; otherwise every replica built from that copy will carry the same flaws. If you're storing to multiple locations, all files must be checked for consistency. From there, deduplication eliminates redundant data blocks, encryption adds security, and cataloging and indexing allow for quick retrieval.
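Checking copies for consistency usually comes down to comparing checksums. A minimal sketch in Python, using the standard library's SHA-256 (the file paths are hypothetical; a real tool would also handle missing files and report which replica diverged):

```python
import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    # Hash the file in chunks so large backups don't exhaust memory.
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1024 * 1024), b""):
            digest.update(chunk)
    return digest.hexdigest()

def copies_consistent(original: Path, replicas: list[Path]) -> bool:
    """Every replica must hash identically to the original."""
    expected = sha256_of(original)
    return all(sha256_of(r) == expected for r in replicas)
```

If any replica's hash differs from the original's, that copy has silently drifted and should be rebuilt before it propagates further.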
Today's IT managers face a significant challenge. The amount of data each company generates grows every second, and with it, the need for a solid backup strategy. Finding such a strategy requires research and careful consideration. Managers should be on the lookout for solutions that can adapt to rapidly changing, high-velocity data and that meet a company's needs for security, deduplication, replication, speed and efficiency.
And while it's tempting to put these tasks off until tomorrow, planning is essential to preventing a disaster. For every company, data loss is just a question of when, not if.
- Joel Berman is a Fellow at Acronis