A zettabyte might not be a word you've heard of – even Word's spellchecker doesn't recognise it – but consider it in terms of a more familiar unit. A standard smartphone today has around 32 gigabytes of storage. To get to one zettabyte you would have to completely fill the storage capacity of 34,359,738,368 smartphones.
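That smartphone figure checks out as a back-of-the-envelope calculation, assuming binary (base-2) prefixes, i.e. a zettabyte of 2^70 bytes and a 32 GB phone of 32 × 2^30 bytes:

```python
# One zettabyte expressed in 32 GB smartphones, using binary prefixes.
ZETTABYTE = 2 ** 70          # bytes in one zettabyte (base-2 convention)
SMARTPHONE = 32 * 2 ** 30    # 32 GB of phone storage, in bytes

phones = ZETTABYTE // SMARTPHONE
print(f"{phones:,}")         # 34,359,738,368
```

With decimal (base-10) prefixes the answer would be a slightly smaller 31.25 billion; the article's number follows the binary convention.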
At the current rate of production, by 2016 the world will be producing more digital information than it can easily store. By 2020, we can predict a minimum capacity gap of over six zettabytes, nearly double all the data produced in 2013.
TRP: If the world is running out of storage, why can we not simply increase production of hard drives and build more data centres?
MW: Unfortunately, the imminent breach between storage demand and production is not a problem that can so easily be solved. The fact of the matter is that it's far harder to manufacture capacity than it is to generate data. Building factory capacity that is capable of meeting such stratospheric demand would take hundreds of billions in investment. It's simply not a realistic option.
Another factor is the technology in use by the storage industry today. Even if the investment was there and thousands of new data centres could be commissioned, it's becoming more difficult on a molecular level to squeeze increasingly dense volumes of information onto the same amount of space.
Seagate produced its first ever hard drive in 1979: it had 5MB of storage and would have cost a few months' wages. What it could store today is about 2 seconds of low-resolution video shot on a smartphone, or 2 high-resolution photos. A modern 5TB hard drive will set you back less than £200 and is capable of storing 2 million photos, 2.5 million songs and about 1,000 movies. Although it's not physically any larger than our oldest hard drive, in capacity it's actually 1,000,000 times bigger.
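The million-fold claim is easy to verify, assuming the decimal prefixes that drive manufacturers use (1 MB = 10^6 bytes, 1 TB = 10^12 bytes):

```python
# Capacity ratio of a modern 5TB drive to Seagate's original 5MB drive,
# using the decimal prefixes conventional for drive marketing.
MEGABYTE = 10 ** 6
TERABYTE = 10 ** 12

first_drive = 5 * MEGABYTE    # 1979: 5MB
modern_drive = 5 * TERABYTE   # today: 5TB

print(modern_drive // first_drive)  # 1000000
```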
So, while the ability to squeeze ever more dense data onto the same amount of space is a real testament to human ingenuity and engineering, it's starting to reach the point where new technologies will have to take over.
TRP: What are some of the latest innovations in data storage that could help close the data capacity gap in 2020?
MW: Silicon may be the work-horse that has helped us get to where we are today, but it's starting to show its age. Fortunately, there is an impressive amount of innovation taking place in the industry at the moment and a number of these advances could help us to seal the data storage breach over the next five to 10 years.
RRAM (resistive random access memory) is one such example. A smart type of computer memory, this could, in theory, let us store tens or even hundreds of times as much data on our smartphones. The high difficulty and costs of production have meant that many companies have overlooked it in the past, but researchers at Rice University have recently had a breakthrough. They have shown a way to make RRAM at room temperature and with far lower voltages. Some prototypes have even been proven to store data densely enough to enable a terabyte chip the size of a postage stamp.
If RRAM doesn't seem quite far enough removed from the world of silicon-based storage, there's also DNA to consider. Last year, a team of scientists from the European Bioinformatics Institute reportedly stored a complete set of Shakespeare's sonnets, a PDF of the first paper to describe DNA's double helix structure, a 26-second mp3 clip from Martin Luther King Jr.'s "I Have a Dream" speech, a text file of a compression algorithm, and a JPEG photograph in a strand of DNA no bigger than a speck of dust. Another forward-looking team at Harvard University's Wyss Institute has demonstrated storing 5.5 petabits, or about 700 terabytes, of digital data in a single gram of DNA.
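Those two figures are the same density quoted in different units: petabits convert to terabytes by dividing by 8 (bits to bytes) and by 1,000 (peta to tera). A quick sketch of the conversion, using decimal prefixes:

```python
# Convert the Wyss Institute's quoted DNA storage density
# from petabits to terabytes (decimal prefixes assumed).
petabits = 5.5
bits = petabits * 10 ** 15       # peta -> bits
bytes_ = bits / 8                # bits -> bytes
terabytes = bytes_ / 10 ** 12    # bytes -> terabytes

print(terabytes)                 # 687.5, i.e. roughly 700 TB per gram
```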