The silence is shattered; a nuclear bomb has been detonated. A storm of screams, broken glass and panic rains down. As particles collide and congeal, atoms crash into one another and a wave of energy rolls into a tight coil. It's time to say your last prayer. Or is it?
Thankfully this is just a simulation running on a supercomputer at the Lawrence Livermore National Lab in California. Here, with 500 teraflops of processing power at their fingertips, researchers have gazed straight into the eye of an atomic explosion. What they've seen is classified.
Happily, researchers aren't too frustrated, as simulating a nuclear explosion can provide more telling and useful results than letting one off for real.
But what if we gave scientists machines that dwarf today's most powerful systems? What could they tell us about the nature of nuclear explosions then? Indeed, what else could they discover about the world?
This is the story of the quest for an exascale computer – and how it might change our lives.
What is exascale?
One exaflop is 1,000 times faster than a petaflop. The fastest computer in the world is currently the IBM-built Roadrunner, housed at Los Alamos National Laboratory in New Mexico. Roadrunner runs at an astounding one petaflop, which equates to more than 1,000 trillion operations per second.
The supercomputer has 129,600 processing cores and takes up more room than a small house, yet it's still not quite fast enough to run some of the most intense global weather simulations, nuclear tests and brain modelling tasks that modern science demands. For example, the lab currently uses Roadrunner's processing power to run complex visual cortex and cellular modelling experiments in near real time.
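To put those figures in perspective, here's a quick back-of-the-envelope calculation in Python. It uses only the petaflop speed and core count quoted above; everything else is simple arithmetic, not lab data.

```python
PETAFLOP = 10**15                 # 1,000 trillion operations per second
EXAFLOP = 1000 * PETAFLOP         # an exaflop is 1,000 petaflops

roadrunner_speed = 1 * PETAFLOP   # Roadrunner's headline performance
roadrunner_cores = 129_600        # its processing cores

# Average contribution of each core to the total
per_core = roadrunner_speed / roadrunner_cores
print(f"~{per_core / 10**9:.1f} billion operations per second per core")

# How many Roadrunners would it take to reach exascale?
print(f"1 exaflop = {EXAFLOP // roadrunner_speed} Roadrunners")
```

In other words, an exascale machine would be worth a thousand Roadrunners working flat out.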
In the next six months, the computer will be used for nuclear simulation and stockpile tests to make sure that the US nuclear weapon reserves are safe. However, when exascale calculations become a reality, the lab could step up to simulating interactions between the oceans and the atmosphere. These are not currently possible because the data streams involved are simply too large.
The move to exascale is therefore critical, because researchers require increasingly fast results from their experiments. "Current models represent a balance between the resolution of the model, which can be represented as the distance between the geographical data points used, and the time the model takes to run," says Ed Turkel, a product manager for scalable computing and infrastructure at HP.
"Increasing the resolution – moving the data points closer together – increases the accuracy of the models but dramatically increases the time to compute a solution with a system of a given size. So you need bigger and bigger systems to cope with the resolution and accuracy, while making sure you have the result in a reasonable amount of time."
Being demanding sorts, scientists don't just want the answers more quickly; they also want more accurate answers. Equations covering wind turbulence, material strength and how substances behave under stress are all begging for improvement, explains Mark Seager, the Assistant Department Head for Advanced Technology at Lawrence Livermore.
"We want to make predictive statements about the safety of the nuclear stockpile. As it ages, it behaves differently from the conditions under which it was originally tested. It's like your car in the garage – if you leave it in the garage it starts to rust. We have to understand the changes [just like] you really need that car to start. Parts decay over time, and we have to predict the performance of the changes. We need high-resolution full-physics simulations in order to do that."
Of course, a car not starting and a bomb detonating due to rust and the ravages of time are very different things. To explain how exascale computing would help scientists better understand how nuclear bombs age, Seager gives the example of how nuclear data from before 1995 was presented.
It was set out and interpreted on a flat map that you could lay on a table. Back then, scientists were only analysing the 2D data. After the weapons have aged for 30 years, however, they end up in different states of deterioration. If one faulty part of the stockpile is not detected and remedied, it could explode – and then set the other weapons off. Suddenly the problem of projecting behaviour becomes a whole lot more complex.
In order to understand how one part of the stockpile impacts on another, the data must be analysed in 3D. Working at petascale levels, researchers were finally able to simulate a nuclear stockpile and see the data as a whole. At exascale speeds, researchers would not only be able to examine all of the data, they would also be able to see the complex relationships between one part of the nuclear stockpile and the others.
The current situation can be thought of as a 3D 'tube': today, petascale computers can see the entire tube in 3D, but they can't compare one end of it with the other. Exascale processing would enable visualisation of the 3D data as a whole, as well as all of the different interrelationships between various components.
"We see a lot of opportunity to do multiscale simulations," says Seager. "In biology, if you're interested in cell division, the action takes place in tenths of seconds. Modelling the same thing at a molecular level takes femtoseconds (a tenth to a fifteenth of a second). It's 16 to 17 orders of magnitude faster [in simulation]. If you want to do cell division, you have to model how the DNA splits up. It's a complicated molecular reaction, and multiscale helps you resolve the biology as fast as possible."