The difference between mean solar time (UT1) and UTC determines whether a leap second should be added or, in principle, subtracted. Think of UTC as the universal time that every country sticks to, and UT1 as a measure of the Earth's actual rotation that UTC must be kept in step with.
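The bookkeeping can be sketched in a few lines of Python. The 0.9-second threshold is the real IERS limit, but the function name and decision logic here are our own illustrative simplification, not the actual IERS scheduling procedure:

```python
def leap_second_adjustment(dut1: float) -> int:
    """
    dut1 is UT1 - UTC in seconds. The IERS keeps |UT1 - UTC|
    below 0.9 s; when the difference approaches that limit,
    a leap second is scheduled.
    Returns +1 (insert a second), -1 (remove one) or 0 (no action).
    """
    LIMIT = 0.9
    if dut1 <= -LIMIT:
        return +1   # Earth running slow: insert a second into UTC
    if dut1 >= LIMIT:
        return -1   # Earth running fast: remove a second (never yet needed)
    return 0
```

Every leap second actually applied so far has been an insertion; the Earth's rotation has only ever lagged behind atomic time.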
Louis Essen and Jack Parry built the world's first accurate atomic clock, Caesium I, in 1955 at the National Physical Laboratory (NPL) and changed timekeeping forever. You can see the actual clock on display in the Making the Modern World gallery at the Science Museum in London.
The new atomic clock made it possible to define the second precisely – the SI second has been defined in terms of the caesium atom since 1967 – and thus to create universal standards that the whole world could adopt. Atomic devices had been built before Essen's clock, notably a 1949 ammonia-based device at the US National Bureau of Standards, but his was the first accurate enough to improve on the quartz clocks that had been the standard until then.
Atomic clocks basically work by locking an electronic oscillator to an atom's natural resonance – for caesium, the microwave frequency absorbed when its outermost electron flips between two closely spaced 'hyperfine' energy levels. The resonance of the caesium-133 atom (the current standard) forms the basis of UTC time. Atomic clocks can also use hydrogen or rubidium atoms, but caesium-133 has proved the most stable and accurate. However, the story of the atomic clock doesn't stop there.
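That resonance is what defines the second itself: since 1967 the SI second has been exactly 9,192,631,770 periods of the caesium-133 hyperfine transition. A quick sketch of the arithmetic:

```python
# The SI second (since 1967) is defined as exactly 9,192,631,770
# periods of the caesium-133 hyperfine ground-state transition.
CS133_HYPERFINE_HZ = 9_192_631_770

# Duration of a single oscillation, in seconds (~1.09e-10 s):
period = 1 / CS133_HYPERFINE_HZ

# Counting exactly that many cycles yields one second, by definition:
one_second = CS133_HYPERFINE_HZ * period
```

A clock, in other words, is just a very reliable cycle counter attached to a very reliable oscillator.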
Keeping time in the US
One of the most accurate clocks in the world is NIST-F1, the caesium fountain that serves as the primary time and frequency standard for the United States. It's described as a fountain atomic clock because of the way in which the caesium atoms are tossed, fountain-like, through the clock's main chamber. Scientists have, however, been experimenting with other elements to try to improve the accuracy of atomic clocks even further.
Optical techniques have more recently been combined with strontium atoms to produce an atomic clock whose 'tick' is more precise than that of the current caesium-based clocks. A mercury-ion clock has also been built that has proved more accurate still.
Earlier this year, physicists at JILA, a joint institute of the Commerce Department's National Institute of Standards and Technology (NIST) and the University of Colorado at Boulder, successfully demonstrated a new strontium clock that is much more accurate than the NIST-F1. The strontium clock's accuracy is astonishing, with its designers stating that it would neither gain nor lose a second in 200 million years.
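That headline figure translates directly into a fractional frequency uncertainty, as a quick back-of-envelope calculation shows:

```python
# Converting "one second gained or lost in 200 million years" into a
# fractional frequency uncertainty (back-of-envelope figures only).
SECONDS_PER_YEAR = 365.25 * 24 * 3600       # ~3.16e7 s

years = 200e6
total_seconds = years * SECONDS_PER_YEAR    # ~6.3e15 s

fractional_uncertainty = 1 / total_seconds  # ~1.6e-16
```

An error of roughly 2 parts in 10^16 – a level of precision at which even the height of the clock above sea level starts to matter, thanks to general relativity.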
Two hundred million year question
The key to the new clock's accuracy is that it uses lasers instead of microwaves: light waves oscillate tens of thousands of times faster, so the clock can divide time into far smaller slices and deliver a more precise reading. Optical atomic clocks look set to become the future of accurate timekeeping, but NIST research into quantum computers and how they can be applied to timekeeping could produce the world's first quantum clock, more accurate still than optical atomic clocks.
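To see why a faster 'pendulum' helps, compare the frequencies involved. The strontium figure below is the approximate optical clock transition frequency; treat it as indicative:

```python
# Why optical clocks slice time more finely: the 'pendulum' swings
# tens of thousands of times faster than in a microwave clock.
CS_HZ = 9_192_631_770          # caesium-133 microwave transition (exact)
SR_HZ = 429_228_004_229_873    # strontium-87 optical transition (approx.)

# How many optical ticks occur per caesium tick (~47,000):
ratio = SR_HZ / CS_HZ
```

Each second is carved into roughly 47,000 times as many ticks, which is the root of the optical clock's finer resolution.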
The atomic clocks that contribute to the UTC time we all use in our lives are large and bulky. What if you could shrink an atomic clock and place it on your desk, or miniaturise it even further and put it on a chip? This pioneering work on chip-scale atomic clocks has produced its first prototype, which could replace existing quartz-based clocks by matching their size and power consumption while far exceeding their accuracy.
Expect to see the standard RTC (Real-Time Clock) in your PC replaced with atomic clock technology as the miniaturisation process continues.
Accurately tracking the time is not just an area of science that is reserved for the world's top research laboratories. Your computer's desktop can also be transformed into an accurate timepiece that will enable you to measure local time, see what the time is anywhere in the world, and use atomic clocks to ensure that your own timepieces are always accurate.
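In practice your PC stays in step with atomic time via NTP, whose public servers are ultimately traceable to atomic clocks. As a minimal sketch, here is how the transmit timestamp inside a 48-byte NTP packet is decoded; the demo packet is hand-built rather than fetched from a live server such as pool.ntp.org:

```python
import struct
from datetime import datetime, timezone, timedelta

# NTP timestamps count seconds since 1 January 1900 (UTC) in
# 32.32 fixed point; the transmit timestamp sits at bytes 40-47.
NTP_EPOCH = datetime(1900, 1, 1, tzinfo=timezone.utc)

def parse_ntp_transmit(packet: bytes) -> datetime:
    """Decode the transmit timestamp from a 48-byte NTP packet."""
    secs, frac = struct.unpack("!II", packet[40:48])
    return NTP_EPOCH + timedelta(seconds=secs + frac / 2**32)

# Demo with a hand-built packet rather than a live network query:
demo = bytearray(48)
demo[40:48] = struct.pack("!II", 3_913_056_000, 0)  # an arbitrary instant
print(parse_ntp_transmit(bytes(demo)))
```

A real client would send a request to a server, read the 48-byte response and apply this decoding, so every desktop clock on the internet inherits its accuracy from a caesium fountain somewhere.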