7 key tech innovators you've never heard of
Unsung heroes behind the gear and gadgets we use every day
Real genius is often slow to develop. No one would have guessed that Albert Einstein would become the foremost physicist of our age when he flunked the French portion of his college entrance exam (but not the maths, as everyone assumes), or that Richard Feynman - who was known to visit topless bars in his youth - would help build the atomic bomb.
Intelligence sometimes needs time to germinate and grow before it fully matures - and before the rest of us mere mortals can appreciate an insanely high IQ.
In technology, this process seems to take even longer. Bill Gates, a horn-rimmed dweeb who snuck into his dad's office to play games on a mainframe, went on to co-found Microsoft, the company behind both MS-DOS and Windows.
Steve Jobs, the Apple co-founder, started out as a phone prankster before he dreamed up the iPhone. But you probably already know about those guys. Here are a few lesser-known geek geniuses, along with what they did that makes them so technologically special.
1. John von Neumann, Stored-Program Architecture
Serving as a consultant in 1944 on the EDVAC computer - which used binary numbers instead of the more common decimal - von Neumann was the first to set out the stored-program architecture, in which a program's instructions are held in the same memory as its data, and the CPU, memory, I/O and storage are separate, interconnected components. This early design is used in just about every electronic device - from your smartphone to a desktop PC. In fact, system designers are still trading on the modularity of the von Neumann architecture: Apple's recent Mac Pro lets you remove a tray holding eight memory slots for easy access.
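To see what "stored-program" means in practice, here's a minimal sketch (a toy of my own, not taken from the EDVAC design) of a machine whose instructions and data sit side by side in one memory, with the CPU fetching both from it:

```python
# Toy von Neumann machine: one memory array holds both the program
# (instruction tuples) and the data (plain numbers).
memory = [
    ("LOAD", 5),    # 0: load the value at address 5 into the accumulator
    ("ADD", 6),     # 1: add the value at address 6
    ("STORE", 7),   # 2: store the accumulator at address 7
    ("HALT", None), # 3: stop
    None,           # 4: (unused)
    2,              # 5: data
    3,              # 6: data
    0,              # 7: result lands here
]

pc = 0   # program counter: where in memory the next instruction lives
acc = 0  # accumulator

while True:
    op, addr = memory[pc]  # fetch the instruction from the same memory as the data
    pc += 1
    if op == "LOAD":
        acc = memory[addr]
    elif op == "ADD":
        acc += memory[addr]
    elif op == "STORE":
        memory[addr] = acc
    elif op == "HALT":
        break

print(memory[7])  # 5
```

Because the program is just data in memory, it can be loaded, swapped or even modified like any other data - the key insight that separated stored-program machines from earlier computers rewired for each new task.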
2. Alan Turing, Turing Test
The story behind this London-born mathematician is one of major success and high-profile failure. In 1945, according to Britannica.com, the National Physical Laboratory in London recruited Turing to design the first stored-program digital computer. His design, the ACE, was deemed too complex for the engineering methods of the day, so the NPL built a less complicated version. In 1948, the Royal Society Computing Machine Laboratory at the University of Manchester got a stored-program computer running first, beating Turing to the punch.
Turing left the NPL and went on to devise the Turing Test, which is still used in computing as a way to probe whether a machine can convincingly imitate a human. Turing produced other foundational computing theory, but his story ended in despair: convicted of gross indecency (homosexuality was then a crime in England) and sentenced to hormone "therapy", he eventually committed suicide.
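The test itself is a simple protocol: an interrogator exchanges text with a hidden respondent and must judge, from the answers alone, whether it is a human or a machine. Here's a minimal sketch (my own illustration, not Turing's formal setup) with a trivial canned-response "machine" standing in for the hidden player:

```python
def machine_respondent(question):
    # A deliberately crude "machine" player with canned answers.
    replies = {
        "What is 2 + 2?": "4, of course.",
        "Do you enjoy poetry?": "I find sonnets particularly moving.",
    }
    return replies.get(question, "That's an interesting question.")

def interrogate(respondent, questions):
    """Collect question/answer pairs; a human judge would read the
    transcript and guess whether the respondent is human or machine."""
    return [(q, respondent(q)) for q in questions]

transcript = interrogate(machine_respondent,
                         ["What is 2 + 2?", "Do you enjoy poetry?"])
for q, a in transcript:
    print(f"Q: {q}\nA: {a}")
```

The machine passes only if judges can't reliably tell its transcript from a human's - which is why the test sidesteps "can machines think?" in favour of the observable question "can machines imitate?"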
3. Jack Kilby, Microchip
Jack Kilby invented the first microchip at Texas Instruments in 1958. About the size of a paper clip, the first chip had just one transistor - modern microchips have billions - and fit on a small piece of germanium mounted on a glass slide. Kilby demonstrated it by showing a sine wave undulating across an oscilloscope, ushering in a new age of electronic innovation. The microchip led to the first computer processor and all sorts of gadgets - from 60-inch HDTV sets to the GPS receiver in your car. Someday, a microchip may be implanted in your body, making quick grocery trips even faster.
4. Grace Hopper, Compiler
In 1952, everyone thought a computer was just for doing simple maths and nothing more. Grace Hopper, who would rise to rear admiral in the US Navy, had other ideas. She invented the first compiler, which translates human-readable code into machine language. Think of it as teaching your dog to write English: she figured out how to make a computer work beyond basic numeric calculations. Later, she developed the foundations for what would eventually become COBOL (common business-oriented language), one of the most common programming languages in business.
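What a compiler does can be shown in a few lines. Here's a minimal sketch (a toy of my own, nothing like Hopper's actual A-0 or COBOL) that translates an arithmetic expression into instructions for a simple stack machine, then runs them:

```python
import ast

def compile_expr(source):
    """Translate an expression like '2 + 3 * 4' into stack-machine code."""
    ops = {ast.Add: "ADD", ast.Sub: "SUB", ast.Mult: "MUL"}
    code = []
    def walk(node):
        if isinstance(node, ast.BinOp):
            walk(node.left)            # emit code for the left operand
            walk(node.right)           # then the right operand
            code.append((ops[type(node.op)], None))  # then the operation
        elif isinstance(node, ast.Constant):
            code.append(("PUSH", node.value))
    walk(ast.parse(source, mode="eval").body)
    return code

def run(code):
    """Execute compiled instructions on a tiny stack machine."""
    stack = []
    for op, arg in code:
        if op == "PUSH":
            stack.append(arg)
        else:
            b, a = stack.pop(), stack.pop()
            stack.append({"ADD": a + b, "SUB": a - b, "MUL": a * b}[op])
    return stack[0]

print(run(compile_expr("2 + 3 * 4")))  # 14
```

The two halves mirror Hopper's insight: people write in a notation they understand, and a program - not a human - does the tedious translation into what the machine executes.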
Although Hopper was wrongly credited with coining the term "bug" (no one knows who first thought of it), she did popularise the word "debug", which is far more helpful anyway - it means to rid a computer program of errors.
In 1969, the Data Processing Management Association awarded Hopper the 'Computer Science Man of the Year' accolade, presumably because they weren't sure a woman would ever qualify. She also has the distinction of retiring from the Navy twice. After her first retirement, the Navy recalled her because they couldn't figure out how to debug a payroll system. She retired again at the age of 79.
John Brandon has covered gadgets and cars for the past 12 years having published over 12,000 articles and tested nearly 8,000 products. He's nothing if not prolific. Before starting his writing career, he led an Information Design practice at a large consumer electronics retailer in the US. His hobbies include deep sea exploration, complaining about the weather, and engineering a vast multiverse conspiracy.