"Computers in the future will weigh no more than 1.5 tons."
Popular Mechanics, 1949
Before you dismiss this prediction as coming from an unlikely source, we should tell you that Popular Mechanics has been one of America's leading science and technology magazines for over 100 years. And as you'd expect from such an august publication, the prediction was, for the most part, spot-on – the vast majority of today's computers do indeed weigh in at less than 1.5 tons. Not all of them, though – not by a long way.
Jaguar, the world's fastest supercomputer, is housed at the Oak Ridge National Laboratory in Tennessee and weighs in at almost 200 tons. That doesn't even include the massive air conditioning units that are needed to get rid of the heat that's generated by almost a quarter of a million processor cores, which consume 10 megawatts of power between them.
FAT-CAT: Even today, some computers weigh more than 1.5 tons – this one considerably more
To be fair, though, at 1.75 petaflops, Jaguar is about two thousand billion times faster than 1949's latest and greatest.
"There is no reason for any individual to have a computer in his home." Ken Olsen, co-founder of Digital Equipment Corporation, 1977
He really ought to have known better. After all, the company Ken Olsen founded was responsible for the first of two important milestones in the history of home computing.
Prior to the early '60s, a computer was one thing and one thing only – a mainframe. It would be priced at hundreds of thousands of pounds, if not millions, occupy a whole room and require a full-time staff to operate and maintain it.
In 1965 DEC launched the PDP-8, which is generally considered the first commercially successful minicomputer. It was the size of a refrigerator, cost $18,000 and over 50,000 were sold – more than any other computer before it. For the first time, a computer could be owned by a single department rather than a huge organisation, and it could be operated by people who weren't scientists.
Computers were starting to pass from a select few to the many. Even more surprising, though, is the fact that Olsen made this statement after the second of those two milestones had passed. That was in 1975, when the MITS Altair 8800 became the first personal computer to sell more than a handful of units.
"640kB should be enough for anyone."
Bill Gates, 1981
He later denied it, but this was allegedly Bill Gates' take on the maximum amount of memory a computer would need. Even if he didn't actually say it, we can be pretty sure he believed it, as it seems fairly realistic in context.
Previous personal computers were based on 8-bit processors, whose 16-bit address buses meant they couldn't address more than 64kB of memory. But even this would have been the stuff of dreams for most home computer users of the day.
Perhaps the best known British home computer that year was the Sinclair ZX81, which had just 1kB of memory.
To put this in context, let's bring it up to date. If you were offered a PC today with 2.56TB of memory, wouldn't you think it was enough for anyone – at least for a few more years?
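For the curious, the arithmetic behind both figures is straightforward. A quick sketch in Python – note that the 4GB "typical PC" is our assumption for illustration, not a figure from the text:

```python
# An 8-bit CPU of the era, such as the Z80 in later Sinclair machines,
# had a 16-bit address bus, so it could address at most 2**16 bytes.
address_lines = 16
max_bytes = 2 ** address_lines
print(max_bytes // 1024, "kB")  # 64 kB

# Gates's alleged 640kB ceiling is 640 times the ZX81's 1kB.
# Scaling an assumed typical 4GB PC by the same 640x factor gives
# the 2.56TB figure quoted above (using decimal terabytes).
factor = 640          # 640kB versus the ZX81's 1kB
scaled_gb = 4 * factor
print(scaled_gb / 1000, "TB")  # 2.56 TB
```

In other words, today's hypothetical 2.56TB machine stands in the same ratio to a 4GB PC as Gates's alleged 640kB ceiling did to the ZX81's memory.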
"I have travelled the length and breadth of this country and talked with the best people, and I can assure you that data processing is a fad that won't last out the year."
Editor in charge of business books, Prentice Hall, 1957
The computer revolution might already have been almost 10 years old by this point, but computers were still pretty thin on the ground. With an estimated 100 of them in use in 1953 and 250 in 1955, this new technology wasn't exactly taking the world by storm.
What's more, the phrase 'data processing' refers to business applications, which were lagging well behind technical computing. Lyons, of teashop fame, launched LEO, the first ever business computer, in 1951. But by 1957, only one was in operation – and that was used by Lyons itself for valuation jobs and payroll processing. Even Big Blue was slow to make an impact on business computing.
Its first offering, the IBM 702 Electronic Data Processing Machine, was only in production from 1953 to 1954. Its replacement, the 705, broke new ground by being the first commercial computer to use magnetic core memory, but the number sold isn't on record. What we do know, though, is that back in the '50s, IBM was overshadowed by a company now long forgotten: Remington Rand, later known as Sperry Rand.
Its earliest computer, the UNIVAC, first shipped in 1952 and was designed from the outset for business and administrative use. It did well, but success was relative back in the '50s. By the time the UNIVAC was replaced by the UNIVAC II in 1958, a grand total of 46 devices had been sold.
Given that such machines cost between $1.25 million and $1.5 million (around $10 million today), this gloomy prophecy wasn't too surprising. We bet he thought differently five years later, though.
"Transmission of documents via telephone wires is possible in principle, but the apparatus required is so expensive that it will never become a practical proposition."
Dennis Gabor, 1962
Dennis Gabor wasn't your average scientist – he was a Nobel Prize winner. That award was for his invention of holography, but he also applied his considerable talents to the theory of data communication. So he really ought to have known what he was talking about, but it turned out he didn't – at least not on this particular subject.
It wasn't long before his error was exposed. Later that same year, AT&T launched the Bell 103, which was the first commercially successful modem. It was now possible to transmit data at 300 bits per second across an ordinary telephone line. In fairness to Gabor, this technology was still too slow and too expensive to be used for anything other than mainframe communication.
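To put 300 bits per second in perspective, here's a rough calculation. We've assumed the common 10-bit asynchronous framing (one start bit, eight data bits, one stop bit) – an illustrative convention, not a detail from the text:

```python
# At 300 bits per second with 10 bits per transmitted character,
# the line carries 30 characters every second.
bits_per_second = 300
bits_per_char = 10  # start + 8 data + stop (our assumption, see above)
chars_per_second = bits_per_second // bits_per_char
print(chars_per_second, "chars/sec")  # 30

# A full page of text is roughly 2,000 characters, so sending
# a single page takes over a minute.
page_chars = 2000
print(round(page_chars / chars_per_second), "seconds per page")  # 67
```

At 30 characters per second, a modest report would tie up the line for a good part of an hour – slow indeed, but a long way from "never a practical proposition".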
It wasn't until the early '80s that the proliferation of bulletin boards heralded the era of low-cost data communication that was available to Joe Public. Just a year after making this spectacularly inaccurate prediction, Gabor had a change of heart on the subject of forecasting the future.
In his 1963 book, Inventing the Future, he wisely stated that "the future cannot be predicted, but futures can be invented". This is surely a fitting place to conclude our investigation of computing's most unreliable and inaccurate prophecies.