How does the ultimate PC of 10 years ago compare to the best today?

Yes, you may have a brand new PC with a Titan on graphics duty, gigabytes of RAM sitting spare, and both a hard drive with near-unlimited storage and a big SSD – one that makes everything fly. Could it be, though, that a seemingly humble machine from 2005 could keep up… or even, just maybe, out-power it?

Of course not – what a ridiculous idea! But how far have we come in just 10 years? This was the year both the Xbox 360 and PlayStation 3 were unveiled, and the former actually came out. It wasn't a great gaming year for the PC, though we did get a few notable releases like Psychonauts and Fahrenheit/Indigo Prophecy – and Carol Vorderman's Sudoku – along with the almost instantly forgotten damp squib that was Quake 4.

If you wanted a game to show your rig at its best, chances are your game of choice was either Far Cry, to see how well it could render a gorgeous tropical island, or Doom 3, where systems competed to render the blandest of horror.

The two Pentium 4 Extreme Editions differed in their ability to handle 64-bit code. At the time, this wasn't much of an issue for most home users.

Engine of the beast

Let's start our retro system with the processor. 2005 was the year dual-core processors arrived on the desktop, with AMD's Socket 939 and Intel's Socket 775 taking point. AMD's Athlon 64 was popular with gamers at the time, but for simplicity's sake, we'll compare one of Intel's higher-spec CPUs with today. The Pentium 4 Extreme Edition, running on a Prescott core, was a 64-bit CPU – though 32-bit was still the home standard – running at 3.73GHz with a 2MB cache.

Back to today, and let's look at the recommended specs for one of the year's most demanding games, The Witcher 3 – an Intel Core i7-3770. Perhaps surprisingly, this is only a quad-core chip, running at 3.4GHz with an 8MB cache – a big improvement to be sure, but not one that necessarily feels like a 10-year upgrade.

In the nineties, for instance, we had the far more impressive jump from the Intel 386 through the 486, with graphics going from simple 2D images to the likes of Doom and then full 3D experiences. Even if we raise our level to CPUs like the eight-core Core i7-5960X and six-core i7-5820K – more cores than most people are using – there isn't the same raw feel of a generational leap.

Instead, while of course every task has its own requirements, in non-specialist environments processors and architectures have lately been more noted for other factors, like low power usage. Home applications at least haven't really benefited from big generational jumps in years, with heavy-lifting tasks increasingly a job for graphics cards.

Graphics cards were so much simpler then. And smaller, too.

Graphic detail

That seems like a good next stop. Your card of choice in 2005 was likely the GeForce 7800 GTX – 512MB of throbbing graphical muscle, compared with the 4GB of Nvidia's current top-end (excluding the Titan X), the GTX 980. Needless to say, that card doesn't so much crush its decade-old predecessor as atomise it. To pick just one stat, the 7800 GTX had a memory bandwidth of 54.4GB/sec, while the GTX 980 boasts 224GB/sec.
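As a quick sanity check on those numbers, peak memory bandwidth falls straight out of two published specs: effective memory clock and bus width. Both cards use a 256-bit bus, so the gulf comes almost entirely from faster memory. The snippet below is our own back-of-envelope sketch – nothing from Nvidia beyond those two input figures.

```cuda
// Peak memory bandwidth (GB/sec) = effective memory clock (GHz, i.e. billions
// of transfers/sec) x bus width (bits) / 8 bits-per-byte.
// A back-of-envelope check of the published figures, not a benchmark.
#include <stdio.h>

static double peak_bandwidth_gbs(double effective_clock_ghz, int bus_width_bits) {
    return effective_clock_ghz * bus_width_bits / 8.0;
}

int main(void) {
    // GeForce 7800 GTX 512: 1.7GHz effective GDDR3 on a 256-bit bus
    printf("7800 GTX: %.1f GB/sec\n", peak_bandwidth_gbs(1.7, 256)); // 54.4
    // GeForce GTX 980: 7.0GHz effective GDDR5 on a 256-bit bus
    printf("GTX 980:  %.1f GB/sec\n", peak_bandwidth_gbs(7.0, 256)); // 224.0
    return 0;
}
```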

What really makes the difference, though, is how much more modern cards do. In 2005, for at least a while, there was talk of graphics cards being joined by another heavy-hitter: the 'physics' card. At the time, even basic cloth simulation remained a showpiece technology (though given that Tomb Raider's next-gen console release is still making a big deal out of one of its characters having something approaching realistic hair, we probably shouldn't tut too much at that), to say nothing of throwing around debris and fancy effects.

Beauty, circa 2004. Far Cry's descendants are still benchmarks for visual loveliness.

Cue the Ageia PhysX card, which was seen around this time as destined to be either the next 3dfx Voodoo – the PC's most successful pioneering GPU – or a complete bust. In the end, it found a middle ground, with Nvidia buying the company and integrating the physics support into the 3D cards everyone now needed. It's a little old now, but this demo of Arkham City shows the kind of difference this can make.

Updated GPUs have also added many new strings to their bows, including better shaders in games, and Nvidia's CUDA (Compute Unified Device Architecture), which lets its GPUs take some of the weight off the CPU even when they're not throwing around 3D graphics – rendering video footage, for instance, or crunching numbers for cryptography.
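To give a flavour of what that offloading looks like in practice, here's a minimal sketch of the CUDA model – our own toy illustration, not code from Nvidia or any game. The CPU launches a tiny kernel, and the GPU spreads the work (here, just adding two arrays) across thousands of threads.

```cuda
// A toy illustration of CUDA offloading: the CPU hands a data-parallel job
// to the GPU instead of looping through the elements itself.
#include <cuda_runtime.h>
#include <stdio.h>

__global__ void add(const float *a, const float *b, float *out, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;  // one thread per element
    if (i < n) out[i] = a[i] + b[i];
}

int main(void) {
    const int n = 1 << 20;                // a million elements
    size_t bytes = n * sizeof(float);
    float *a, *b, *out;
    // Unified memory keeps the sketch short; real code often manages
    // host-to-device copies explicitly.
    cudaMallocManaged(&a, bytes);
    cudaMallocManaged(&b, bytes);
    cudaMallocManaged(&out, bytes);
    for (int i = 0; i < n; i++) { a[i] = 1.0f; b[i] = 2.0f; }

    add<<<(n + 255) / 256, 256>>>(a, b, out, n);  // 4,096 blocks of 256 threads
    cudaDeviceSynchronize();                      // wait for the GPU to finish

    printf("out[0] = %f\n", out[0]);  // 3.0
    cudaFree(a); cudaFree(b); cudaFree(out);
    return 0;
}
```

The same pattern – hand the data over, launch a kernel, collect the result – is what sits behind the video-rendering and cryptography uses mentioned above.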
