The death of Moore’s law, and why it’s actually a good thing for games
As tech innovation slows, creativity grows
Gordon Moore looked into his proverbial crystal ball one day in 1965 and authored one of the 20th century’s most defining principles. Observing the pace of innovation in computing for the magazine Electronics, he predicted that the number of transistors on an integrated circuit would double every year for the next decade, up until 1975.
What he didn’t know at the time was that Moore’s law would hold true for a good deal longer than that. In the intervening decades, the growth in transistor counts per circuit remained remarkably steady, utterly transforming the modern world in the process. This technological revolution has been responsible for about a third of all productivity growth in the United States since 1974, according to MIT Technology Review. Moore supposed that it might lead to wonders like computers in our own homes, and here we are in 2022 trading NFTs on our smartphones. Dream bigger, Gordon. Jeez.
But nothing lasts forever. Not The Undertaker’s WrestleMania streak, not your friend’s conversation about their holiday, and, most pertinently, not Moore’s law. There wasn’t a particular moment when the graph suddenly flatlined after decades of upward trajectory. Instead, it’s been a slow tapering off, an ever-greater struggle to reach the next, ever-smaller manufacturing process.
Age against the machine
Intel’s 10nm manufacturing process was severely delayed, arriving in 2019 with poor die yields and leading to a four-year generational gap between it and Intel’s previous 14nm chips. The company’s designers agree that their rate of progress is no longer consistent with Moore’s law. Nvidia has recently said the same.
What’s more, it’s taking exponentially more effort to reach the next rung on the ladder. The amount of research and development required to bring about smaller, denser microchips has risen by a factor of 18 since 1971, say economists at Stanford and MIT.
Most computing experts are in agreement by now: Moore’s law is dead. And while that spells trouble for a western world that’s been largely driven by that innovation, with entire economies propped up on it, where we are now is a safe space. We’re only here to talk about games. Silly old games.
In this incredibly specific microcosm of society, the fall of Moore’s law is actually a good thing. That seems counter-intuitive at first, given that, as gamers, our initial reaction when we hear about a more powerful CPU or GPU is to imagine an outlandish degree of new graphical fidelity.
And particularly through the late ‘90s and noughties, the CPU arms race seemed to be generating massive steps forward for gaming. Not just better graphics, although that was certainly happening, but entirely new genres, made possible by complex AI, newfound environmental scale, and procedural generation.
But, since about 2008, the line’s been curving off. We didn’t notice an immediate effect in gaming. It’s happened so slowly that only now, more than a decade on, do we accept that games from 2012 don’t look that different to games from 2022 – particularly when you compare that gap with the leap from games of 2002 to games of 2012.
Having greater technological bandwidth at your disposal to make games is certainly a positive. But having a clear ceiling, albeit a little lower than you would have liked, is better.
Because creativity is all about constraints. In the writing world, you’d probably be looking at a blank page right now, just as I would have been the night before, if this commission had asked me to write about whatever I liked. We all need constraints, because limitless choice is paralysis. Neil Postman called it the information-action ratio: the more we know, the less we do.
In game development, the major constraints are budgetary – and that also covers time, since every day you’re still working on a game is another day you have to pay everyone – and technological.
We’ve become increasingly aware in recent years of the stresses and workloads involved in making games. But removing those constraints wouldn’t alleviate that; tightening them might.
Window of opportunity
Let’s take a classic example of development hell: Duke Nukem Forever. The project kicked off in 1997 and didn’t see the light of day until 2011, and at least part of the reason for that famously protracted dev time was that it kept missing its tech window – the period during which the visuals still looked competitively shiny, the mechanics were in line with what else was out there in the market, and, culturally, people were still into the idea of picking up a piece of feces. Unfortunately, the piece that eventually arrived had been finished off by Gearbox Software and went for full price.
What if there had been no new game engine? What if the Build Engine had been the be-all and end-all of technical wonderment? Duke Nukem Forever would probably have been out before ‘99. And it would, inarguably, have been a much better title.
Because its developer would have had to work within the lines: find new ways to make sprites and corridors interesting, just a couple of years after the ideas they displayed in Duke Nukem 3D. Without new 3D-rendered corridors and polygonal enemies to hide old ideas behind, they’d have had to innovate on a creative level.
That’s exactly what we see in the current wave of retro shooters like Dusk, Ion Fury and Amid Evil. They download on Steam before you’ve had a chance to click ‘OK’, such is their tiny file size. Their visuals wouldn’t make the cover of a games mag in 2001. But they’re fantastic games, all of them, because of their level and enemy design, weapon feedback, and sense of atmosphere.
To broaden the idea out a bit, let’s look at consoles. The PlayStation 3 arrived with MotorStorm, Resistance: Fall of Man, Need for Speed: Carbon, Madden NFL 07 et al. That same console’s generation wound down in 2013 with The Last of Us, BioShock Infinite, and GTA 5.
The hardware ceiling was never raised. But developers were able to find efficiencies, smarter solutions to fundamental problems, and harness those same resources to produce games that looked easily a generation removed from the platform’s launch titles.
The less the goalposts move, the easier it becomes to score. Look through ‘Best games of 2021’ lists and you’ll find some gorgeous triple-A offerings like Forza Horizon 5 (which is nearly visually identical to its predecessor, incidentally), but also strange new concepts like Loop Hero, Unpacking, and Wildermyth. Their minimum specs are 2007-grade, but they’re full of new experiences and fresh angles.
So, while our global economy might be doomed by the decline of Moore’s law, itself the engine room for so much of our 20th and 21st century prosperity, we can at least, as we gather around the burning trashcans amidst the rubble of what were once cities, look forward to some really smart video games coming out.
Ad creative by day, wandering mystic of 90s gaming folklore by moonlight, freelance contributor Phil started writing about games during the late Byzantine Empire era. Since then he’s picked up bylines for The Guardian, Rolling Stone, IGN, USA Today, Eurogamer, PC Gamer, VG247, Edge, Gazzetta dello Sport, Computerbild, Rock Paper Shotgun, Official PlayStation Magazine, Official Xbox Magazine, CVG, Games Master, TrustedReviews, Green Man Gaming, and a few others but he doesn’t want to bore you with too many. Won a GMA once.