AMD and Nvidia leaks show we are drunk on power, and the hangover is going to be brutal

Graphics cards and processors acting as smoke stacks polluting the air
(Image credit: Shutterstock/Future)

A number of news items over the past few weeks have given us more insight into the soon-to-be-announced next-gen graphics cards and processors, and if what we've heard is true, it looks like we've decided that energy efficiency and conservation are for suckers and newbs.

First, there have long been rumors that Nvidia's next-gen Lovelace graphics cards will be energy hogs, and earlier this week the reliable Twitter leaker Kopite7kimi posted supposed specs for a high-end Nvidia RTX 4000-series card, possibly a Titan-class model, that could have upwards of 800W of power draw.

Now, we're hearing from Wccftech that the soon-to-be-announced AMD Ryzen 7000-series desktop processors appear to be throwing off any pretense of efficiency as well, with a reported 170W TDP for the top-tier Ryzen 9 chip.

Pair those two components together and nothing else, and you'd already have nearly a whole kilowatt of power being drawn by the processor and graphics card alone: 800W plus 170W is 970W, which means everything else in the system will absolutely push it over the 1,000W line.
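
To put that in perspective, here's a quick back-of-the-envelope power budget in Python. The GPU and CPU figures come from the leaks above; every other number is a placeholder assumption for illustration, not a spec.

```python
# Rough system power budget built around the leaked figures.
# Only the GPU and CPU numbers come from the leaks; the rest
# are illustrative assumptions.
component_draw_watts = {
    "gpu_rtx_4000_titan_class": 800,  # leaked figure (Kopite7kimi)
    "cpu_ryzen_9_tdp": 170,           # reported TDP (Wccftech)
    "motherboard_and_ram": 60,        # assumption
    "storage_and_fans": 30,           # assumption
}

total_watts = sum(component_draw_watts.values())
print(f"Estimated system draw: {total_watts}W")  # 1060W

# PSUs run most efficiently at roughly 50-60% load, so a common
# rule of thumb is to size the supply well above peak draw.
print(f"Rule-of-thumb PSU sizing: ~{total_watts / 0.6:.0f}W")  # ~1767W
```

Even with conservative placeholders for everything besides the leaked parts, the total clears 1,000W before you even think about PSU headroom.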

This would almost certainly be the best gaming PC ever built, but is it even worth it at this point?

Do we really need this much power?

Plenty of the best graphics cards are energy sinks, like the Nvidia RTX 3090 Ti, which has a rated TGP of 450W. It is unquestionably powerful, and it can make the best PC games look amazing. I've had the privilege of playing these games on all of this high-end hardware, and I can honestly say that the 4K eye candy you get from an RTX 3090 Ti is very real, but an RTX 3070 or even an RTX 3060 Ti looks more than sweet enough for the vast majority of people.

A 170W TDP for, let's say, an AMD Ryzen 9 7950X would definitely make for a very powerful processor, but one whose power is absolutely wasted on the consumer market. This kind of chip would be a multitasking champ, no doubt, but it's becoming the processor equivalent of juggling a half-dozen knives while riding a circus bear in a tutu and balancing a bottle on your nose: an impressive feat, but ultimately just a spectacle. Almost nobody's everyday workload genuinely calls for this kind of performance.

Meanwhile, Intel seemed to be going in the right direction before Alder Lake, with an emphasis on improving the efficiency of its processors, but the 12th-gen chips appear to have reversed much of that good work in order to reclaim the company's previous best-in-class performance.

Accepting good enough

There is a notion that a new generation of hardware only counts as a success if it delivers a 1.25x or 1.5x performance increase, and that you need to pull off that kind of leap every one to two years. Some are talking about a 2x performance increase for Nvidia Lovelace, and who knows what Intel Raptor Lake will bring.

At some point, we're amassing all this computing power at the consumer level simply because we can. Then we just go and use it to stream Netflix.

This is not to say that performance increases aren't worth pursuing, but we should aim to match performance to our needs, not introduce this kind of performance and then look for new ways to use it. At the very least, that can't be the default assumption every time.

There's nothing wrong with Nvidia coming out and saying that the RTX 4090 isn't any more powerful than the RTX 3090, but that it uses half the energy, or that it costs a fifth of the price. Value and efficiency seem to have fallen completely by the wayside, and that isn't just a mistake; it's increasingly unethical.

Performance at all costs actually imposes real, concrete costs

A color-enhanced satellite view of the northwest portion of the Dixie Fire on August 17, 2021 (Image credit: Pierre Markuse / Flickr)

There are two major issues with performance being the only metric that seems to matter anymore.

First, energy isn't free, either environmentally or economically. As it stands, rising carbon emissions are projected to make large, heavily populated swaths of the planet partially if not entirely uninhabitable, at an accelerating pace. Flagrantly wasting scarce energy means producing ever more carbon emissions just to keep up with our actual needs, and the trade-off simply isn't worth it.

The consequences are assumed to be far enough in the future that most people believe it's a problem we can solve tomorrow. That simply isn't true, as the recent heatwave in Europe and the continuing wildfires in the Western United States make plainly obvious, not to mention one of the worst droughts in recent history in parts of the Global South, which gets far less attention, if any, than middle- and upper-class families fleeing their suburban homes in California.

What will it take?

If that can't convince us to be more rational about what we consider "progress", let's point out a simple economic reality: reaching this level of performance is just going to make these products even more expensive, pricing even more people out as families struggle with inflation and rising energy costs.

The current generation of graphics cards is already out of reach for most people because it is simply too expensive, and this trend looks set to continue. That would make technology essential to the modern economy something only the well-off can afford, whether that means affluent gamers buying wildly overpowered showpieces or rich countries making research investments in increasingly expensive hardware while universities in poorer countries get pushed aside.

All of this is a recipe for widening social divides at a time when a changing climate will put everyone under more pressure than ever for everything from vaccines to drinking water.

I love computers and I am a lifelong PC gamer, so I get it, I really do. But I can also tell you that the performance of the RTX 3090 Ti, as impressive as it is, runs into seriously diminishing returns. At some point, it's OK to say, "You know, 60 to 70 fps at 1440p is good enough," because honestly, it is.

John Loeffler
Components Editor
