AMD and Nvidia leaks show we are drunk on power, and the hangover is going to be brutal

Graphics cards and processors acting as smoke stacks polluting the air (Image credit: Shutterstock/Future)

A number of recent news items over the past few weeks have given us more insight into the soon-to-be-announced next-gen graphics cards and processors, and if what we've heard is true, it looks like we've decided that energy efficiency and conservation are for suckers and newbs.

First, there have long been rumors that the next-gen Nvidia Lovelace graphics cards are going to be energy hogs, but earlier this week reliable Twitter leaker Kopite7kimi posted some supposed specs for a high-end Nvidia RTX 4000-series card, possibly a Titan-class card, that could have upwards of 800W of power draw.

Now, we're hearing news from Wccftech that the soon-to-be-announced AMD Ryzen 7000-series desktop processors appear to be throwing off any pretense of efficiency as well, with a reported 170W TDP for the top-tier Ryzen 9 chip.

Pair those two components together and nothing else, and you'd have nearly a whole kilowatt of power (roughly 970W) being sucked up by the processor and graphics card alone, meaning that everything else in the build will absolutely push the system over the 1,000W line.
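To put some rough numbers on that, here's a minimal back-of-the-envelope sketch in Python using the rumored figures above; the 100W allowance for the rest of the system and the $0.15/kWh electricity rate are purely illustrative assumptions, not leaked specs.

```python
# Back-of-the-envelope power math using the rumored figures above.
# Everything except the GPU and CPU numbers is an illustrative assumption.

gpu_watts = 800   # rumored draw for a high-end Nvidia Lovelace / Titan-class card
cpu_watts = 170   # reported TDP for the top-tier Ryzen 9 7000-series chip

# Assumed draw for the rest of a high-end build (motherboard, RAM,
# storage, fans, cooling) - a hypothetical round number.
rest_of_system_watts = 100

total_watts = gpu_watts + cpu_watts + rest_of_system_watts
print(f"CPU + GPU alone: {gpu_watts + cpu_watts} W")   # 970 W
print(f"Whole-system estimate: {total_watts} W")        # 1070 W

# Rough running cost: 4 hours of gaming a day at an assumed $0.15 per kWh.
hours_per_day = 4
rate_per_kwh = 0.15
annual_cost = total_watts / 1000 * hours_per_day * 365 * rate_per_kwh
print(f"Estimated annual electricity cost: ${annual_cost:.0f}")   # ~$234
```

Actual draw would vary with load and PSU efficiency, but the point stands: on these rumored specs, the CPU and GPU alone get you most of the way to a kilowatt before anything else is plugged in.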

Without question, this would be the best gaming PC ever built, but is it even worth it at this point?

Do we really need this much power?

Plenty of the best graphics cards are energy sinks, like the Nvidia RTX 3090 Ti, which has a rated TGP of 450W. It is unquestionably powerful, and it can make the best PC games look amazing. I've had the privilege of playing these games on all of this high-end hardware, and I can honestly say that the 4K eye candy you get from an RTX 3090 Ti is very real, but an RTX 3070 or even an RTX 3060 Ti looks more than sweet enough for the vast majority of people.

As for 170W for, let's say, an AMD Ryzen 9 7950X, this would definitely make for a very powerful processor, but one whose power is absolutely wasted on the consumer market. A chip like this would be a multitasking champ, no doubt, but it's getting to be the processor equivalent of juggling a half-dozen knives while riding a circus bear in a tutu and balancing a bottle on your nose: an impressive feat, but ultimately just a spectacle. Hardly anyone in everyday life ever needs to do that many things at once, or needs this kind of performance to do them.

Meanwhile, Intel seemed to be going in the right direction before Alder Lake, with an emphasis on improving the efficiency of its processors, but the 12th-gen chips appear to have reversed a lot of that good work in order to reclaim the company's previous best-in-class performance.

Accepting good enough

There is a notion that only a 1.25x or 1.5x performance increase can be considered a success, and that you need to pull off that kind of leap every one to two years. Some are talking about a 2x performance increase for Nvidia Lovelace, and who knows what Intel Raptor Lake will bring.

At some point, we're amassing all this computing power at the consumer level simply for the sake of amassing it, because we can. Then we just go and use it to stream Netflix.

This is not to say that performance increases aren't worth pursuing, but we should aim to match performance to our needs, not introduce this kind of performance and then look for new ways to use it – at least, that can't be the default assumption every time.

There's nothing wrong with Nvidia coming out and saying that the RTX 4090 isn't any more powerful than the RTX 3090, but that it uses half the energy, or that it costs a fifth of the price. Value and efficiency seem to have fallen completely by the wayside, and that isn't just a mistake, it's increasingly unethical.

Performance at all costs actually imposes real, concrete costs

A color-enhanced satellite view of the northwest portion of the Dixie Fire on August 17, 2021 (Image credit: Pierre Markuse / Flickr)

There are two major issues with performance being the only metric that seems to matter anymore.

First, energy isn't free; not environmentally, and not economically. As it stands, rising carbon emissions are projected to make large, heavily populated swaths of the planet partially if not entirely uninhabitable at an accelerating pace. Our flagrant misuse of scarce energy resources means producing even more carbon emissions just to keep up with our actual needs, and the trade-off simply isn't worth it.

The consequences are assumed to be far enough in the future that most people believe it's a problem we can solve tomorrow. That simply isn't true, as the recent heatwave in Europe and the continuing wildfires in the Western United States make plainly obvious, not to mention one of the worst droughts in recent history in parts of the Global South, which gets far less attention, if any, than middle- and upper-class families fleeing their suburban homes in California.

What will it take?

If that can't convince us to be more rational about what we consider "progress," let's point out a simple economic reality: getting to this level of performance is going to make these products even more expensive, pricing even more people out as families struggle with inflation and rising energy costs.

The current generation of graphics cards is already out of reach for most people because the cards are simply too expensive. This trend looks set to continue, making technology that is essential to the modern economy something only the well-off can afford, whether that means well-off families and affluent gamers buying wildly overpowered showpieces, or rich countries that can afford to invest in these increasingly expensive technologies while universities in poorer countries get pushed aside.

All of this is a recipe for widening social divides at a time when a changing climate will put everyone under more pressure than ever for everything from vaccines to drinking water.

I love computers and I am a lifelong PC gamer, so I get it, I really do. But I can also tell you that the performance of the RTX 3090 Ti, as impressive as it is, has seriously diminishing returns after a while. At some point, it's OK to say, "you know, 60 to 70 fps at 1440p is good enough," because honestly, it is.

John Loeffler
Components Editor

John (He/Him) is the Components Editor here at TechRadar and he is also a programmer, gamer, activist, and Brooklyn College alum currently living in Brooklyn, NY.

Named by the CTA as a CES 2020 Media Trailblazer for his science and technology reporting, John specializes in all areas of computer science, including industry news, hardware reviews, PC gaming, as well as general science writing and the social impact of the tech industry.

You can find him online on Bluesky @johnloeffler.bsky.social
