Ahead of the launch of a new graphics card like the Nvidia GeForce RTX 3080, especially when it's as hotly anticipated as Team Green's newest, some preposterous rumors are inevitably going to spill out. In fact, that's half of the fun of following the PC hardware scene in the first place – however, now's not the time to get people alarmed about having to spend extra money for no reason.
Late last week we spotted a rumor about Nvidia submitting a new 12-pin PCIe cable to PCI-SIG for approval, which has led to some outlets, including TechRadar, jumping to the conclusion that the Nvidia Ampere GeForce lineup – which is likely hitting the market in just a few months – will require this new connector. That's probably not going to happen.
You probably don't have to worry about adding in the expense of an all-new power supply to the already substantial cost of an expensive graphics card – which is good when so many people are negatively affected by a global pandemic.
So there's a huge power supply shortage right now
If you go on Newegg right now and look for PC power supplies, particularly in the 850-1000W range that buyers of flagship-level graphics cards gravitate toward, you'll notice a massive shortage. Many models are simply out of stock, and those that aren't are seeing some pretty steep price increases.
Take the EVGA 850 G5, for instance – which just so happens to be the power supply we use in our personal gaming machine. It has technically been succeeded by the EVGA SuperNOVA GA, but the G5 is now only available through a third-party seller on Newegg with a paltry 29% positive rating, and you'll have to pay $195 (about £150, AU$280) for it, compared to the $149 (about £120, AU$210) list price of its successor. And, in case you haven't guessed, the SuperNOVA GA is definitely out of stock on EVGA's own website. Oh, and it's out of stock until July 31 on Amazon, too.
And it's more than just a cursory glance at power supplies on digital stores, too. There are posts on Tom's Hardware's forums, Reddit, EVGA's forums and elsewhere from customers who either can't procure a new power supply or can't even get a faulty one replaced by the manufacturer.
Your guess is as good as ours as to why this is happening – there are a lot of supply issues everywhere right now – but it doesn't look like the power supply market is ready to tackle the kind of demand that forcing everyone to upgrade would cause.
What happened to longevity?
In the PC hardware enthusiast scene, there's a lot of excitement about upgrading to the latest and best processors and graphics cards when they come out – we know we here at TechRadar are definitely in that camp – but you don't really see a lot of excitement about upgrading to new power supplies. That's because, well, they last a long time.
Searching around the internet, the general advice seems to be to replace your power supply "when it dies". For people who only built their computers a couple of years ago, having to upgrade their power supply to get the latest and greatest GPU would definitely be a tough pill to swallow – not to mention pretty awful for the environment, as this report by the World Economic Forum on e-waste makes clear.
We can definitely attest to that PSU upgrade cycle. Before the PC we built when Nvidia Turing launched, the computer this editor used for personal gaming and work ran the same power supply from about 2009, with no hint of it dying when it was eventually replaced. That Thermaltake power supply lasted a good nine years, through multiple graphics card upgrades – from the AMD Radeon HD 5870 to the Nvidia GeForce GTX 970 and everything in between.
But, still, all of this advice is taken from forums or a TechRadar editor's personal experience. We haven't put any of these claims through any amount of rigorous testing, but it should give you an idea of the general vibe when it comes to upgrading power supplies, and how long they're typically expected to last.
You probably have nothing to worry about
The Tom's Hardware report that we then reported on suggests that a new 12-pin power connector has been submitted by Nvidia to the PCI-SIG standards body for approval. However, you shouldn't take that as confirmation that this connector is definitely going to be a thing, or even that it's actually been submitted.
However, let's just assume that Nvidia did submit this connector for approval. According to Tom's Hardware, the rumor centers on upcoming Ampere graphics cards, which would theoretically include the Nvidia GeForce RTX 3080. The problem, however, is that Ampere is also the graphics architecture behind Nvidia's next-generation GPUs for data scientists and AI – not just GeForce.
We're talking about GPUs far more powerful than anything we consumers will see in even the best gaming PCs, and they could theoretically use the added power that this new 12-pin PCIe connector would provide. The way we look at it, that's far more likely than Nvidia pushing a new power standard on mainstream consumers and impacting the sales of its next-generation gaming cards, especially when AMD RDNA 2 likely won't do the same thing.
There have been rumors that the next-generation Nvidia flagship will consume up to 350W, and you can take that for what it's worth. It's unlikely that this will actually happen – that's more power than the Titan RTX consumes. Even the AMD Radeon R9 Fury X – which AMD had to ship with a liquid cooler – had a TDP of 275W.
Even if we did get a 350W RTX 3080 Ti – which would be a marketing nightmare for Nvidia – it wouldn't necessitate a connector allowing for 648W of PCIe power. As Tom's Hardware points out in its story, the current configuration of two 8-pin PCIe connectors allows for 375W of power.
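The arithmetic behind those figures is simple enough to check. A rough sketch, assuming the commonly cited PCIe limits of 75W delivered through the x16 slot and 150W per 8-pin auxiliary connector (the helper function name is ours, purely for illustration):

```python
# Commonly cited PCIe power delivery limits.
SLOT_W = 75        # power available through the x16 slot itself
EIGHT_PIN_W = 150  # power per 8-pin PCIe auxiliary connector

def board_power(num_eight_pin: int) -> int:
    """Total board power budget: the slot plus N 8-pin connectors."""
    return SLOT_W + num_eight_pin * EIGHT_PIN_W

print(board_power(2))  # 375 – the two 8-pin figure Tom's Hardware cites
```

Two 8-pin connectors plus the slot already cover a rumored 350W card with headroom to spare, which is why a 648W connector looks like overkill for GeForce.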
So, if these new graphics cards don't actually require the added power, launching these graphics cards with the new power connector – even if it is just the Founders Edition cards – seems like a good way to limit how many will actually sell.
Another possibility, however, is that this new power connector will be used on aftermarket card designs. Many of these tend to add extra power connectors anyway in order to facilitate higher clock speeds and the resulting bump in power consumption.
At the end of the day, no one will know what the Nvidia GeForce RTX 3080 is going to look like, how many power connectors it will have or even what its TDP will be until Nvidia spills the beans. And, while there will inevitably be a ton of rumors to fill the gap until launch day, some carry more merit than others – and the "12-pin power connector on the RTX 3080 Ti" is one of the less likely ones to actually be true.