
The Curious Case of the 12-pin Power Connector: It's Real and Coming with NVIDIA Ampere GPUs

Pickles von Brine

Nvidia must come up with a way for upgradability not to be an issue for older systems with older PSUs. Probably something like a connector adapter that comes with the GPU at purchase, or something, I don't know.

😕


On 7/16/2020 at 6:32 PM, themoose5 said:

I think the quote below is really the motivating driver behind the change. It seems like Nvidia is anticipating a much higher power draw/power quality in the future, which is why they're introducing this new connector.

Still makes no sense if it's going to have to work with people using an adapter...

Don't ask to ask, just ask... please 🤨

sudo chmod -R 000 /*


So glad I bought CableMod cables, just for them to be made obsolete if I want a powerful GPU.


6 hours ago, For Science! said:

@jonnyGURU - Any opinion/insight on this matter?

You really think that with only two months until the card launches they're going to pop up and say, "BTW, you need a PSU that doesn't exist!"

 

No.

 

If it's real, it's probably for an adapter cable.  My guess is they're trying to save board space.  We already saw that the card and the cooler are quite huge.  And, according to these drawings, this connector is quite small.  So if it's real, it's not a "planned obsolescence conspiracy" like so many are making it out to be. 


16 hours ago, Sauron said:

Still makes no sense if it's going to have to work with people using an adapter...

This assumes that this will take the existing dual six pin and just have a short adapter into the card. They could go the route of including a full length cable that would take the spot of the dual 6pin PCIe power connector in the power supply itself.

 

Either way, my point is that this is probably in preparation for cards that COULD pull that much power in the future, not necessarily for Ampere cards. They're just introducing it now to warm the market up to the new connector before it becomes a requirement for new cards in the future.


There's still a lot of unconfirmed stuff, especially about whether AIB models (the ones you and I are more likely to be getting) will adopt the new connector or whether this is just an FE thing.

 

It's also worth noting that, as Jon said, no consumer-tier PSU (at least) currently exists with support for the new connector, so there will likely be some form of adapter cable to help with the transition. The whole "need to swap PSU" FUD is a bit overblown, especially when you can't really buy a PSU that supports the 12-pin, though I can see it being an issue with totally non-modular PSUs.

The Workhorse (AMD-powered custom desktop)

CPU: AMD Ryzen 7 3700X | GPU: MSI X Trio GeForce RTX 2070S | RAM: XPG Spectrix D60G 32GB DDR4-3200 | Storage: 512GB XPG SX8200P + 2TB 7200RPM Seagate Barracuda Compute | OS: Microsoft Windows 10 Pro

 

The Portable Workstation (Apple MacBook Pro 16" 2021)

SoC: Apple M1 Max (8+2 core CPU w/ 32-core GPU) | RAM: 32GB unified LPDDR5 | Storage: 1TB PCIe Gen4 SSD | OS: macOS Monterey

 

The Communicator (Apple iPhone 13 Pro)

SoC: Apple A15 Bionic | RAM: 6GB LPDDR4X | Storage: 128GB internal w/ NVMe controller | Display: 6.1" 2532x1170 "Super Retina XDR" OLED with VRR at up to 120Hz | OS: iOS 15.1


3 hours ago, themoose5 said:

They could go the route of including a full length cable that would take the spot of the dual 6pin PCIe power connector in the power supply itself.

 

Non-modular PSUs are still a thing.


15 minutes ago, CarlBar said:

 

Non-modular PSUs are still a thing.

Fair point, I was mostly just spitballing ideas that would include support for a larger-gauge wire all the way through.

 

Honestly, if they're increasing the gauge of the wire and connectors to handle 600 watts to the GPU alone, the likelihood of someone running that on a non-modular power supply is, I would say, slim to none. In fact, with the current power supply offerings, you wouldn't even be able to find a non-modular PSU that could power a system with that card in it.

 

Using the same rule of thumb for power supply capacity as today: an RTX 2080 is recommended to have a 600-watt PSU, and the 2080 is rated at 225 watts, so the PSU capacity is roughly 2.6x the GPU's rating. Translating that ratio to the power rating of the new cable, a GPU rated for 600 watts would call for a PSU capacity of about 1,500 watts. I don't think there is a 1,500-watt non-modular power supply out there.

 

Meaning that using an adapter for a card that doesn't need that much power doesn't matter, because the card won't be hitting the higher capacities provided by the thicker-gauge wires and connectors. If you are hitting that threshold, it won't be on a non-modular PSU, and so the card could come with an included full-length cable that takes the spot of the dual 6-pin.
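To spell out that rule-of-thumb arithmetic, here is a minimal sketch using only the figures from the post above (the 600-watt recommendation and 225-watt rating for the 2080); this is not an official sizing formula:

# Rule-of-thumb PSU sizing, using the figures quoted above (illustrative only).
gpu_power_2080_w = 225        # RTX 2080 rated board power
recommended_psu_w = 600       # commonly recommended PSU capacity for that card

scaling_factor = recommended_psu_w / gpu_power_2080_w   # ~2.67x

hypothetical_gpu_w = 600      # a card drawing the 12-pin's rumored maximum
implied_psu_w = hypothetical_gpu_w * scaling_factor     # ~1600 W; the post rounds to ~1500 W

print(f"{scaling_factor:.2f}x -> ~{implied_psu_w:.0f} W PSU")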


53 minutes ago, themoose5 said:

Fair point, I was mostly just spitballing ideas that would include support for a larger-gauge wire all the way through.

 

Honestly, if they're increasing the gauge of the wire and connectors to handle 600 watts to the GPU alone, the likelihood of someone running that on a non-modular power supply is, I would say, slim to none. In fact, with the current power supply offerings, you wouldn't even be able to find a non-modular PSU that could power a system with that card in it.

 

Using the same rule of thumb for power supply capacity as today: an RTX 2080 is recommended to have a 600-watt PSU, and the 2080 is rated at 225 watts, so the PSU capacity is roughly 2.6x the GPU's rating. Translating that ratio to the power rating of the new cable, a GPU rated for 600 watts would call for a PSU capacity of about 1,500 watts. I don't think there is a 1,500-watt non-modular power supply out there.

 

Meaning that using an adapter for a card that doesn't need that much power doesn't matter, because the card won't be hitting the higher capacities provided by the thicker-gauge wires and connectors. If you are hitting that threshold, it won't be on a non-modular PSU, and so the card could come with an included full-length cable that takes the spot of the dual 6-pin.

The thing is, a 1500W PSU is basically the maximum you can use in North America, as circuits are generally 15A at 120V, and pushing toward the full 1800W is probably going to trip the breaker every time you turn on a PC with that kind of load. Some circuits in Europe can go as high as 5000W and don't have that problem.

 

So we do hit a wall with what is possible without having to unplug a dryer or stove.
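For reference, a rough sketch of that wall-circuit math (the 80% continuous-load figure is a common North American electrical-code guideline, and the European example is just one typical 16 A circuit, not the 5000 W figure mentioned above):

# Maximum wattage available from a household branch circuit.
def circuit_watts(volts, amps, continuous_derate=1.0):
    return volts * amps * continuous_derate

na_peak = circuit_watts(120, 15)             # 1800 W absolute ceiling on a 15 A / 120 V circuit
na_continuous = circuit_watts(120, 15, 0.8)  # 1440 W sustained, so ~1500 W PSUs are the practical top end
eu_typical = circuit_watts(230, 16)          # ~3680 W on a common European 16 A circuit
print(na_peak, na_continuous, eu_typical)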

 

My guess is this might be a 12VO "PCIe" connector. So it would not surprise me if the x80/x80 Ti/Titan models of the GPU require this on a 12VO PSU over 600 watts. It might have two PCIe modular connectors on one end and the 12-pin on the other, and the box would have separate PCIe 6-pins for cards that don't need it. At least that would make sense.

 

I don't see "adapters" for 6-to-12 or 2x6-to-12, as people will try to use them on smaller PSUs and probably kill the cheap ones when the GPU fires up; that's kinda what already happens with some GPU/PSU configurations. It's less "the PSU can't supply 75W to PCIe" and more "the GPU actually draws more than 75W, oops".

 


5 hours ago, themoose5 said:

This assumes that this will take the existing dual six pin and just have a short adapter into the card. They could go the route of including a full length cable that would take the spot of the dual 6pin PCIe power connector in the power supply itself.

But the female connector on the power supply will still have the same quality of contacts as before. Also, since 6-pin connectors are "dumb", as in they don't have sense pins, I wouldn't be surprised if you had to use two 8-pins for an adapter to work.



So, for everyone taking this as fact, there is one big thing to keep in mind: TPU is in no way a reliable source for rumors. They have about as much accuracy as WCCF. Take anything they confirm with a massive grain of salt.

 

Edit: If this was in any way true, we'd have heard about PSUs using this connector a LOOOONG time ago. There is no chance in hell a new power connector would be kept secret and left to a GPU manufacturer to reveal.


Assuming this connector is coming to next gen nvidia consumer GPUs, they will know it will be a pain point if they do not offer some way around it. I think the adapter cable is the most sensible solution. It can work like a sleeve extension so you can hide the connection elsewhere. To be safe for power, I think it would have to require 2x8pin. Lesser PSUs that can't power that would need replacing. I think most PSUs, at least the ones I've owned, are quite realistic in the number of PCIe power connectors they provide. Higher power = more connectors. If the PSU isn't up to it, the presence of the new connector isn't going to make a difference.

 

Now, 2x 8-pin would be overkill for a lower to mid-range GPU, say <200W. Might they put this only on the higher-end cards, and leave the mainstream on the traditional connectors? I still think a new power connector makes more sense in the server/workstation market.

Main system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, Corsair Vengeance Pro 3200 3x 16GB 2R, RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible


6 hours ago, porina said:

Assuming this connector is coming to next gen nvidia consumer GPUs, they will know it will be a pain point if they do not offer some way around it. I think the adapter cable is the most sensible solution. It can work like a sleeve extension so you can hide the connection elsewhere. To be safe for power, I think it would have to require 2x8pin. Lesser PSUs that can't power that would need replacing. I think most PSUs, at least the ones I've owned, are quite realistic in the number of PCIe power connectors they provide. Higher power = more connectors. If the PSU isn't up to it, the presence of the new connector isn't going to make a difference.

 

Now, 2x 8-pin would be overkill for a lower to mid-range GPU, say <200W. Might they put this only on the higher-end cards, and leave the mainstream on the traditional connectors? I still think a new power connector makes more sense in the server/workstation market.

8-pins only add two ground pins to the 6-pin. That's why they look like this:

[Image: 6-pin to 8-pin PCIe power adapter]

Then you end up with these kinds of adapters:

[Image: SATA to 8-pin PCIe power adapter]

Notice there's only two 12V rails and two grounds, for what should be 3 rails and 5 grounds.

[Image: Delock 85452 cable, a single 6-pin to two 6+2-pin PCIe connectors]

Modular PSUs tend to just do this anyway, so there are actually TWO connectors coming off one PSU-side connector. If you did it this way, that's 300W coming off just one 6-pin at the PSU.

 

So it's those additional ground pins that tell the GPU it can use twice as much power; there aren't additional voltage pins. These can still carry up to 150W per voltage pin with that gauge of wire, but what guarantees that it's deliverable? This is why it's a 12-pin connector: it's the same 6 voltage pins, only now in a single connector. The pins from the PSU are essentially unchanged.
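For context, a quick sketch of the spec-level power budgets per source (the slot, 6-pin, and 8-pin figures are the PCIe CEM spec limits; the 12-pin figure is only the rumored maximum from the OP, not a published spec):

# Power budgets in watts. The "12-pin" entry is the rumored maximum from the OP.
POWER_BUDGET_W = {
    "PCIe x16 slot": 75,
    "6-pin PCIe": 75,
    "8-pin PCIe": 150,
    "12-pin (rumored)": 600,
}

# Spec-level budget of a typical dual 8-pin card, even though the wire gauge
# could physically carry more:
dual_8pin_card_w = POWER_BUDGET_W["PCIe x16 slot"] + 2 * POWER_BUDGET_W["8-pin PCIe"]
print(dual_8pin_card_w)  # 375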

 

However, does the PSU support that?

[Image: Corsair AX1200i modular PSU cable pinout]

Some PSUs actually have different pinouts:

[Image: Seasonic FOCUS SGX modular PSU pinout]

 

The Corsair above, for example, is wired significantly differently on the PSU side than the Seasonic below.

 


7 minutes ago, Kisai said:

8-pins only add two ground pins to the 6-pin. That's why they look like this:

I'm not sure where you're going with that, but it wasn't where I was going.

 

I'm aware of the sense pins on the 8-pin PCIe. Regardless of the mechanism, the idea I had was that the 8-pin connector supported more power than the 6-pin connector. Thus, to ensure sufficient power is available on the new connector, any adapter cable would probably have to enforce having 2x 8-pin power on its input. As such, the PSU end of the original cable doesn't matter at all. If you have a non-modular PSU, it doesn't matter.



6 minutes ago, porina said:

I'm not sure where you're going with that, but it wasn't where I was going.

 

I'm aware of the sense pins on the 8-pin PCIe. Regardless of the mechanism, the idea I had was that the 8-pin connector supported more power than the 6-pin connector. Thus, to ensure sufficient power is available on the new connector, any adapter cable would probably have to enforce having 2x 8-pin power on its input. As such, the PSU end of the original cable doesn't matter at all. If you have a non-modular PSU, it doesn't matter.

You have no guarantee of what's on the PSU end. Do all the 12V rails connect to the same 12V circuit in the PSU, or are they all independent and capable of supplying >25W each? Like in the case of that one 6-pin to two 6+2 connectors, clearly the PSU must be capable of supplying 100W on each pin, or that connector would kill the PSU every time.


5 minutes ago, Kisai said:

You have no guarantee of what's on the PSU end. Do all the 12V rails connect to the same 12V circuit in the PSU, or are they all independent and capable of supplying >25W each? Like in the case of that one 6-pin to two 6+2 connectors, clearly the PSU must be capable of supplying 100W on each pin, or that connector would kill the PSU every time.

Today we already have power balancing possible between each connector, so they can come off different 12V rails. Presumably the new connector will have some similar capability, even if we don't know how granular it is. We also have to assume that if the PSU comes with 8 pin PCIe power connectors, they can deliver that.

 

The only minor concern with my hypothetical power cable adapter/extender is that if the new 12 pin connector really is rated to 600W, that is more than two 8 pins can deliver. Here I'm making an assumption that this, as an interim solution, would not be intended to be used with a 600W rated GPU. I really don't see consumer cards going up that much in power. If it takes off, native implementations can follow later. I still think this is something for the high end, and not consumer level.

 

It is also possible the actual usage spec for the new connector will be below its absolute maximum to allow some headroom. 
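To put rough numbers on that concern, a small sketch follows; the 25% headroom factor is purely a hypothetical illustration of "usage spec below absolute maximum", not anything from the leak:

# 2x 8-pin input vs. the rumored 12-pin maximum (all values in watts).
EIGHT_PIN_W = 150               # PCIe CEM spec budget per 8-pin connector
TWELVE_PIN_MAX_W = 600          # rumored connector maximum from the OP

adapter_input_w = 2 * EIGHT_PIN_W                   # 300 W available by spec from 2x 8-pin
shortfall_w = TWELVE_PIN_MAX_W - adapter_input_w    # 300 W short of the connector's ceiling

HEADROOM = 0.25                                     # hypothetical derating, illustration only
usable_w = TWELVE_PIN_MAX_W * (1 - HEADROOM)        # 450 W "usage spec" under that assumption
print(adapter_input_w, shortfall_w, usable_w)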



On 7/17/2020 at 8:34 AM, pas008 said:

If these are required for these new Ampere cards, then they should have like 3x-plus the performance, or Nvidia's so-called efficiency got demolished here.

 

Otherwise I don't see why adding more power is needed.

Am I missing something here?

Aren't we supposed to be trying to draw less power and get more performance as we go forward?

Unpopular opinion here: Does that mean that Samsung's 8nm node is really inefficient, Nvidia's architecture for Ampere doesn't scale well, or that they're operating outside the efficiency curve because they're concerned about Big Navi?

 

Additionally, why do they need 650W... How would you cool that monster of a GPU? I'm not sure the benefits of the single connector outweigh the cost.


17 minutes ago, Belgarathian said:

Unpopular opinion here: Does that mean that Samsung's 8nm node is really inefficient, Nvidia's architecture for Ampere doesn't scale well, or that they're operating outside the efficiency curve because they're concerned about Big Navi?

 

Additionally, why do they need 650W... How would you cool that monster of a GPU? I'm not sure the benefits of the single connector outweigh the cost.

I was just looking at next gen: with a possible MCM design, they might just be looking ahead to getting these connectors going a generation before.


I like this idea, except for the proprietary part.

"If a Lobster is a fish because it moves by jumping, then a kangaroo is a bird" - Admiral Paulo de Castro Moreira da Silva

"There is nothing more difficult than fixing something that isn't all the way broken yet." - Author Unknown


Intel Core i7-3960X @ 4.6 GHz - Asus P9X79WS/IPMI - 12GB DDR3-1600 quad-channel - EVGA GTX 1080ti SC - Fractal Design Define R5 - 500GB Crucial MX200 - NH-D15 - Logitech G710+ - Mionix Naos 7000 - Sennheiser PC350 w/Topping VX-1


What about my CoolerMaster V650S? Out of luck? 

DAC/AMPs:

Klipsch Heritage Headphone Amplifier

Headphones: Klipsch Heritage HP-3 Walnut, Meze 109 Pro, Beyerdynamic Amiron Home, Amiron Wireless Copper, Tygr 300R, DT880 600ohm Manufaktur, T90, Fidelio X2HR

CPU: Intel 4770, GPU: Asus RTX3080 TUF Gaming OC, Mobo: MSI Z87-G45, RAM: DDR3 16GB G.Skill, PC Case: Fractal Design R4 Black non-iglass, Monitor: BenQ GW2280


This seems... weird. I thought GPUs were getting more power efficient. And now all of a sudden we need even more POWAH?

It would be one thing if Nvidia talked with AMD and the PSU makers, and they all decided to retire the 8-pin and make the 6-pin expandable.
That might actually make sense, and it would fix a common irk I have when it comes to custom sleeved cables: that your next GPU might not use the same combo of 6- and 8-pin cables as your current one, so you might need to buy new cables.

But this is Nvidia. The kings of making proprietary technology and refusing to share it.


8 hours ago, Belgarathian said:

Unpopular opinion here: Does that mean that Samsung's 8nm node is really inefficient, Nvidia's architecture for Ampere doesn't scale well, or that they're operating outside the efficiency curve because they're concerned about Big Navi?

 

Additionally, why do they need 650W... How would you cool that monster of a GPU? I'm not sure the benefits of the single connector outweigh the cost.

Where did 650W come from? There's some inflation in the thread, OP says 600W.

 

Anyway, let's break it down one step at a time. That's the maximum the connector could take. That's all we know about it. Generally speaking, you don't want to run things at their maximum rating, so the actual usable rating for the connector may be somewhere below that.

 

Likewise for the GPU: just because a connector can deliver that much power doesn't mean a GPU has to use that much.

 

My personal guess is that this might be more of a move for datacenters, where higher-power units are not a problem provided they are still efficient overall. If a 400W GPU does the same work as two 200W GPUs, they'll take the space savings.



There is absolutely no way that, just months before Nvidia releases the 3000 series cards, they would drop a bombshell saying you're going to need a new PSU because they decided to require a new pin layout.

Something like that would completely tank the sales of their new cards, and seeing as Nvidia is a publicly traded company, I can't imagine any higher-ups who want to keep their jobs would allow such a thing.

 

A new power connector is something Nvidia would want AMD and the various PSU makers to get on board with and also announce to the public long before they released hardware requiring it.

 

The only way I could possibly see any Nvidia card releasing soon that required the new power connector would be if it were a new Titan card.

Something that is relatively low volume, with an audience that wouldn't put up much of a fuss about dropping an extra $100-$200 for a new PSU to go with their new $1,300-$1,500 graphics card.

I think it would be sales suicide for Nvidia to force the new connector on any of their cards in the $300 to $800 range.


22 minutes ago, Snadzies said:

There is absolutely no way that, just months before Nvidia releases the 3000 series cards, they would drop a bombshell saying you're going to need a new PSU because they decided to require a new pin layout.

Something like that would completely tank the sales of their new cards, and seeing as Nvidia is a publicly traded company, I can't imagine any higher-ups who want to keep their jobs would allow such a thing.

 

A new power connector is something Nvidia would want AMD and the various PSU makers to get on board with and also announce to the public long before they released hardware requiring it.

 

The only way I could possibly see any Nvidia card releasing soon that required the new power connector would be if it were a new Titan card.

Something that is relatively low volume, with an audience that wouldn't put up much of a fuss about dropping an extra $100-$200 for a new PSU to go with their new $1,300-$1,500 graphics card.

I think it would be sales suicide for Nvidia to force the new connector on any of their cards in the $300 to $800 range.

Or they will simply supply an adapter for a while; it's not that complicated.

