PSA: Do NOT use daisy chain power cables for 3000 series (One cable per 8 pin GPU power connection)

Intoxicus

TL;DR - Don't use daisy chain cables. Most of the time they're under-spec, and if you do the math they can easily be asked to carry more than they're rated for on a 3x8 pin card with a 400-500W power limit that has peaks in excess of that limit. Always use one GPU power cable per 8 pin GPU power connector. At best you get crashes and instability; at worst, magic smoke, a dead GPU, and potentially a fire hazard.
EVGA and Seasonic have been telling people it's safe to use daisy chain power cables for 3x8 pin GPUs, and if we look into it, that seems like very bad advice for them to give out. There was a recent post on r/nvidia that got me to look into what actually makes sense to use on a GPU pulling 400-500W depending on model and BIOS. (There's a 450W BIOS for EVGA 3080 cards and a 500W BIOS for EVGA 3090 cards out there now, by the way.)


Note: I am not an electrical engineer myself, but my dad is and I grew up doing this stuff. If an electrical engineer can fact check me, please do. If I am wrong on anything I would truly like to be respectfully corrected with valid references and data.

To start with:
The 8 pin connectors are rated for 150W.

6 pin connectors are rated for 75W.
PCI-E delivers 75W through the slot.
I can't find a definitive rating for what the power cables for GPUs can handle at this time, or what the recommended specs are, beyond that a daisy chain GPU power cable should use 16 gauge wiring.
 
https://en.wikipedia.org/wiki/PCI_Express#Power
 
Buildzoid has an excellent video that talks about this, where he specifically says daisy chains should be fine unless we get into 400W territory. That video is three years old, though, and was made before we even had the 2000 series. It seems like this could be a big topic for all the big tech YouTubers to cover (hint hint Linus ;) )
 
https://www.youtube.com/watch?v=9nM80JmzKvc
 
If you have a daisy chain with connectors rated at 150W and the cable can handle more, you have an issue at the PSU side. The PSU can only safely draw 150W through its single connection and somehow has to deliver more than 150W to two 8 pin power connectors at the GPU. Electricity does not magically increase amps and volts to make more watts. Actually, because of resistance and voltage drop you get less than you asked for, in the form of less voltage and the same (or more) current (amps). This creates heat at the connectors; unused or inefficient electricity turns into heat.


Also, let's remember you're NOT getting 100% efficiency (that's almost impossible) and you're not going to get the full 150W you're asking for. And the cables should be 16 gauge for pigtails, but are typically 18 gauge, which isn't as good (lower numbers mean thicker wiring). Some could also be using aluminum wiring instead of copper, which creates more heat. There is apparently enough variability in PSU specs and design that a daisy chain could work fine on one PSU and be magic smoke on another. Is that alone a risk worth taking?
If we do the math, each PSU-end power connection can send 150W and each GPU power connection can request 150W. So if we have a daisy chain on an EVGA FTW3 Ultra, the PSU side can send 150W x 2 plus 75W from the PCIe slot (375W total, 25W short of the 400W power draw at max load). The GPU can potentially ask for up to 3 x 150W plus 75W from the PCIe slot, which is 525W of total potential power requested. This is obviously not good if the PSU is trying to deliver more power than the connectors and cables are capable of handling. You'll only get that demand during peaks, but it only takes one 500W plus peak to make magic smoke.
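Here's that budget math as a quick sketch (it just restates the 150W-per-8-pin and 75W-per-slot figures above for the two-connector PSU side and three-connector card side):

```python
# Rough PCIe power budget sketch using the figures quoted in this post.
PIN8_W = 150   # rated watts per 8 pin connector (spec figure used in this thread)
SLOT_W = 75    # watts available through the PCIe slot

# Daisy chain: two PSU-side connections feed three 8 pin plugs on the card.
psu_side_supply = 2 * PIN8_W + SLOT_W   # 375 W the PSU side is rated to send
gpu_side_demand = 3 * PIN8_W + SLOT_W   # 525 W the card could ask for at rated limits

print(f"PSU-side rated supply: {psu_side_supply} W")
print(f"GPU-side rated demand: {gpu_side_demand} W")
print(f"Shortfall at rated limits: {gpu_side_demand - psu_side_supply} W")
```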
 
When you ask for more power that means more current (amps) and voltage. Wattage is literally volts multiplied by amps (W = V * A). Voltage can be modeled as the "amount" of electricity and current modeled as the "pressure of flow" in a sense. Current is the more dangerous part: a lot of electricity without "flow pressure" moving it doesn't do much except maybe look cool and tickle a little, while a little voltage with 1 amp of current behind it can kill a person. When your GPU asks for more power, it's the extra current the cable isn't rated to handle that creates the magic smoke.
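As a worked example of W = V * A on the 12V rail (my own illustrative numbers, not figures from any spec):

```python
# Current needed at 12V for a given wattage: I = P / V.
VOLTS = 12.0

for watts in (75, 150, 300):   # 6 pin rating, 8 pin rating, two 8 pins on one pigtail
    amps = watts / VOLTS
    print(f"{watts:>3} W at {VOLTS:.0f} V -> {amps:.1f} A total")

# 150 W works out to 12.5 A; a daisy chain feeding two 8 pins at full rating
# would be 25 A through the single PSU-side connector and cable.
```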


If we ask for more than the cable and/or connection can handle, we risk "magic smoke." The PSU and GPU assume the cables can handle what they want to do; if the cable isn't good enough, we get magic smoke in the worst case. With a daisy chain we are asking for potentially 300W through a cable that isn't the correct gauge to handle it, through a single 150W rated connection at the PSU. Also, how do the PSU and GPU know not to push too much power through the daisy chain cable? They don't know it's a daisy chain; they see three 8 pin power connections and that power is flowing through them. How could they know not to push more than 150W through the daisy chain? (They can't, so they could easily overload it on the assumption that it's a properly rated cable.)
It makes zero logical or rational sense to use a daisy chain for the 2000 or 3000 series. The 3000 series is definitely drawing too much power for a daisy chain to be smart. The 2000 series seems to be borderline, depending on the specific GPU and PSU.
 
The 1000 series had low enough power draw that it was permissible. But even then I would not recommend it, to be on the safer side. Jayz 2 Cents found his OC and performance were slightly limited on a 1080-something using a daisy chain. His results have not been verified by others repeating the same test, so scientifically it's not exactly solid data (yet).


Now add in that it kind of depends on your PSU and how well it followed specs like 16 gauge copper wiring for a daisy chain cable, and it seems like it's just not worth the risk to use a daisy chain at all, ever.
I've come to question their existence, and now think they should never have been made. Daisy chains seem like a recipe for user error destroying components. When you're making and selling a product an uninformed consumer could easily destroy, it's prudent to minimize the ways you give them to enact disastrous failure, especially when there's a fire hazard potential.


The rule of thumb should be this: one cable per 8 pin GPU connection. If the card is 2 x 6 pin that's actually fine, since a 6 pin is only 75W and a daisy chain on 2 x 6 pin is 150W (2 x 75W). But for newer builders that can be a point of confusion, so we should really stick to a "one cable per 8 pin connector" mantra when it comes to GPU power.
 
Thanks for coming to my Flange Talk. ;)


The reason daisy-chain connectors exist is that PCIe power is massively underspecced. The physical connector is the same Molex Mini-Fit that's used for CPU power and can physically take something like 300-400W; the PCI-SIG just rates it at 150W on the card side.

 

So if you're running your FTW3 or whatever with 3 8-pins daisy-chaining off 2 8-pins at the PSU, you have around 600-800W of available power from the PSU connectors, not the 300W of your reasoning.
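For comparison with the budget math earlier in the thread, here's the same sketch using this poster's rough 300-400W-per-connector physical figure (their estimate, not a published rating):

```python
# Counterpoint sketch: use the claimed ~300-400W physical capability per
# PSU-side connector instead of the 150W card-side rating.
SLOT_W = 75
physical_per_connector = (300, 400)   # claimed physical capability range, watts

low = 2 * physical_per_connector[0]   # two PSU-side connectors, low estimate
high = 2 * physical_per_connector[1]  # two PSU-side connectors, high estimate
print(f"Available from 2 PSU connectors: {low}-{high} W (plus {SLOT_W} W from the slot)")
```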

 

Single connectors are still definitely preferable for transient response & cable heat reasons, but daisy-chain is not objectively a problem by any stretch.


Can you prove that with data from a valid reference? Any links?
In my looking into engineering specs I've not come across such data.

Anecdotally, using a daisy chain on my previous EVGA 2070 (now rocking an EVGA 3080 FTW3 Ultra) caused instability issues that using two separate cables solved instantly.
If you are correct, it seems that should never have happened.

Also, if you are correct it should be OK to use a daisy chain for the FE 3000 series cards. Not only does Nvidia say that is a big no-no, it's also what caused a magic smoke incident that someone posted about on r/Nvidia, which got me looking into this topic deeper.

Even if you are correct, often just because you can, or it's technically possible, doesn't mean anyone should do it, or that it's a safe/good idea.

If the spec is given as 150W on the connector, I tend to assume there's a good reason for that...


So. Sorta. I didn’t read all of it because there was a lot in that, just wanted to point a few things out. All of this is incredibly dependent on PSU design, and gauge of cable. IF the cable supplied is large enough, you could run all three 8 pins off a single line from the PSU... I know my Corsair rm650i has larger than standard gauge cables, and thus can carry more current, while also not having as great of a voltage drop due to lower resistance in the cable. Also, they have capacitors at the GPU side of the cable to help with large power spikes.

 

Additionally, each plug on the GPU is not “requesting” 150 watts, and they certainly are not requesting that much consistently. I definitely wouldn’t daisy chain a cheap PSU’s cables for a 3080 or higher, but a quality PSU shouldn’t have an issue for the reasons stated. I currently run a single wire with a daisy chain end for my 2080, with no issues. Adding a second 8 pin cable for a 3080, so a single and a single that terminates into 2 8’s.... I’d be adding “over” 150 watts of potential capacity to my current setup, which should be plenty fine.

 

But again, this is entirely dependent on PSU design and cable quality/thickness. There is no inherent reason running daisy chain is bad, it's only bad if it's engineered poorly. If you go by the logic that daisy chaining is inherently bad...... don't forget your PSU only has a single cable going to your wall outlet, so your entire PC is daisy chained off that. Granted, wall power here in the US is 120V, so that requires a lot fewer amps than a 12V GPU does. But the point is, it's not inherently bad, it's only bad if engineered poorly, which Seasonic, EVGA, Corsair, Asus etc. will not be, at least not on their high quality units.
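To put that 120V-vs-12V comparison in numbers (a rough sketch; the 500W system draw is just an example figure, not anything from this thread):

```python
# Same power, very different current depending on voltage: I = P / V.
system_watts = 500    # example whole-system draw
wall_volts = 120.0    # US mains
rail_volts = 12.0     # GPU power rail

print(f"At the wall:     {system_watts / wall_volts:.1f} A")   # ~4.2 A
print(f"On the 12V rail: {system_watts / rail_volts:.1f} A")   # ~41.7 A
# Lower voltage means far more amps for the same wattage, which is why
# cable gauge matters much more on the 12V side than at the wall.
```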

 

For extreme OCing with modded BIOS or physically modded cards, yes, use a single wire for each plug. But that isn't the normal use case. They are shooting for ultimate stability at unrealistic wattages.


2 minutes ago, Intoxicus said:

Can you prove that with data from a valid reference? Any links?
In my looking into engineering specs I've not come across such data.

Anecdotally, using a daisy chain on my previous EVGA 2070 (now rocking an EVGA 3080 FTW3 Ultra) caused instability issues that using two separate cables solved instantly.
If you are correct, it seems that should never have happened.

Also, if you are correct it should be OK to use a daisy chain for the FE 3000 series cards. Not only does Nvidia say that is a big no-no, it's also what caused a magic smoke incident that someone posted about on r/Nvidia, which got me looking into this topic deeper.

Even if you are correct, often just because you can, or it's technically possible, doesn't mean anyone should do it, or that it's a safe/good idea.

If the spec is given as 150W on the connector, I tend to assume there's a good reason for that...

What the other poster said is true. It's massively underrated. Your 2070 should have no issues (assuming your PSU is decent quality) running off a daisy chained plug. My RM650i has been fine with both a 1080 FTW @ 2025MHz and my 2080 XC Ultra @ 2025MHz off a single plug that breaks out to dual 8's.

 

It's likely Nvidia doesn't recommend it because they don't know the quality of PSU the user will have. I wouldn't run a 3080 off a single 8 pin split out to however many are needed for your card (on an FE card I WOULD use 2 physical cables back to the GPU, but for a card with 3 8's I personally plan on using only 2 physical cables: one that terminates in an 8, one that terminates in dual 8's). Between all of those cables, that's a lot of wattage...


7 minutes ago, LIGISTX said:

 All of this is incredibly dependent on PSU design, and gauge of cable. IF the cable supplied is large enough, you could run all three 8 pins off a single line from the PSU... I know my Corsair rm650i has larger than standard gauge cables, and thus can carry more current, while also not having as great of a voltage drop due to lower resistance in the cable. Also, they have capacitors at the GPU side of the cable to help with large power spikes. 
-Yes, the Buildzoid video linked talks about this in some depth.

 

Additionally, each plug on the GPU is not "requesting" 150 watts,
-Don't get too hung up on semantics/linguistics please. I didn't know how else to phrase it.

and they certainly are not requesting that much consistently. I definitely wouldn’t daisy chain a cheap PSU’s cables for a 3080 or higher, but a quality PSU shouldn’t have an issue for the reasons stated. I currently run a single wire with a daisy chain end for my 2080, with no issues. Adding a second 8 pin cable for a 3080, so a single and a single that terminates into 2 8’s.... I’d be adding “over” 150 watts of potential capacity to my current setup, which should be plenty fine.

 

But again, this is entirely dependent on PSU design and cable quality/thickness.
-Yes, according to Buildzoid, who I trust as a source, the daisy chains are going to be 18 gauge and not the 16 gauge they're supposed to be the majority of the time.

There is no inherent reason running daisy chain is bad, it's only bad if it's engineered poorly.
-That was one of my points, please read before commenting.

If you go by the logic that daisy chaining is inherently bad...... don't forget your PSU only has a single cable going to your wall outlet, so your entire PC is daisy chained off that. Granted, wall power here in the US is 120V, so that requires a lot fewer amps than a 12V GPU does. But the point is, it's not inherently bad, it's only bad if engineered poorly, which Seasonic, EVGA, Corsair, Asus etc. will not be, at least not on their high quality units.
-That is a false equivalence.

 

For extreme OCing with modded BIOS or physically modded cards, yes, use a single wire for each plug. But that isn't the normal use case. They are shooting for ultimate stability at unrealistic wattages. - I strongly disagree based on the data I have available at this time. Add in my anecdotal experience with my 2070, which on paper *should* have been fine with a daisy chain, but had instability issues until I went to 1 cable per *8 pin* connector.

A daisy chain would be safe and fine on a 2x6 pin card. The trouble is that nuance will be lost on many newer builders and cause confusion. It's easier to tell them "one cable per GPU power connector" and then you know they won't get confused and make a mistake.

Sure, we're all pros here and get nuance.
We have to think about the non-pro PC builders too, though...

 


7 minutes ago, LIGISTX said:

What the other poster said is true. It's massively underrated. Your 2070 should have no issues (assuming your PSU is decent quality) running off a daisy chained plug. My RM650i has been fine with both a 1080 FTW @ 2025MHz and my 2080 XC Ultra @ 2025MHz off a single plug that breaks out to dual 8's.

 

It's likely Nvidia doesn't recommend it because they don't know the quality of PSU the user will have. I wouldn't run a 3080 off a single 8 pin split out to however many are needed for your card (on an FE card I WOULD use 2 physical cables back to the GPU, but for a card with 3 8's I personally plan on using only 2 physical cables: one that terminates in an 8, one that terminates in dual 8's). Between all of those cables, that's a lot of wattage...

"Should" and reality are often not congruent.

That's my point really. Even though they say it *should* be fine it appears the reality is that it is NOT fine at all.

BTW my PSU was, and still is, an EVGA Supernova G3 1000W. It was not likely the PSU itself, especially when what solved the issue was using two separate GPU cables. If it was the PSU having an issue I likely would not have had the instability clear up by changing the cabling setup.


4 minutes ago, Intoxicus said:

"Should" and reality are often not congruent.

That's my point really. Even though they say it *should* be fine it appears the reality is that it is NOT fine at all.

BTW my PSU was, and still is, an EVGA Supernova G3 1000W. It was not likely the PSU itself, especially when what solved the issue was using two separate GPU cables. If it was the PSU having an issue I likely would not have had the instability clear up by changing the cabling setup.

I have only ever used daisy chain cables and have never had an issue. Obviously, sure, every situation is different. Plenty of people have issues with things that 100% should work. That's just how it goes; with so many combos and potential places for issues, sometimes things just don't work as expected. But to say it's a fire danger is a bit.... extra. Again, if it's a low quality unit, potentially yes. It's possible the way that PSU is designed it wasn't able to deal with the transients from the GPU, who knows. But the actual wattage the cables and connectors can support is far greater than what your 2070 is capable of drawing. Is your 2070 especially susceptible to slight ripple or slight voltage drop? Possibly. Is your particular PSU a subpar unit? Maybe. But the potential for fire due to wires melting and shorting won't happen, not with a 2070.

 

Anyways, like I said. With a 3080, using a single 8 pin and a single to double 8 pin should be no problem. That’s a lot of power spread across a lot of cables. If someone’s specific issue is resolved via running individual cables, good for them, and good on them for trying. It’s a good troubleshooting step, and if it helps someone’s situation, that’s great. It appears it helped fix yours. But that doesn’t mean a blanket statement of “they are bad don’t do it” is accurate either. 


that's just common sense really, I wouldn't even "daisy chain" my fan cables... daisy chaining gpu power cables is just asking for trouble... 

 

that said my 1070 FTW2 comes with 2 8pin connectors and I'm not sure that's necessary, it just uses around 170w max from what precision tells me, I'm still using my brand new 2x8 Bequiet! gpu power cables of course! (which I initially bought to use with a 3070) 


54 minutes ago, LIGISTX said:

I have only ever used daisy chain cables and have never had an issue. Obviously, sure, every situation is different. Plenty of people have issues with things that 100% should work. That's just how it goes; with so many combos and potential places for issues, sometimes things just don't work as expected. But to say it's a fire danger is a bit.... extra. Again, if it's a low quality unit, potentially yes. It's possible the way that PSU is designed it wasn't able to deal with the transients from the GPU, who knows. But the actual wattage the cables and connectors can support is far greater than what your 2070 is capable of drawing. Is your 2070 especially susceptible to slight ripple or slight voltage drop? Possibly. Is your particular PSU a subpar unit? Maybe. But the potential for fire due to wires melting and shorting won't happen, not with a 2070.

 

Anyways, like I said. With a 3080, using a single 8 pin and a single to double 8 pin should be no problem. That’s a lot of power spread across a lot of cables. If someone’s specific issue is resolved via running individual cables, good for them, and good on them for trying. It’s a good troubleshooting step, and if it helps someone’s situation, that’s great. It appears it helped fix yours. But that doesn’t mean a blanket statement of “they are bad don’t do it” is accurate either. 

Just because you can doesn't mean you should.
Just because it works for you doesn't mean it will for everyone else.

A large part of the point is that, with this amount of variability, the best practice guideline (especially for new and inexperienced builders) is to stick to one cable per GPU power connection.

If you want to take risks, it's your PC, your GPU, your potential for magic smoke. It's your choice, and your risk to take.

What I'm getting at is whether we should advise people to use both connections on a daisy chain GPU power cable. I say no; it's bad advice.
But you can go ahead and do whatever you want at the end of the day.

You can pour gas on your GPU and start a fire with it if you want (as long as no one except yourself is harmed, of course). I would not advise doing that though.

Just don't tell others it's safe to do what you're doing...
 


50 minutes ago, Mark Kaine said:

that's just common sense really, I wouldn't even "daisy chain" my fan cables... daisy chaining gpu power cables is just asking for trouble... 

 

that said my 1070 FTW2 comes with 2 8pin connectors and I'm not sure that's necessary, it just uses around 170w max from what precision tells me, I'm still using my brand new 2x8 Bequiet! gpu power cables of course! (which I initially bought to use with a 3070) 

"Common Sense" is learned behaviour taken for granted and is almost never as "common" as people believe it to be.

Daisy chaining fans and SATA power is fine; the power draw is not even close to what we're talking about with GPUs. My statements are exclusive to GPU power connections.

The thought behind using GPU daisy chain power cables comes from the 1000 series and earlier, where the power draw wasn't as huge as it has become since.
Although it should be noted that Jayz 2 Cents found performance and OC losses using daisy chain cables on a 1080-something.

Even if a daisy chain GPU power cable does *work* it might be less than optimal and cost you performance.


Actually it's the opposite. Usually the specs are conservative and the cables and the PCIe connector can handle more than the rating. While you don't want to test those limits, 5-ish watts more is nothing to worry about. I've been using 2 cables with my 3080 on a CX650 and a Leadex III 850 without problems.


45 minutes ago, spartaman64 said:

Actually it's the opposite. Usually the specs are conservative and the cables and the PCIe connector can handle more than the rating. While you don't want to test those limits, 5-ish watts more is nothing to worry about. I've been using 2 cables with my 3080 on a CX650 and a Leadex III 850 without problems.

Based on what source?

I'll trust Buildzoid as a source when he says most daisy chain cables are at 18 gauge when they should be 16.

If you have a better source, then please link it.


1 hour ago, Intoxicus said:

Just because you can doesn't mean you should.
Just because it works for you doesn't mean it will for everyone else.

A large part of the point is that, with this amount of variability, the best practice guideline (especially for new and inexperienced builders) is to stick to one cable per GPU power connection.

If you want to take risks, it's your PC, your GPU, your potential for magic smoke. It's your choice, and your risk to take.

What I'm getting at is whether we should advise people to use both connections on a daisy chain GPU power cable. I say no; it's bad advice.
But you can go ahead and do whatever you want at the end of the day.

You can pour gas on your GPU and start a fire with it if you want (as long as no one except yourself is harmed, of course). I would not advise doing that though.

Just don't tell others it's safe to do what you're doing...
 

It is safe.... don't turn this into something it isn't. Using a single daisy chain cable isn't going to let the magic smoke out.

 

There are caveats to that recommendation though. Don’t buy a top end GPU that is going to be very amp hungry and buy a crap tier PSU not capable of powering it. If you do that, yea, you will have a bad day. But any quality PSU will either 1) work fine, 2) trip OCP, 3) the card will not be stable due to some strange transient response issue that I’m not even going to try and understand.

 

Again, you're trusting that the engineering was done correctly - you can't assume all engineering is bad. If you buy a quality PSU and it has daisy chain cables, I'm going to assume the engineer who designed it did analysis and testing. Being an engineer.... I am comfortable with that assumption. But I am also aware enough to pay attention to what is happening. If the daisy chain cable seems to not be a thick enough gauge, no, I wouldn't try it, but I personally also wouldn't own a PSU that wasn't engineered to a high standard.
 

But, sorta regardless of all of this, the chance of "harm" is so extremely low. The only way anyone would ever get harmed is if the cables are not of sufficient thickness and get hot and melt (very unlikely, but possible), and then OCP fails and continues to provide current even after a short occurred in the melted wires; then you could have a fire on your hands. The chance of magic smoke is slightly higher than this, since just a short could cause part failure.
 

But to scope this back to reality, I'd wager the vast majority of GPUs out in the wild are plugged in with daisy chain cables..... since it's the convenient and easy way to do it. The fact PCs all over the world are not broken fireballs sort of proves it's "fine". But, ¯\_(ツ)_/¯.


1 hour ago, Intoxicus said:


Daisy chaining fans and SATA power is fine; the power draw is not even close to what we're talking about with GPUs. My statements are exclusive to GPU power connections.

Even if a daisy chain GPU power cable does *work* it might be less than optimal and cost you performance.

I would argue daisy chaining things not engineered for it is worse... getting multiple poor quality fan splitters could 1) overdraw the mobo header or 2) depending on the fans (most consumer fans wouldn't have this issue, but server fans easily could) melt the splitter cable due to too much amperage. Unlikely, but about as unlikely as it would be in the PCIe power example you're arguing.

 

Also, it won’t cost performance. It may at the high end of OCing cause some instability, thus you will not be able to OC as high. But it’ll either work, or it won’t. The card will not run slower due to this. It’ll either be stable, or it won’t be. 
 

But yes, PCIe power cables can provide a lot more than 150 watts of power. Even if we assume they can't and all of that is hearsay and false, there is going to be a safety factor built into the spec, so by default it's likely going to support a minimum of 20% more than the spec just due to the safety factor.

 

Anyways, I agree. If you have the cables, run multiple. It can't do anything but help, and certainly won't hurt. But that doesn't mean we need to scare folks...


Also, I should add..... for a new “nvidia 12 pin” power cable, no, you shouldn’t use a daisy chain cable to plug into that dual 8 pin to 12 pin. That is a poor choice. I believe they clearly state not to do that though. That does have potential to overdraw the cable.


7 minutes ago, LIGISTX said:

Also, I should add..... for a new “nvidia 12 pin” power cable, no, you shouldn’t use a daisy chain cable to plug into that dual 8 pin to 12 pin. That is a poor choice. I believe they clearly state not to do that though. That does have potential to overdraw the cable.

I didn't even think of that; you don't even have the option to plug 3 cables into the Nvidia cards LUL


9 minutes ago, spartaman64 said:

I didn't even think of that; you don't even have the option to plug 3 cables into the Nvidia cards LUL

Exactly. The fact an FE card can pull just as many watts as a stock-BIOS board partner card.... and the adapter is not a 3x 8 pin to 12 pin adapter sorta suggests you can run a board partner 3000 card with 3 8 pins off 2 physical cables, 1 daisy chain and one normal single 8 pin termination. But, again, ¯\_(ツ)_/¯. Lol.


4 minutes ago, spartaman64 said:

Also, remember the R9 295X2? That's a 450W card and it only has 2 8-pins.

 

https://www.igorslab.de/en/power-recording-graphics-card-power-supply-interaction-measurement/5/

I'd consider a 295X2 to be an exception; that's running well outside spec (even though it seems to work fine as designed).

 

One other piece of evidence that daisy-chained cables are fine is that most modular PSUs use the same connector for CPU and dual-8-pin cables, and CPU EPS connectors can supply something like 400W; in fact a lot of enterprise compute cards have gone to using a single EPS connector to save space.


Which is why I prefer my PSUs with single 8-pins.
This newer, cheaper style of stuff is a real letdown.


The ATX power supply specifications and Molex mini fit connector specs I'm researching are not congruent with your statements.

If the connector only does 150W it doesn't matter what the cable can do because the connector will overheat and melt if it tries to do more than it's capable of.

*The cable is bottlenecked by the capacity of the connector!*

The Molex Mini Fit Jr 5556 (what we know as 8 pin PCI-E power) is rated for 13A @ 12V, which is actually 156W to be precise. Even if the cable is rated for 288W, the connector can't do it safely. I would assume they do make the cables a bit over spec so they can handle peaks, etc. But that is an *assumption.*

Add in voltage drop and inefficiency and you're not even going to get that full 156W per connection/cable.

Find me specifications and link them if you want me to believe a daisy chain is safe to use on 3x8 pin.

The connector is Molex Mini Fit Jr 5556. Molex has a ton of info on their website. The trouble is sorting through the volume of it and correlating product model numbers to what we know as "8 pin PCI-E power."

Intel does the ATX power supply specs and I found some PDFs detailing them. I can't find anything about daisy chain PCI-E power cables in those specs though...

What I also can not find is complete specifications for the cable itself in terms of a daisy chain and its safe power limit. *Probably because the cable is bottlenecked by the connector, so the cable needs to at least meet the spec of the connector to be safe.* Which could be one of those super-obvious-to-an-electrical-engineer things they feel they don't need to state. So they probably don't list the spec for cables other than wire gauge and wire type, because the connector bottlenecking the cable is probably "duh" level to actual electrical engineers.

Still waiting for an actual Electrical Engineer or equivalent to drop some references and data that proves me wrong...


4 hours ago, Intoxicus said:

The PSU can only safely draw 150W through its single connection

This varies hugely between PSUs. If only 150W could be drawn from a PSU connector, how could many units share a connector type for EPS and PCIe?

 

4 hours ago, Intoxicus said:

You'll only get that demand during peaks, but it only takes one 500W plus peak to make magic smoke.

Energy = power * time. Temperatures can't rise high enough within the duration of such a spike.
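As a rough illustration of that point (the spike size, duration and wire mass below are made-up example values, and the heat-capacity figure is approximate):

```python
# Energy in a brief transient spike: E = P * t.
spike_watts = 500        # example transient, watts
spike_seconds = 0.010    # example duration, 10 ms

energy_joules = spike_watts * spike_seconds
print(f"Energy in the spike: {energy_joules:.1f} J")   # 5 J

# Even in the (unrealistic) worst case where all of that energy were dumped
# into ~10 g of copper wire (specific heat ~0.385 J/(g*K)), the rise is tiny:
copper_grams = 10
c_copper = 0.385
print(f"Worst-case temp rise: {energy_joules / (copper_grams * c_copper):.2f} K")
```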

4 hours ago, Intoxicus said:

When you ask for more power that means more current (amps) and voltage.

Not in a PC context (or really, most)... Voltage stays constant(ish), current draw changes

4 hours ago, Intoxicus said:

you're not going to get the full 150W you're asking for.

Except you will, as any losses in the cable will be made up for by increased draw from the PSU.

4 hours ago, Intoxicus said:

Voltage can be modeled as the "amount" of electricity

?????


5 hours ago, Intoxicus said:

The ATX power supply specifications and Molex mini fit connector specs I'm researching are not congruent with your statements.

If the connector only does 150W it doesn't matter what the cable can do because the connector will overheat and melt if it tries to do more than it's capable of.

*The cable is bottlenecked by the capacity of the connector!*

The Molex Mini Fit Jr 5556 (what we know as 8 pin PCI-E power) is rated for 13A @ 12V, which is actually 156W to be precise. Even if the cable is rated for 288W, the connector can't do it safely. I would assume they do make the cables a bit over spec so they can handle peaks, etc. But that is an *assumption.*

Add in voltage drop and inefficiency and you're not even going to get that full 156W per connection/cable.

Find me specifications and link them if you want me to believe a daisy chain is safe to use on 3x8 pin.

The connector is Molex Mini Fit Jr 5556. Molex has a ton of info on their website. The trouble is sorting through the volume of it and correlating product model numbers to what we know as "8 pin PCI-E power."

Intel does the ATX power supply specs and I found some PDFs detailing them. I can't find anything about daisy chain PCI-E power cables in those specs though...

What I also can not find is complete specifications for the cable itself in terms of a daisy chain and its safe power limit. *Probably because the cable is bottlenecked by the connector, so the cable needs to at least meet the spec of the connector to be safe.* Which could be one of those super-obvious-to-an-electrical-engineer things they feel they don't need to state. So they probably don't list the spec for cables other than wire gauge and wire type, because the connector bottlenecking the cable is probably "duh" level to actual electrical engineers.

Still waiting for an actual Electrical Engineer or equivalent to drop some references and data that proves me wrong...

I made an oops and used the current rating for the Mini-Fit Plus connector. Our PCI-E 8 pin connectors use the Molex Mini Fit Jr 5556, which is the pin itself, not the whole connector.


Its current rating actually depends on wire gauge and materials used. It can range from 1A to 9A *maximum* current rating.

At 9A it's (9A x 12V) x 3 pins = 324W
At 8A it's (8A x 12V) x 3 pins = 288W
At 7A it's (7A x 12V) x 3 pins = 252W

"**Current rating is application dependent and may be affected by the wire rating such as listed in UL-60950-1. Each application should be evaluated by the end user for compliance to specific safety agency requirements. The ratings listed in the chart above are per Molex test method based on a 30°C maximum temperature rise over ambient temperature and are provided as a guideline. Appropriate de-rating is required based on circuit size, ambient temperature, copper trace size on the PCB, gross heating from adjacent modules/components and other factors that influence connector performance. Wire size, insulation thickness, stranding, tin coated or bare copper, wire length & crimp quality are other factors that influence current rating."

Except the PCI-E spec says 150W per 8 pin connection.
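For a rough sense of the gap between the spec figure and the per-pin numbers above (my own arithmetic, assuming three live 12V pins per 8 pin connector; the per-pin amp values come from the Molex guideline range quoted earlier):

```python
# Connector capacity at different per-pin current ratings,
# assuming 3 live 12V pins in an 8 pin PCIe connector.
VOLTS = 12
LIVE_PINS = 3
PCIE_SPEC_W = 150   # what the PCIe spec allows per 8 pin connection

for amps_per_pin in (4.2, 7, 8, 9):
    capacity = amps_per_pin * VOLTS * LIVE_PINS
    print(f"{amps_per_pin:>4} A/pin -> {capacity:>5.0f} W per connector "
          f"(spec allows {PCIE_SPEC_W} W)")

# 4.2 A/pin is roughly what the 150W spec works out to; the higher figures
# show the headroom *if* the pins, crimps and wire are actually rated for it.
```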

What are our Molex Mini Fit Jr connectors actually rated for?
Does it depend on who makes the cable?
And by how much?

*Part of my point is: without knowing these details, should we take the risk of using both connections on a daisy chain cable?*
 


  • 6 months later...

  

On 12/5/2020 at 9:04 AM, Intoxicus said:

If the connector only does 150W it doesn't matter what the cable can do because the connector will overheat and melt if it tries to do more than it's capable of.

The connector plastics and metals do not melt just because you exceed 150W; melting of a Molex connection is the result of a high resistance fault (the heat dissipated goes as P = I²R) caused by loose fittings or dirty/degraded (oxidized) terminals.

 

The terminals and connectors on the cable should always be rated in line with or better than the cable you're using; any legitimate overdraw (i.e., daisy-chaining a 3080) will heat the cable before the terminals themselves, and demonstrate measurable voltage drop, well before anything begins to melt.

PCI-SIG could, as soon as the next spec, decide that PCIe boards can take 250W per Molex 8 pin without adding a single extra live pin, just by raising their artificial limit on how many amps are drawn per pin (the current limit is 4A).
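Taking that 4A-per-pin figure at face value (I haven't verified it against the spec myself), the arithmetic behind the 150W rating, and what 250W would need, looks like this:

```python
# Why ~4 A/pin lines up with the 150W rating, and what 250W would need.
VOLTS = 12
LIVE_PINS = 3

print(f"4 A/pin -> {4 * VOLTS * LIVE_PINS} W per 8 pin")    # 144 W, ~ the 150W rating
needed = 250 / (VOLTS * LIVE_PINS)
print(f"250 W would need about {needed:.1f} A per pin")     # ~6.9 A/pin
```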

 

On 12/5/2020 at 2:47 PM, Intoxicus said:

Except the PCI-E spec says 150W per 8 pin connection.

 

 

 

The PCIe ATX spec has only ever been about the device's power draw per connector on the card; who makes the cable, the PSU, etc. is irrelevant.

Where those things are relevant is when feeding a high current multi-socket device: a PSU vendor that has used 9A-capable 12V cables, terminated them properly and mounted them into a PSU backplane capable of outputting the entire draw, can pull at least 300W down that cable across 2 8 pin connectors.

 

This isn't likely to happen: if a graphics card is 300W, the PCIe slot makes it more likely that around 130W comes down each cable, with 40W (to a max of 66W) from the slot (give or take the card vendor's power balancing).
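A quick sketch of that split using those figures (300W card, 40-66W from the slot, the rest spread over two cables):

```python
# How a 300W card's draw might split between the slot and two 8 pin cables,
# per the figures in the post above.
card_watts = 300
slot_watts_range = (40, 66)   # slot contribution, low/high

for slot_w in slot_watts_range:
    per_cable = (card_watts - slot_w) / 2
    print(f"Slot {slot_w} W -> about {per_cable:.0f} W per cable")

# ~117-130 W per card connector; daisy chained, both of those share one
# PSU-side cable (~234-260 W total on it), which is where the PSU's own
# cable and connector quality comes in.
```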

 

EVGA recently fucked this up with a few of the 30 series products and couldn't be bothered to put out a recall notice (so if you have an EVGA card, take a peek at the power (W) usage with GPU-Z).

 

  

On 12/5/2020 at 4:28 AM, Intoxicus said:

PCI-E delivers 75W through the slot.

PCIe delivers 66W; you should only be counting the 12V lines here when counting supply power for a graphics card.

 

  

On 12/5/2020 at 4:28 AM, Intoxicus said:

If you have a daisy chain with connectors rated at 150W and the cable can handle more, you have an issue at the PSU side. The PSU can only safely draw 150W through its single connection and somehow has to deliver more than 150W to two 8 pin power connectors at the GPU. Electricity does not magically increase amps and volts to make more watts. Actually, because of resistance and voltage drop you get less than you asked for, in the form of less voltage and the same (or more) current (amps). This creates heat at the connectors; unused or inefficient electricity turns into heat.

 

You have to buy a time bomb to only have 150W available at the PSU side; the PCIe spec only accounts for the minimum supply power at the PSU side and the maximum draw power at the device side.

 

What the PSU can actually supply is entirely up to the PSU vendor in its design and the cable quality; a single-rail PSU built to supply 8A per pin, with wiring adequate to that spec, could supply enough for the basic operation of a 3080, so long as the OCP for that socket on the PSU is set so as not to trip on the transients.

 

Quote

Can you prove that with data from a valid reference? Any links?

 

Become a PCI-SIG member; the data is locked behind membership.

 

  

On 12/5/2020 at 7:18 AM, Intoxicus said:

I'll trust Buildzoid as a source when he says most daisy chain cables are at 18 gauge when they should be 16.

 

Buildzoid is not a qualified electrical engineer.

 

This is the blind trusting the blind.

