
When is it necessary to use two cables to power a GPU?

barclow

TL;DR:

Pretty much that. When is it necessary to power your GPU with two different cables?

 

Long Version:

I have a Gigabyte 3070 GPU (TDP not specified) but after a day of use, HWiNFO reported a maximum of 250W.

According to this:

  • The PCI Express slot can deliver up to 75 W
  • A 6-pin power connector can deliver up to 75 W
  • An 8-pin power connector can deliver up to 150 W

So I have two questions:

  1. Why does the GPU require 2 power cables if 225 W could be delivered by PCI Express + 1x 8-pin power connector? (I believe this specific card is OC'd, so it needs additional power, right?)
  2. When should I connect my GPU with 2 cables instead of using one cable that daisy-chains both the 6-pin and the 8-pin connectors?
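For reference, the arithmetic behind question 1 can be sketched in a few lines (the wattage limits are the spec values listed above; the 250 W peak is the HWiNFO reading):

```python
# Sketch of the power-budget arithmetic, using the spec limits above.
PCIE_SLOT_W = 75   # PCIe x16 slot
SIX_PIN_W = 75     # 6-pin auxiliary connector
EIGHT_PIN_W = 150  # 8-pin auxiliary connector

def max_budget(*aux_connectors):
    """Spec power available from the slot plus the given aux connectors."""
    return PCIE_SLOT_W + sum(aux_connectors)

peak_draw = 250  # W, the maximum HWiNFO reported

print(max_budget(EIGHT_PIN_W))               # 225 W with one 8-pin
print(max_budget(EIGHT_PIN_W, SIX_PIN_W))    # 300 W with 8-pin + 6-pin
print(max_budget(EIGHT_PIN_W) >= peak_draw)  # False: one 8-pin falls short
```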

11 minutes ago, barclow said:

a maximum of 250W

11 minutes ago, barclow said:

Why does the GPU require 2 power cables if 225 W could be delivered by PCI Express + 1x 8-pin power connector?

Because 250 W > 225 W.
 

 

11 minutes ago, barclow said:

When should I connect my GPU with 2 cables

If both your graphics card and your power supply support it, do it.
It just means there will be more wires carrying the same voltage.
Under high power consumption (i.e. high current draw), more wires in parallel have less total resistance, so they heat up less than fewer wires would.
So the more wires you use, the cooler the cables run, the less power is lost along the way, and the lower the risk of a burnt cable 🙂
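A minimal sketch of that argument (the resistance and current values here are made up for illustration, not measured from any particular cable):

```python
# More parallel wires -> lower effective resistance -> less I^2 * R heat.
def parallel_r(r_single_ohm, n_wires):
    """Effective resistance of n identical wires in parallel."""
    return r_single_ohm / n_wires

current_a = 25.0   # total current, e.g. 300 W / 12 V (assumed)
r_one = 0.042      # one pair's round-trip resistance in ohms (assumed)

for n in (1, 2, 3):
    heat_w = current_a ** 2 * parallel_r(r_one, n)
    print(f"{n} pair(s): {heat_w:.2f} W of heat in the cable")
```

Doubling the wires halves the heat for the same total current, which is the whole point of running a second cable.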



Best practice is to avoid daisy chaining power connectors to the GPU.

At one time, when power supplies had fewer, fixed cables and GPUs used less power, it was not such a bad thing.

Ensuring good power delivery is an easy way to avoid strange problems.

Transient power draws can cause system instabilities that are hard to trace. Best to avoid them from the beginning.


That is a software reading, which is often inaccurate.

 

Either way:

 

1. Headroom: these connectors aren't perfect, and a 3070 can easily spike to 300 W. You need 2 connectors because none of the values here allows for only one (150 + 75 is still only 225 W).

2. With high-power cards. Daisy chaining 2 8-pin connectors on a card that can fully load them can cause instability due to the high power draw over a single cable. Separate cables also give the PSU two or more points at which to stabilize sudden power spikes instead of one.


1 hour ago, barclow said:

Why does the GPU require 2 power cables if 225 W could be delivered by PCI Express + 1x 8-pin power connector? [...] When should I connect my GPU with 2 cables instead of daisy chaining both the 6-pin and 8-pin on the same cable?

 

The PCI-e slot can deliver 65 W on 12 V and 10 W on 3.3 V. I may be wrong, but I think the more recent standards have relaxed this to mean 75 W on 12 V.

The PCI-e slots are powered from the 24-pin connector, which has only 2 12 V wires, so you have around a 200 W budget for all the PCI-e slots, fan headers, etc.

Some manufacturers try not to get close to the slot's limit, or don't rely on the slot being capable of providing that much.

Also, since the slots are relatively far from the 24-pin header, it's possible to have some voltage drop across the motherboard and not get quite 12 V at the slot.

 

Next, yes, your video card may consume a maximum of 250 watts... but you must leave room for overclocking.

For example, if the card draws 60 W from the slot and 60 W from a PCI-e 6-pin, that leaves very little room for overclocking: only 15 W of headroom on the 6-pin connector. Just raising the power slider a few percent can eat those 15 watts.

So it makes sense to step up to an 8-pin even though it's not technically needed.

 

It may also be a design choice. For example, the video card's main VRM may have 6 phases and the memory controller VRM may have 2 phases. The designer may choose to connect just one phase of the main VRM, plus the memory controller phases, to the PCI-e slot: just enough to tell you "please connect the PCI-e cable to the video card" if you happen to start the card without any connector plugged in. The remaining 5 phases of the main VRM would then be powered from the extra PCI-e connectors.

This way, even though the PCI-e slot can provide 65-75 W, the card may only take 30-40 W from it: 10-15 W for memory, and 150 W / 6 phases = 25 W on that one core phase.
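As a toy illustration of that design choice (the phase counts and wattages here are hypothetical, not any actual card's layout):

```python
# Hypothetical VRM phase budgeting: one core phase plus the memory
# rails fed from the PCIe slot, the rest from the aux connectors.
def per_phase_w(total_w, n_phases):
    """Power handled by each phase, assuming an even split."""
    return total_w / n_phases

core_w, core_phases = 150, 6   # core VRM: 150 W over 6 phases (assumed)
memory_w = 15                  # memory + controller draw (assumed)
slot_fed_core_phases = 1       # core phases wired to the slot (assumed)

slot_draw = per_phase_w(core_w, core_phases) * slot_fed_core_phases + memory_w
print(slot_draw)  # 40.0, comfortably under the slot's 65-75 W limit
```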

 

When/why should you connect the GPU with 2 cables instead of using one cable that daisy-chains both the 6-pin and 8-pin?

 

You can do whatever you want: you can daisy chain, or you can use separate cables.

 

Separate cables help when there's high power consumption, because more wires between the video card and the power supply mean less loss in the wires, and the wires heat up a bit less.

In theory the voltage arriving at the video card will be slightly higher, but in practice it doesn't matter that much, as video cards have DC-DC converters stepping the incoming 12 V (+/- 10%) down to small voltages. But a smoother voltage can help in extreme scenarios like overclocking.

 

 

Here's the thing about losses and voltage drops... it all comes down to the basic formula, Ohm's law: Voltage = Current x Resistance.

 

Let's say you have a power supply and a cable with 2 PCI-e 8-pin connectors, the first 1 m from the power supply and the second 10 cm further along (at 1.1 m), and let's say the video card consumes 300 watts.

 

[ psu ]  --------------- 1 m ------- [ 8 pin conn ] ----- 10 cm --- [ 8 pin conn ]

 

The wires used are AWG18, which means they have a resistance of roughly 0.021 ohm per meter.

 

There are 3 pairs of wires carrying 12 V to the connectors, so the overall current is split evenly between them.

 

From the power supply to the first 8-pin connector, there are 300 watts flowing across 3 pairs of wires, so each pair (a 12 V wire and a ground wire) carries 300 W / 3 pairs / 12 V = 8.33 A of current.

Since the electricity has to go all the way to the first connector and back on each pair, we have 2 meters of wire.

So we can put the values into the formula: Voltage = Current x Resistance = 8.33 A x (2 m x 0.021 Ω/m) = 0.35 V.

 

If the power supply outputs exactly 12 V, at the first connector you're going to have approximately 12 - 0.35 = 11.65 V, because 0.35 V is lost in the wires.

The wires will also heat up by some amount. The formula is Power = I² x R = 8.33² x (2 x 0.021) = 2.91 watts, so across each pair of wires nearly 3 watts is lost as heat. With 3 such pairs bundled together, maybe under a sleeve, that's about 9 watts of heat along that length of cable.

 

At the second connector, you have 11.65 V at the input and 10 cm of cable to the far end, now carrying only the remaining 150 W (the card splits its draw between the two connectors), so 50 W per pair, or 50 W / 12 V = 4.16 A of current.

So the voltage drop will be much smaller: V = 4.16 A x (2 x 0.1 m) x 0.021 Ω/m = 0.017 V ... the second connector will see 11.65 - 0.017 ≈ 11.63 V.

 

A modern power supply will have sensing wires and will detect the voltage drop and adjust the output voltage - for example it may output 12.3v and you'll get 12v at the video card end. 

 

When you use 2 separate cables with such a high-power video card, you have half the current through each cable, so each pair of wires carries only 4.16 A to the first connector, therefore V = 4.16 A x (2 m x 0.021 Ω/m) = 0.17 V of drop,

and the power lost in the wires is much smaller: P = I² x R = 4.16 x 4.16 x (2 x 0.021) = 0.72 W per pair ... vs 2.91 W per pair.
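The worked numbers above can be reproduced with a short script (same assumptions: AWG18 at ~0.021 Ω/m, 3 pairs of 12 V wires per cable, round trip = twice the cable length):

```python
# Reproducing the worked Ohm's-law numbers: AWG18 wire (~0.021 ohm/m),
# 3 pairs of 12 V wires per cable, round trip = 2x the cable length.
OHM_PER_M = 0.021
PAIRS = 3

def pair_amps(power_w, volts=12.0):
    """Current through each 12 V / ground pair, assuming an even split."""
    return power_w / volts / PAIRS

def drop_v(power_w, length_m):
    """Voltage drop along one cable segment."""
    return pair_amps(power_w) * 2 * length_m * OHM_PER_M

def heat_w_per_pair(power_w, length_m):
    """I^2 * R loss per wire pair along the segment."""
    return pair_amps(power_w) ** 2 * 2 * length_m * OHM_PER_M

# Daisy chain: the full 300 W flows through the first 1 m of cable.
print(drop_v(300, 1.0))           # ~0.35 V lost before the first connector
print(heat_w_per_pair(300, 1.0))  # ~2.91 W of heat per pair

# Two separate cables: only 150 W per cable over the same 1 m.
print(drop_v(150, 1.0))           # ~0.17 V drop
print(heat_w_per_pair(150, 1.0))  # ~0.73 W per pair
```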

 

 

 

 


In the past you could, except for cards that violate the PCIe power standards (looking at you, R9 295X2). It's not ideal, since you lose more power through the cable this way, but you could.

 

However, on more recent cards (I think since the 2080 Ti), cards start to crash if you only use 1 cable. I assume this is because modern cards go through bigger current fluctuations, which lead to bigger voltage fluctuations even though average power ratings are still respected. You couldn't simply make the PSU supply a higher voltage, since that's bad for the peaks / low-current-draw situations, so reducing voltage loss is the only way.

 

1 hour ago, barclow said:

I have a Gigabyte 3070 GPU (TDP not specified) but after a day of use, HWiNFO reported a maximum of 250W.

That 250 W is a peak; Nvidia cards still respect their average power ratings tightly. AMD cards follow them less tightly, but they won't go more than about 5% over what is set either.



Thank you everyone for your replies. I will try to run another cable; my case is really cluttered, but I'll do my best. Thanks.

TL;DR of the replies:
It is preferable to use 2 cables if possible. It's not mandatory in my scenario, but it's good practice.

