
When is it safe to use a 6-pin instead of an 8-pin? And why are the extra 2 pins ground?

(Talking about the PCI-E connectors for the GPU)

I couldn't find any good explanation, but maybe someone here knows how the extra 2 pins, which are ground, help provide more wattage?

Is it because they close some internal circuit within the GPU itself, as I've seen suggested somewhere?

 

Also, if I have a GPU that uses no more than 120 W but has an 8-pin connector, can I still use my PSU with only a 6-pin temporarily? (Assuming the PSU is rated for enough wattage and is of decent quality.)


Because of some math around Ohm's law, adding two ground pins means less voltage drop than adding one VCC and one ground. The limiting factor isn't the gauge of the wires so much as the voltage drop over the cabling.
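To put rough numbers on that, here's a quick sketch. The wire gauge, cable length and load are assumed values for illustration, not figures from the PCIe spec; the point is just that extra ground wires sit in parallel, so the return resistance, and with it the total drop over the cable, goes down.

```python
# Illustrative voltage-drop comparison; all values are assumptions.
AWG18_OHM_PER_M = 0.021   # approx. resistance of AWG18 copper wire
CABLE_LENGTH_M = 0.5      # assumed cable length
LOAD_CURRENT_A = 12.5     # 150 W / 12 V

def loop_drop(supply_wires: int, ground_wires: int) -> float:
    """Voltage lost over the cable: current times (supply + return) resistance."""
    r_supply = AWG18_OHM_PER_M * CABLE_LENGTH_M / supply_wires  # parallel +12 V wires
    r_return = AWG18_OHM_PER_M * CABLE_LENGTH_M / ground_wires  # parallel ground wires
    return LOAD_CURRENT_A * (r_supply + r_return)

print(f"3x 12V + 3x GND: {loop_drop(3, 3) * 1000:.0f} mV dropped")  # ~88 mV
print(f"3x 12V + 5x GND: {loop_drop(3, 5) * 1000:.0f} mV dropped")  # ~70 mV
```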

 

As for whether you can use a 6-pin... I suppose as long as the power supply has enough headroom, it should theoretically be fine, but the general rule of thumb remains:

 

If it doesn't have the right connectors, it wasn't made for that sort of load, so you shouldn't use it for that sort of load.

6 minutes ago, manikyath said:


Thanks!

 

The funny thing is that Nvidia themselves say the card requires a 6-pin and a minimum of 400 W (which is exactly my temporary PSU):

 

From the manual on page 6:

https://www.nvidia.com/content/geforce-gtx/GTX_1060_User_Guide.pdf 

 

Although my specific card is not the Nvidia one, but a Gigabyte.

 

In this case, is it safe to say I'm OK with temporarily using my 400 W 6-pin PSU? Should I make a new post for that?


The 6-pin is supposed to deliver a maximum of 75 watts over 2-3 pairs of wires. Originally it was supposed to be 2 pairs, with the third 12 V wire used to "sense" the voltage as feedback to the power supply. But pretty much everyone just kept power supplies simple and used all 3 pairs of wires to deliver 12 V to the video card.

 

I think the idea in the standard was that one could make a small 250-300 W power supply and use thinner cables with the PCIe 6-pin connector, for example AWG20 wires.

 

If the extra 2 pins are present, that tells the video card that the connector and the wires are capable of delivering up to 150 watts from the power supply. The wires are supposed to be thicker to reduce voltage loss due to wire resistance - the standard is AWG18, but a lot of manufacturers nowadays use AWG16 wires (a bit thicker).

 

A video card must respect the PCIe specification - if a video card is tested and found to take more than 75 W or more than 150 W from a connector, in theory the PCIe organization (PCI-SIG) could prevent the manufacturer from mentioning PCIe anywhere on the card or the packaging - no logos, nothing... it's not a device conforming to the standard.

 

The connectors themselves are good for 9-13 A per pin, depending on how the pins in the connector are built. The wires are good for 10 A or more with acceptable voltage loss and heating due to resistance (at least 10 A for AWG18, and at least 12-14 A for AWG16).

 

So the power supply could provide 150 watts (150 W / 12 V = 12.5 A) through just 2 pairs of wires if it has to, and probably even one pair if you push it, but the connectors and pins could be damaged over time.
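As a sanity check on those numbers, here's the per-wire arithmetic spelled out, assuming the load splits evenly across the 12 V wires (a simplification):

```python
# Current per 12 V wire for the loads discussed above.
def amps_per_wire(watts: float, wires: int, volts: float = 12.0) -> float:
    return watts / volts / wires

print(amps_per_wire(75, 2))    # 6-pin, 2 pairs:   ~3.1 A per wire
print(amps_per_wire(75, 3))    # 6-pin, 3 pairs:   ~2.1 A per wire
print(amps_per_wire(150, 3))   # 8-pin, 3 pairs:   ~4.2 A per wire
print(amps_per_wire(150, 2))   # 150 W on 2 pairs: ~6.3 A per wire
print(amps_per_wire(150, 1))   # 150 W on 1 pair:  12.5 A, right at the limits above
```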

 

Sensing gives the power supply the ability to measure the voltage at the connector and adjust the voltage to keep it close to 12v. Basically, when current goes through a wire, there's some voltage drop due to the wire resistance, so the video card would get a lower voltage... instead of 12v it may get 11.8v. So, by sensing the voltage at the connector, the power supply could adjust the voltage up to maybe 12.2v and you'd get 12v at the connector.
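A toy model of that feedback, with an assumed cable resistance and load, and a stepwise loop standing in for what is really continuous analog regulation:

```python
# The PSU keeps nudging its output up until the voltage it senses at the
# connector (after the drop over the cable) reaches the 12 V target.
CABLE_RESISTANCE = 0.016  # ohms round trip (assumed)
LOAD_CURRENT = 12.5       # amps drawn by the card (assumed)
TARGET_V = 12.0

psu_output = 12.0
for _ in range(50):
    at_connector = psu_output - LOAD_CURRENT * CABLE_RESISTANCE  # drop over the cable
    psu_output += 0.5 * (TARGET_V - at_connector)                # raise output to compensate

print(f"PSU output {psu_output:.2f} V -> "
      f"{psu_output - LOAD_CURRENT * CABLE_RESISTANCE:.2f} V at the connector")
# PSU output 12.20 V -> 12.00 V at the connector
```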

12 minutes ago, Filingo said:

I have a GPU that uses no more than 120W, however it has 8-pin connector - can I still use my PSU with 6-pin temporarily?

Usually it will refuse to boot if you don't use an 8-pin.

 

12 minutes ago, Filingo said:

how come the extra 2 pins which are ground help provide more wattage?

The extra 2 pins don't actually carry any current; they're just there to tell the GPU that the PSU and cable thickness are appropriate to draw 150 W instead of 75.

 

 


This got me curious too, and from what I can find, the standard only requires 2 of the 12 V pins to be populated on a 6-pin connector. If your power supply has 3 12 V pins, then you can use an adapter. The device being plugged in should not work if it doesn't detect the extra sense pin of the 8-pin, or, if it does work, it should reduce its power level to draw only 75 W.
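Roughly, the decision the card makes would look something like this; the names are made up for illustration, not actual firmware:

```python
# Hypothetical sketch of how a card could pick its power budget from the
# sense pins it sees on the auxiliary connector.
def aux_power_budget(six_pin_sense: bool, eight_pin_sense: bool) -> int:
    """Watts the card may draw from the auxiliary connector."""
    if six_pin_sense and eight_pin_sense:
        return 150  # full 8-pin cable detected
    if six_pin_sense:
        return 75   # only a 6-pin cable detected
    return 0        # no cable detected: refuse to start, or stay on slot power only

print(aux_power_budget(True, True))    # 150
print(aux_power_budget(True, False))   # 75
```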

1 minute ago, Kilrah said:

Usually it will refuse to boot if you don't use an 8-pin.

 

The extra 2 pins don't actually carry any current; they're just there to tell the GPU that the PSU and cable thickness are appropriate to draw 150 W instead of 75.

 

 

If my card is designed for 6-pin but my OEM uses an 8-pin, do you think it will turn on? I'm going to try it soon. Check Nvidia's manual on page 6 regarding my card:

https://www.nvidia.com/content/geforce-gtx/GTX_1060_User_Guide.pdf

 

2 minutes ago, mariushm said:


Thank you!

3 minutes ago, Filingo said:

If my card is designed for 6-pin but my OEM uses an 8-pin, do you think it will turn on? I'm going to try it soon. Check Nvidia's manual on page 6 regarding my card:

https://www.nvidia.com/content/geforce-gtx/GTX_1060_User_Guide.pdf

Are you using a reference design card or a custom design one? Just because the reference design uses 6-pin and no more than 120w doesn't mean a factory overclocked custom design does the same thing.


2 minutes ago, Eigenvektor said:

Are you using a reference design card or a custom design one? Just because the reference design uses 6-pin and no more than 120w doesn't mean a factory overclocked custom design does the same thing.

Thank you, I made a dedicated post as well, so sorry about the duplicate: https://linustechtips.com/topic/1374236-connecting-gtx-1060-6gb-with-6-pin-instead-of-8-pin/

 

And mine is a custom Gigabyte 1060 Gaming G1, however I will underclock it and limit the power so it uses less, if that works (or if it boots at all).

 

 

Just now, Filingo said:

The funny thing is that Nvidia themselves say the card requires a 6-pin and a minimum of 400 W (which is exactly my temporary PSU).

nVidia knows that the regular Joe out there is not technical enough to understand how much a video card needs. 

They could say "The video card needs 100 watts, it will take 60 watts through the pci-e 6pin connector and 40 watts through the pci-e slot"  

But what would a regular person do with this information?

 

The power supply they recommend has to power a motherboard, a processor, memory, and one or more hard drives. nVidia basically considers what the average system configuration containing this video card would look like, and then recommends a power supply wattage that's slightly bigger than needed.

 

Also, power supplies split that wattage across multiple voltages, and a lot of cheap power supplies reserve some amount of power for specific voltages.

 

For example, you could have a 350 watts power supply,  but that power supply reserves 150 watts for 3.3v and 5v, and only 200 watts are available on 12v, which is used to power the processor and the video card.  The computer would start and go into Windows, because 2D stuff is not power hungry, and the PC would probably not consume more than 100 watts,  but as soon as the user would go into a game and have the video card consume 100-150 watts, the whole system could consume more than 200 watts on 12v and the power supply would shut down.  Then, the user would return the video card and not replace the power supply.
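In numbers, with all of the loads made up for illustration:

```python
# Back-of-the-envelope version of the scenario above: a "350 W" supply
# with only 200 W actually available on the 12 V rail (assumed figures).
RAIL_12V_LIMIT_W = 200

def twelve_volt_load(cpu_w: float, gpu_w: float, drives_fans_w: float = 15) -> float:
    """Rough 12 V draw; 3.3 V / 5 V loads are ignored for simplicity."""
    return cpu_w + gpu_w + drives_fans_w

desktop_idle = twelve_volt_load(cpu_w=30, gpu_w=15)   # ~60 W, well within the rail
gaming_load = twelve_volt_load(cpu_w=90, gpu_w=130)   # ~235 W, over the 200 W rail
print(desktop_idle <= RAIL_12V_LIMIT_W)  # True  -> boots and idles fine
print(gaming_load <= RAIL_12V_LIMIT_W)   # False -> supply overloads and shuts down
```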

 

So they'll also often say a minimum wattage like 400 watts, because they're aware that A LOT of very cheap power supplies lie on the label and inflate how much they can actually supply on 12 V, because they don't want power supplies running close to their limit for long periods of time, and because they're hoping power supplies above that wattage can actually provide the minimum amount of power on 12 V, the voltage rail used by the video card.

 

I'll give you an example ... here's a 400w power supply : https://www.newegg.com/coolmax-i-400-400w/p/N82E16817159140

If you actually look at the label, it says it can do a maximum of 300 watts on 12 V, which is what powers video cards and processors, but I can GUARANTEE you that it won't do more than 200 watts in real life.

 


 

Newegg actually has a decent deal, an EVGA 450 W BR for $20 after a $10 rebate... it's not a great PSU, but it's good for an office PC or something with integrated graphics, and it's less likely to blow up on you: https://www.newegg.com/evga-450-br-100-br-0450-k1-450w/p/N82E16817438144

 

 

3 minutes ago, Filingo said:

And mine is a custom Gigabyte 1060 Gaming G1, however I will underclock it and limit the power so it uses less, if that works (or if it boots at all).

In most cases the system won't even POST without the proper cables connected. 

 

Why are you not just using the 8-pin like the card requests?


2 minutes ago, rickeo said:


 

Why are you not just using the 8-pin like the card requests?

No extra PSU right now but buying soon

 

  

3 minutes ago, mariushm said:


Thank you!

