
NAVI31 won’t be using the RTX 4090’s melting power connector!

b1k3rdude
10 hours ago, Arika S said:

I can guarantee this is not because of the issues nvidia is facing, they were just not planning on doing it this gen

Pretty much. It also wouldn't surprise me if there are royalties involved in using Nvidia's new connector.


Yay NVidia is having an Intel moment (see: burnt pins)

"Don't fall down the hole!" ~James, 2022

 

"If you have a monitor, look at that monitor with your eyeballs." ~ Jake, 2022


The 3090 Ti used the same connector, right? And the Kingpin was pulling 800 watts, so I don't see a problem with the connector itself. Just garbage adapters.

CPU: i9-13900KS | Motherboard: Asus Z790 HERO | Graphics: ASUS TUF 4090 OC | RAM: GSkill 7600 DDR5 | Screen: ASUS 48" OLED 138Hz


2 hours ago, Shzzit said:

The 3090 Ti used the same connector, right? And the Kingpin was pulling 800 watts, so I don't see a problem with the connector itself. Just garbage adapters.

Not really, kind of but no.

 

Quote

When Nvidia introduced its proprietary 12-pin power connector with its GeForce RTX 30-series Founders Edition graphics cards last year, many considered it an overkill. But the move might indeed be pretty wise as that connector and cable are compatible with the industry-standard 12+4-pin (12VHPWR) power connection, so existing GeForce Founders Edition graphics boards will work just fine with next-generation PSUs.

 

Quote

Nvidia's 12-pin power connector (formally known as the Molex MicroFit 3.0 dual row, 12 circuits) is designed to deliver up to 600W to a graphics card using 12V rail of a PSU and apparently the same amount of power is set to be delivered by a 12+4-pin auxiliary PCIe 5.0 power connector (also known as 12VHPWR).  

https://www.tomshardware.com/news/nvidias-12-pin-power-connector-will-work-with-next-gen-pcie-5-psus

 

The RTX 30 series 12-pin does not have the 4 sense pins, or even positions for them, on the connector at all. The Nvidia-supplied 12-pin to 8-pin adapters were also of a different design and only used 2x 8-pin, not 4x 8-pin, which made the adapter less complicated but also technically violated the maximum rated power of the 8-pin cables and connectors.
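For a rough sense of the numbers being thrown around (my own back-of-envelope arithmetic, not from the Tom's Hardware article; the 150 W per 8-pin figure is the PCIe CEM rating), a quick Python sketch:

# Back-of-envelope: what 600 W over the 12-pin means per pin,
# versus what a 2x 8-pin adapter is rated for on paper.
RAIL_VOLTAGE = 12.0          # nominal 12 V rail
CONNECTOR_TARGET_W = 600.0   # what the 12-pin / 12VHPWR is meant to carry
POWER_PIN_PAIRS = 6          # 6x 12 V pins + 6x ground pins
EIGHT_PIN_RATING_W = 150.0   # PCIe CEM rating per 8-pin connector

total_current = CONNECTOR_TARGET_W / RAIL_VOLTAGE   # 50 A
current_per_pin = total_current / POWER_PIN_PAIRS   # ~8.3 A per 12 V pin
adapter_rating_w = 2 * EIGHT_PIN_RATING_W           # 300 W on paper

print(f"Total current at 600 W: {total_current:.1f} A")
print(f"Current per 12 V pin:   {current_per_pin:.1f} A")
print(f"2x 8-pin spec rating:   {adapter_rating_w:.0f} W")

Which is why feeding a 600 W-capable 12-pin from only 2x 8-pin can work electrically (the cables have real-world headroom) while still sitting outside what the 8-pin connectors are rated for on paper.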


14 hours ago, leadeater said:

Not really, kind of but no.

 

 

https://www.tomshardware.com/news/nvidias-12-pin-power-connector-will-work-with-next-gen-pcie-5-psus

 

The RTX 30 series 12-pin does not have the 4 sense pins, or even positions for them, on the connector at all. The Nvidia-supplied 12-pin to 8-pin adapters were also of a different design and only used 2x 8-pin, not 4x 8-pin, which made the adapter less complicated but also technically violated the maximum rated power of the 8-pin cables and connectors.

Well shit lol, if the old adapter was doing 400-800 watts just fine on the 3090 Ti, and now we have adapters that melt doing 300-500? What the hell happened? Are they just that garbage or badly built?

I was gonna worry, but I can't even find a 4090 to buy 😢😢😢



1 hour ago, Shzzit said:

Are they just that garbage or badly built?

It's just the Nvidia-provided adapter; native 12-pin cables or anyone else's adapter are, as far as we know, fine. It seems to be a manufacturing issue with the outer pins and cable(s). A 4x 8-pin to 12-pin adapter that close to the GPU is ugly and stupid anyway, no way I'd use that; there are much better adapters on the market.
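To put some assumed numbers on why a bad contact on a couple of pins matters so much: 600 W load, 12 V rail, and the roughly 9.5 A per-contact rating that gets quoted for these terminals, none of which is official data, just an illustration:

# Rough illustration: current per pin if some pins stop making good contact.
LOAD_W = 600.0           # assumed worst-case load on the connector
RAIL_V = 12.0
TOTAL_POWER_PINS = 6
PER_PIN_RATING_A = 9.5   # commonly quoted per-contact rating; treat as an estimate

total_current = LOAD_W / RAIL_V  # 50 A

for pins_in_contact in range(TOTAL_POWER_PINS, 2, -1):
    per_pin = total_current / pins_in_contact
    status = "over the rating" if per_pin > PER_PIN_RATING_A else "within the rating"
    print(f"{pins_in_contact} pins carrying the load: {per_pin:.1f} A per pin ({status})")

With all six pins seated you're at roughly 8.3 A per pin; lose good contact on even one or two and the rest are pushed past the per-contact rating, which lines up with the damage showing up at the outer pins.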


5 hours ago, leadeater said:

It's just the Nvidia-provided adapter; native 12-pin cables or anyone else's adapter are, as far as we know, fine. It seems to be a manufacturing issue with the outer pins and cable(s). A 4x 8-pin to 12-pin adapter that close to the GPU is ugly and stupid anyway, no way I'd use that; there are much better adapters on the market.

Good to know, thanks mate. I got the Thermaltake GF3 1650 ATX 3.0 PSU. Its 600 watt cables are very thick, so I'm guessing I'll be OK once I get a card?

Thanks for all the information.



15 minutes ago, Shzzit said:

Good to know, thanks mate. I got the Thermaltake GF3 1650 ATX 3.0 PSU. Its 600 watt cables are very thick, so I'm guessing I'll be OK once I get a card?

Thanks for all the information.

I have the card but can't get the PSU. Let's fuse our possessions in the name of FPS safety.


2 hours ago, HumdrumPenguin said:

I have the card but can't get the PSU. Let's fuse our possessions in the name of FPS safety.

Haha sounds like the future. 



So far it seems like third-party cables don't have this issue, only Nvidia's included adapter. This looks more like a build quality problem than anything else. And yet here you have hobby-grade electrical engineers saying the connector design is flawed and DOA.

 

Guys, sometimes it's best to take a step back, get some more information from another source, and think about it before jumping on the hate train. And no, another random Reddit post is not what I mean.

If someone did not use reason to reach their conclusion in the first place, you cannot use reason to convince them otherwise.


12 hours ago, Stahlmann said:

So far it seems like third-party cables don't have this issue, only Nvidia's included adapter. This looks more like a build quality problem than anything else. And yet here you have hobby-grade electrical engineers saying the connector design is flawed and DOA.

 

Guys, sometimes it's best to take a step back, get some more information from another source, and think about it before jumping on the hate train. And no, another random Reddit post is not what I mean.

 

I say wait a year and see how many "burned" cable reports come out and what card they were used on. I certainly would not buy a 4090 and use that 4x PCIe adapter, though. At this point I'd wait for the Ti cards and see if things change.

 


People are saying to just not bend the cable, but idk, depending on your case it might be impossible to route the cable without some bending.


19 hours ago, spartaman64 said:

People are saying to just not bend the cable, but idk, depending on your case it might be impossible to route the cable without some bending.

Nvidia's fix will be a drill you can rent for $199.95 a day to drill holes into the side panel of your case.

