
AMD once again violating power specifications? (AMD RX-480)

Majestic
2 minutes ago, HKZeroFive said:

@Majestic, might be worth adding this to the OP. It's a statement from AMD themselves:

 

 

Thanks, that's good news. They seem to have understood part or all of what went wrong.


I just saw AMD's initial statement

prepare for some major total board power draw limiting - aka gimping the card xD

I doubt that will change load on PEG at the 75W limit, but we'll see


4 minutes ago, zMeul said:

I just saw AMD's initial statement

prepare for some major total board power draw limiting - aka gimping the card xD

I doubt that will change load on PEG at the 75W limit, but we'll see

Yeah, even if the RAM clock is lowered and the over-current on the PCI slot is controlled... that means no overclocking the GPU beyond that spec.  I guess we wait and see.

 

Maybe, just maybe... they have an advanced enough controller for the VRM to do a 60/40 split between the cable and slot.


5 minutes ago, stconquest said:

Yeah, even if the RAM clock is lowered and the over-current on the PCI slot is controlled... that means no overclocking the GPU beyond that spec.  I guess we wait and see.

 

Maybe, just maybe... they have an advanced enough controller for the VRM to do a 60/40 split between the cable and slot.

If they can put a limit on the slot power draw, you can still overclock it with the power going through the 6-pin connector. Then again, there wasn't much overclocking headroom as it was, so that won't change much.

That may not be ideal, but there is a bit of headroom there.

The AIB cards will have more overclocking potential though.


8 minutes ago, stconquest said:

Yeah, even if the RAM clock is lowered

and the GDDR5 chips are .. you guessed it! Samsung xD


4 minutes ago, laminutederire said:

If they can put a limit on the slot power draw, you can still overclock it with the power going through the 6-pin connector. Then again, there wasn't much overclocking headroom as it was, so that won't change much.

That may not be ideal, but there is a bit of headroom there.

The AIB cards will have more overclocking potential though.

If you can put a limit on the slot, the job is done and you can OC the GPU all you want. That is what I meant by a VRM controller. I don't think they have an advanced enough controller on the card... but I could be wrong. I think it is all hard-wired.


1 minute ago, stconquest said:

If you can put a limit on the slot, the job is done and you can OC the GPU all you want. That is what I meant by a VRM controller. I don't think they have an advanced enough controller on the card... but I could be wrong. I think it is all hard-wired.

Yeah, sure. What I meant was that overclocking wasn't in a great state as it was, so that may not change with the reference card's power fix.


4 minutes ago, zMeul said:

and the GDDR5 chips are .. you guessed it! Samsung xD

Are Samsung memory chips power hungry? ...more than others? I really have no idea o.O :D


4 minutes ago, laminutederire said:

Yeah, sure. What I meant was that overclocking wasn't in a great state as it was, so that may not change with the reference card's power fix.

Reference cooler... rated for 110W (GPU) to 150W (entire card) TDP... 'nuff said.


4 minutes ago, stconquest said:

Are Samsung memory chips power hungry?

doubt it, the GTX 1070 FE also uses Samsung

it was a joke about them using Samsung tech, as GloFo's 14nm LPP is a licensed Samsung process

 

I don't understand, though: why are they hinting at the VRAM? Is the VRAM power delivery drawing directly from the PEG slot?

if so, they can limit the power draw... but at the cost of lower VRAM frequency!? I'm guessing at this point


16 minutes ago, zMeul said:

doubt it, the GTX 1070 FE also uses Samsung

it was a joke about them using Samsung tech, as GloFo's 14nm LPP is a licensed Samsung process

 

I don't understand, though: why are they hinting at the VRAM? Is the VRAM power delivery drawing directly from the PEG slot?

if so, they can limit the power draw... but at the cost of lower VRAM frequency!? I'm guessing at this point

They are hinting that the memory's 8Gbps data rate is a source of power draw that can be fixed through a driver update. A.K.A. lower the clock.

 

I really have no idea how the card is hard-wired, or if they can limit any of the phases through a controller (BIOS). I have heard (in this thread) that the phases are split in two: three phases to the slot, and three to the cable; hard-wired.

 

Bah, I don't know shit =P
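If the split really is three phases to the slot and three to the cable, the arithmetic is easy to sketch. Here is a rough, illustrative Python sketch; the 50/50 and 60/40 splits are assumptions taken from this thread, not measurements:

```python
PCIE_SLOT_LIMIT_W = 75.0  # PCIe CEM spec limit for the x16 slot

def rail_draw(total_board_power_w, slot_share=0.5):
    """Estimate per-rail draw for a given split of total board power."""
    slot_w = total_board_power_w * slot_share
    cable_w = total_board_power_w - slot_w
    return slot_w, cable_w

# With a fixed 50/50 split, anything above 150W total pushes the slot past 75W.
for share, label in ((0.5, "50/50"), (0.4, "60/40 cable/slot")):
    slot, cable = rail_draw(165.0, slot_share=share)
    status = "over" if slot > PCIE_SLOT_LIMIT_W else "within"
    print(f"{label} at 165W: slot {slot:.1f}W, cable {cable:.1f}W ({status} slot spec)")
```

The point of the sketch: if the split is hard-wired at 50/50, the only way to keep the slot in spec is to cap total board power at 150W, whereas a controller that can shift the split toward the cable leaves room above that.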


11 minutes ago, Trixanity said:

Is it really relevant to highlight how sensitive a lot of YouTubers (JayzTwoCents, Tech of Tomorrow, now AdoredTV) apparently are? Is it relevant to the topic? It does not change the situation or provide anything meaningful to the subject.

Seems like your intent is to create your own private cesspool here. Quite hypocritical given earlier posts of yours. This reads like a YouTube drama alert update. 

It becomes a problem when people start using their videos as evidence. Right now every apologist is linking the video. I think it's not unimportant to show a bit of character, because this clearly indicates he's being cognitively dissonant and biased.

 

Which is indicated by the amount of cherry-picking, non sequiturs, and strawman fallacies present in the video.


2 minutes ago, Quibiss said:

While quickly browsing through the thread, things to notice: meme bullshit, the usual suspects cursing, flaming and arguing in the name of their corporate overlords. Yep, must be something about either AMD or Nvidia. LTT never disappoints.

Glad you're here to be the stereotypical guy who rides in on his high horse and doesn't actually contribute.


2 hours ago, Majestic said:

 

Another well-done video by PCPer, adding it to the OP.

Very interesting. It looks like the card was designed to draw power equally from the motherboard and the connector, and nobody expected more than 150W of total power draw.


4 hours ago, MoonSpot said:

PCPer's latest YouTube upload. Haven't watched it yet; will add a comment once I have.

 

:EDIT:

The TL;DR/W is:

  • Power draw overage for extended periods through PCIe confirmed; 'tis not just spikes.
  • AMD is working on a patch/fix; no answer yet as to what, since we're barely 48 hours past release.
  • Other cards which people suspect were also pulling more power through mobo almost certainly were not.
  • Ryan has issues with pointing a stylus away from the screen when trying for touch functionality.

 

Full story link : https://www.pcper.com/reviews/Graphics-Cards/Power-Consumption-Concerns-Radeon-RX-480

 

:EDIT 2: Ha!  Sorry, didn't realize how late to the party I was here.  Have been drinking.

Thanks for posting this. They certainly hit every nail on the head.

R9 3900XT | Tomahawk B550 | Ventus OC RTX 3090 | Photon 1050W | 32GB DDR4 | TUF GT501 Case | Vizio 4K 50'' HDR

 


AMD will provide an update on Tuesday. Apparently they can fix it via software.

 

https://www.reddit.com/r/Amd/comments/4qwj31/amdjoe_on_discord_shares_the_official_pcie/

 

[Screenshot: AMD's official PCIe statement, shared by AMDJoe on Discord]

Intel i7 5820K (4.5 GHz) | MSI X99A MPower | 32 GB Kingston HyperX Fury 2666MHz | Asus RoG STRIX GTX 1080ti OC | Samsung 951 m.2 nVME 512GB | Crucial MX200 1000GB | Western Digital Caviar Black 2000GB | Noctua NH-D15 | Fractal Define R5 | Seasonic 860 Platinum | Logitech G910 | Sennheiser 599 | Blue Yeti | Logitech G502

 

Nikon D500 | Nikon 300mm f/4 PF  | Nikon 200-500 f/5.6 | Nikon 50mm f/1.8 | Tamron 70-210 f/4 VCII | Sigma 10-20 f/3.5 | Nikon 17-55 f/2.8 | Tamron 90mm F2.8 SP Di VC USD Macro | Neewer 750II


19 hours ago, Trixanity said:

Did you raise the power target? Tests show that you don't gain any performance from OC'ing because the card is power-starved, meaning it doesn't really hit the specified clocks, but raising that target blows up the power consumption. You should also try undervolting a bit. It improves temps and could give you a better OC, despite being somewhat counterintuitive.

Someone achieved 1.075V stable. It kinda points to the card running at too high a voltage at stock. Too high as in more than necessary, not too high as in explosions everywhere.

I think you are right. There are a lot of people on that subreddit getting noticeably better performance out of their 480s by simply undervolting. It's insane. Seems like the cards are heavily overvolted.

 

https://www.reddit.com/r/Amd/comments/4qupw4/super_psa_all_rx480_owners_please_attempt_to/
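The undervolting result is consistent with the first-order CMOS rule that dynamic power scales with frequency times voltage squared. A minimal sketch, assuming a ~1.15V stock voltage (an illustrative figure, not a confirmed spec) against the 1.075V stable undervolt mentioned above:

```python
# First-order CMOS model: dynamic power scales with f * V^2.
# The 1.15V "stock" value below is an assumption for illustration;
# 1.075V is the stable undervolt reported in the linked thread.

def relative_dynamic_power(v_new, v_old, f_ratio=1.0):
    """Dynamic power at the new voltage/clock relative to the old."""
    return f_ratio * (v_new / v_old) ** 2

saving = 1.0 - relative_dynamic_power(1.075, 1.15)
print(f"~{saving * 100:.0f}% less dynamic power at the same clock")  # ~13%
```

That kind of margin would explain both the lower temperatures and why an undervolted card can sustain its boost clocks better under the same power limit.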

THE BEAST Motherboard: MSI B350 Tomahawk   CPU: AMD Ryzen 7 1700   GPU: Sapphire R9 290 Tri-X OC  RAM: 16GB G.Skill FlareX DDR4   

 

PSU: Corsair CX650M     Case: Corsair 200R    SSD: Kingston 240GB SSD Plus   HDD: 1TB WD Green Drive and Seagate Barracuda 2TB Media Drive

 

 

 

 


3 hours ago, stconquest said:

That's awesome.  I wonder what they will change.  Will clocking down the RAM be enough...

It's funny, I was thinking about this last night in bed, and the same thought occurred to me.

 

About a week ago or so, pictures of the memory chips on the 480 leaked, and they turned out to be the same chips as on the 1070. Thing is, the chips on the 1070 can clock past 9GHz in many cases, while we're seeing much more conservative memory overclocks on the 480. It's as though they are using different chips, yet the stamp on them is the same.

 

That got me to thinking: maybe AMD originally intended the 480 to have 7GHz memory clocks, and the chips they are using (albeit the same model number) are a variant meant for 7GHz. That wouldn't seem to be the case though, as the 7GHz chips end in 28. Unless of course they are overvolting the 1.35V 7GHz chips and pushing them to 8GHz (because they are cheaper?). It's possible they thought they could use the 1.35V chips and push them to save on board power, but again - total speculation on my part.

 

Here is a screenshot from Samsung's memory catalogue; the memory in question is K4G80325FB-HC25. Notice how the low-voltage HC25 is rated for 7GHz, while the normal-voltage HC25 is rated for 8GHz. The GTX 1070 is most certainly using the 1.5V chips, but I wonder about the 480...

[Screenshot: Samsung memory catalogue entry for the K4G80325FB speed bins]
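For context on what clocking the memory down would cost, those data rates translate directly into peak bandwidth. A quick sketch, assuming the RX 480's 256-bit memory bus (the "7GHz"/"8GHz" figures are effective data rates in Gbps per pin):

```python
# Peak GDDR5 bandwidth = data rate per pin * bus width / 8 bits per byte.

def gddr5_bandwidth_gbs(data_rate_gbps, bus_width_bits=256):
    """Peak memory bandwidth in GB/s."""
    return data_rate_gbps * bus_width_bits / 8

print(gddr5_bandwidth_gbs(8))  # 256.0 GB/s at the stock 8 Gbps
print(gddr5_bandwidth_gbs(7))  # 224.0 GB/s if clocked down to 7 Gbps
```

So a hypothetical driver-side drop from 8 to 7 Gbps would cost 32 GB/s of peak bandwidth, which is why people are wary of a fix that simply lowers the memory clock.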


 


1 hour ago, Kobz360 said:

I think you are right. There are a lot of people on that subreddit getting noticeably better performance out of their 480s by simply undervolting. It's insane. Seems like the cards are heavily overvolted.

 

https://www.reddit.com/r/Amd/comments/4qupw4/super_psa_all_rx480_owners_please_attempt_to/

Can we spread this around? I'd love to see what happens with more cards tested.


17 hours ago, Dabombinable said:

Oh remember that video where they blatantly benchmarked a certain CPU in a manner that was misleading

 

What was misleading about it?

The streaming benchmarks with XSplit? Nothing wrong with those.

XSplit did better on the AMD FX at the time.

 

And the other gaming benchmark videos were legit as well.

When people don't listen to what he says, how he does the benches, and which particular hardware he used, that doesn't make his videos wrong.

The gaming benchmarks were with a 7870 and a 7970 card, if I remember correctly.

So there was nothing misleading about that.

Because the FX-8350 was totally capable of maxing out a 7970.

So it wasn't that strange that you got similar scores on the FX-8350 vs the i5-3570K and so forth.

But yeah, if people are just too lame to understand that, then yeah...

Also, those people who keep bringing those videos up to this day don't understand how the evolution of hardware works at all.

But that's not the TS's fault.


5 minutes ago, Starelementpoke said:

Can we spread this around? I'd love to see what happens with more cards tested.

Is it like how Skylake-based processors are overvolted, and you can lower the voltage AND overclock them anyway?


2 minutes ago, laminutederire said:

Is it like how Skylake-based processors are overvolted, and you can lower the voltage AND overclock them anyway?

Possibly, haven't read too much into it.

