
AMD once again violating power specifications? (AMD RX 480)

Majestic
6 minutes ago, Curufinwe_wins said:

I mean, honestly this doesn't mean a damn thing for people running modern motherboards, but yea it's pretty shitty for a decent bit of the market this card is for.

I was honestly hoping this would've been a perfectly smooth launch, and was convinced it would be, what with the rumors of plentiful stock and all that. Whether this hoopla is warranted, well, only time will tell. AMD will be fine, but damn, this is the last thing they needed.

CPU: Intel Core i7 7820X Cooling: Corsair Hydro Series H110i GTX Mobo: MSI X299 Gaming Pro Carbon AC RAM: Corsair Vengeance LPX DDR4 (3000MHz/16GB 2x8) SSD: 2x Samsung 850 Evo (250/250GB) + Samsung 850 Pro (512GB) GPU: NVidia GeForce GTX 1080 Ti FE (W/ EVGA Hybrid Kit) Case: Corsair Graphite Series 760T (Black) PSU: SeaSonic Platinum Series (860W) Monitor: Acer Predator XB241YU (165Hz / G-Sync) Fan Controller: NZXT Sentry Mix 2 Case Fans: Intake - 2x Noctua NF-A14 iPPC-3000 PWM / Radiator - 2x Noctua NF-A14 iPPC-3000 PWM / Rear Exhaust - 1x Noctua NF-F12 iPPC-3000 PWM


14 minutes ago, Curufinwe_wins said:

I mean, honestly this doesn't mean a damn thing for people running modern motherboards, but yea it's pretty shitty for a decent bit of the market this card is for.

Not all "modern motherboards" are all built to a high (or even moderately) high quality standard. Most are actually rather junk, especially at the low end where people would be shopping if this is the card they intend on buying. Same goes for power supplies. People simply aren't going to spend $200+ and $100+ on a motherboard and PSU if they're buying a $200 graphics card. Remember, roughly a third of your budget should go to a graphics card because that's where you'll get the most perf/$ in your system.



5 minutes ago, Suika said:

What an interesting turn of events. Do I hear WattGate?

Of course you won't hear "WattGate", because of all the AMD defenders.

Although, as @Curufinwe_wins has pointed out a few times, this doesn't mean anything for people running modern motherboards and PSUs, it's still something that should be noted.

Check out my guide on how to scan cover art here!

Local asshole and 6th generation console enthusiast.


6 minutes ago, Suika said:

What an interesting turn of events. Do I hear WattGate?

Probably not the idea AMD had in mind when they came up with this ad campaign. :D

 

[Image: AMD "Polaris Uprising" promotional banner]

R9 3900XT | Tomahawk B550 | Ventus OC RTX 3090 | Photon 1050W | 32GB DDR4 | TUF GT501 Case | Vizio 4K 50'' HDR

 


8 minutes ago, AlwaysFSX said:

Not all "modern motherboards" are all built to a high (or even moderately) high quality standard. Most are actually rather junk, especially at the low end where people would be shopping if this is the card they intend on buying. Same goes for power supplies. People simply aren't going to spend $200+ and $100+ on a motherboard and PSU if they're buying a $200 graphics card. Remember, roughly a third of your budget should go to a graphics card because that's where you'll get the most perf/$ in your system.

Well, all Intel mobos' minimum requirements should allow this to work without issue, and the same goes for the 990 chipset, but some of the others could have issues.

 

 

LINK-> Kurald Galain:  The Night Eternal 

Top 5820k, 980ti SLI Build in the World*

CPU: i7-5820k // GPU: SLI MSI 980ti Gaming 6G // Cooling: Full Custom WC //  Mobo: ASUS X99 Sabertooth // Ram: 32GB Crucial Ballistic Sport // Boot SSD: Samsung 850 EVO 500GB

Mass SSD: Crucial M500 960GB  // PSU: EVGA Supernova 850G2 // Case: Fractal Design Define S Windowed // OS: Windows 10 // Mouse: Razer Naga Chroma // Keyboard: Corsair k70 Cherry MX Reds

Headset: Senn RS185 // Monitor: ASUS PG348Q // Devices: Note 10+ - Surface Book 2 15"

LINK-> Ainulindale: Music of the Ainur 

Prosumer DIY FreeNAS

CPU: Xeon E3-1231v3  // Cooling: Noctua L9x65 //  Mobo: AsRock E3C224D2I // Ram: 16GB Kingston ECC DDR3-1333

HDDs: 4x HGST Deskstar NAS 3TB  // PSU: EVGA 650GQ // Case: Fractal Design Node 304 // OS: FreeNAS

 

 

 


3 minutes ago, Curufinwe_wins said:

Well, all Intel mobos' minimum requirements should allow this to work without issue, and the same goes for the 990 chipset, but some of the others could have issues.

A lot of the time, the minimum requirement is just "won't catch on fire." That doesn't mean you should really push it. Plus, those requirements are more for CPU load specifically, not add-in cards, where boards can really be skimped on. Remember the MSI Krait having significantly worse performance with GPUs than literally every other motherboard?



17 minutes ago, RagnarokDel said:

There's no way in hell that's accurate. On the temperatures alone, this would make the card melt down.

Those are spikes. Total average is 165W.

 

It pulls a lot from the PCIe slot...
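(A minimal Python sketch of the spikes-versus-average distinction; the sample values below are invented for illustration, not Tom's actual measurements.)

# Why a momentary peak and the long-run average can differ so much.
# These instantaneous draws are made-up example values, not measured data.
samples_w = [140, 150, 230, 160, 155, 210, 150, 145]

average_w = sum(samples_w) / len(samples_w)
peak_w = max(samples_w)
print(f"average: {average_w:.0f} W, peak: {peak_w} W")
# The thread's card averages ~165 W overall while individual spikes go far higher.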

On a mote of dust, suspended in a sunbeam


2 minutes ago, Dan Castellaneta said:

Of course you won't hear "WattGate", because of all the AMD defenders.

Although, as @Curufinwe_wins has pointed out a few times, this doesn't mean anything for people running modern motherboards and PSUs, it's still something that should be noted.

The sad thing is that I believe that to be true.

 

I would still argue it's a concern for lower-end or less modern hardware. For example, I was considering pairing this with an LGA775 platform for a LAN machine, but, uh, not anymore lol. It may turn out to be a non-issue, but until consumers on a variety of platforms have run the cards extensively, I don't want to recommend this card yet.

if you have to insist you think for yourself, i'm not going to believe you.


5 minutes ago, AlwaysFSX said:

A lot of the time, the minimum requirement is just "won't catch on fire." That doesn't mean you should really push it. Plus, those requirements are more for CPU load specifically, not add-in cards, where boards can really be skimped on. Remember the MSI Krait having significantly worse performance with GPUs than literally every other motherboard?

I do, and those specifications also add requirements for GPU load. For example, any Z97/Z170 ATX board is required to be able to pass at least 200W through the PCIe slots (same with the 990 chipset).
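(A rough headroom check in Python, using the 200W figure claimed above and the ~86W average slot draw Tom's Hardware measured; both numbers come from this thread, not from my own testing.)

# How many RX 480s at the measured ~86 W average slot draw fit under a
# claimed 200 W total PCIe-slot budget? Both figures are from this thread.
claimed_slot_budget_w = 200   # poster's figure for Z97/Z170 ATX boards
measured_slot_draw_w = 86     # Tom's Hardware average slot draw per card

cards = claimed_slot_budget_w // measured_slot_draw_w
spare_w = claimed_slot_budget_w - cards * measured_slot_draw_w
print(f"{cards} card(s) fit, {spare_w} W to spare")
# -> 2 card(s) fit, 28 W to spare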


1 hour ago, DevilishBooster said:

Well, I have never claimed to be a GPU expert, but everything that I have read on the Vortex says they are full power 980s. Here is a pic so you can let me know what you think, and I put the link to the article that image is from below. I'll try to find out more information. I'm not saying you're wrong, I'm just telling you what I have read/know. The point still stands that they are sending waaaaaaay more than 75W through the PCIe/MXM interface.

 

http://www.gizmodo.com.au/2016/06/msi-vortex-g65-sli-gaming-pc-australian-review/

They are indeed full-power 980s. From what was said a while back when they were released, they are binned differently but are not cut down at all. My point, however, is that these MXM modules receive power differently. The board itself is likely designed to deliver additional power to these modules. I have no proof of this, simply because I do not own the MSI Vortex, but it would not be the first time this has been done. These are completely custom, in-house fabrications. There is a proof-of-concept board from Colorful floating around with a desktop GTX 1070 integrated into the board itself (not socketed), which likewise probably receives power through the board in some way.

 

I just can't see this being a valid comparison to prove a point. If anything, you are better off using dual-GPU graphics cards to make your point: they have two GPUs on a single PCB, in a single slot, yet work fine. However, that point will be countered with "they have additional PCIe connectors to power those GPUs."

 

At this point, the only option AMD has is a potential BIOS fix, if there is indeed a problem. If AMD is telling the truth and it's just bad review samples, then there should be no worry at all. Only time will tell, though.

My (incomplete) memory overclocking guide: 

 

Does memory speed impact gaming performance? Click here to find out!

On 1/2/2017 at 9:32 PM, MageTank said:

Sometimes, we all need a little inspiration.

 

 

 


7 hours ago, Majestic said:

Because it peaks at 155W over a bus designed for 75W.

Come again?

Quote

AMD’s Radeon RX 480 draws an average of 164W, which exceeds the company's target TDP. And it gets worse. The load distribution works out in a way that has the card draw 86W through the motherboard’s PCIe slot. Not only does this exceed the 75W ceiling we typically associate with a 16-lane slot, but that 75W limit covers several rails combined, not just this one interface.

Italics are mine.

 

Edit: Actually, I see where you're coming from. I took another look at their graphs, and it does indeed appear to peak at 155W, which is pathetic.
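(A quick back-of-the-envelope check in Python of the Tom's Hardware figures quoted above; just the arithmetic, nothing more.)

# Arithmetic on the quoted Tom's Hardware averages.
total_avg_w = 164   # average total board power
slot_avg_w = 86     # average drawn through the PCIe slot
slot_limit_w = 75   # ceiling usually associated with a x16 slot

slot_share = slot_avg_w / total_avg_w
overage_w = slot_avg_w - slot_limit_w
print(f"the slot carries {slot_share:.0%} of the load, "
      f"{overage_w} W ({overage_w / slot_limit_w:.0%}) over the 75 W ceiling")
# -> the slot carries 52% of the load, 11 W (15%) over the 75 W ceiling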

"It pays to keep an open mind, but not so open your brain falls out." - Carl Sagan.

"I can explain it to you, but I can't understand it for you" - Edward I. Koch


I'm not sure if it has been mentioned, but I'm hearing a lot of poo-pooing about the 480 not being as efficient as the 1070 and only being on par with the 970. If memory serves, doesn't the 480 still have a hardware scheduler while the 1070 doesn't? That's a big reason why the 900 series had its big upswing in efficiency in the first place. If and when those scheduler differences come into focus, will the 1070 be seen to stumble the way the 900s did? And if AMD had removed the scheduler, what would its efficiency look like compared to the 1000 series?

 

On the power draw issue, I'm not sure it's really an issue yet, until we know where the cause is and how big an impact it has. I'm ignorant of the process by which they peg the wattage drawn from the PCIe slot versus the 6-pin, but aside from the massive spikes, the amount of power going through the motherboard does not necessarily dictate the amount being used by the GPU, especially if the power circuitry is heated or overtaxed.

 

I look forward to seeing the explanation, because it could be a considerable issue. Although I hope it isn't.
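(For reference, the Reddit measurement quoted later in the thread splits the feeds by clamping the 12V current on a riser and on the 6-pin cable, then multiplying by the rail voltage. A minimal Python sketch of that arithmetic, with assumed example readings rather than anyone's published data:)

# P = V * I on each 12 V feed, measured separately via a riser and clamp.
# The current readings below are assumed example values, not real data.
RAIL_V = 12.0

slot_current_a = 7.2     # hypothetical clamp reading on the slot's 12 V pins
six_pin_current_a = 6.9  # hypothetical clamp reading on the 6-pin cable

slot_w = RAIL_V * slot_current_a
six_pin_w = RAIL_V * six_pin_current_a
print(f"slot: {slot_w:.0f} W, 6-pin: {six_pin_w:.0f} W, "
      f"total: {slot_w + six_pin_w:.0f} W")
# As noted above, this is what flows through the board and cable, which is
# not automatically the same as what the GPU silicon itself consumes.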


Quote

 

With Tom's Hardware reporting that the RX 480 draws (substantially) more than the 75W allowed from the motherboard (for example, the PCI Express high-power card spec allows a maximum of 66W to be drawn from the 12V pins of the PCI Express slot, and the RX 480 averages 79W from the 12V lines alone), AMD seems to be violating the PCI Express® spec. Of course, I'd love to see HardOCP try to duplicate Tom's results.

According to the licensing contract for the spec, if they do not fix this within 3 months, AMD will NOT be able to call the card a PCI Express card. If they do, they face not only litigation but, if my understanding is correct, an action before the U.S. International Trade Commission (ITC) to ban the importation of the card as counterfeit goods. You might think the PCI-SIG will give AMD a pass, but if they do, they risk losing the trademark entirely. An unenforced trademark gets invalidated. The SIG won't let that happen.

So what does this mean for the consumer? I think there are two possibilities, if we assume AMD will not choose to remove the PCI Express logos from these cards: either they will alter the boards to have an 8-pin socket and pull more power from there, or they will neuter the card to ensure it doesn't draw more power than the PCI Express specification allows. I don't see any other options.

Disclaimer: I am an attorney, but I practice patent law, not trademark law. This post does not constitute legal advice and does not create an attorney-client relationship.

 

https://hardforum.com/threads/amd-radeon-rx-480-video-card-review-h.1903637/page-3#post-1042386067
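(Quick arithmetic in Python on the 12V figures in the quote above; the 66W spec allowance and 79W measurement are taken as quoted, not independently verified.)

# Convert the quoted 12 V numbers to currents and an overage percentage.
slot_12v_limit_w = 66    # quoted slot allowance on the 12 V pins
measured_12v_avg_w = 79  # quoted RX 480 average on those 12 V lines

limit_a = slot_12v_limit_w / 12
measured_a = measured_12v_avg_w / 12
over_pct = measured_12v_avg_w / slot_12v_limit_w - 1
print(f"limit ~{limit_a:.1f} A, measured ~{measured_a:.1f} A, {over_pct:.0%} over")
# -> limit ~5.5 A, measured ~6.6 A, 20% over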

 



Just saw this in Paul's Hardware's 480 review: pretty bad system power draw on the overclocked 480 for the performance, and those are overclocked 970 and 1070 cards at the bottom (both of which have more than just a single 6-pin connector).

 

[Image: "480 power draw" comparison chart]


 


And suddenly you've got to wonder which card AMD submitted for PCI-SIG certification... the one [H] said couldn't pass 800MHz, or the retail unit xD

How the truth keeps rushing to the surface.

 

Fuck 'em! They did it with their own two hands.


The savior of AMD? Looks like not. The 970 is close on price, and this thing eats power like crazy.

Main Gaming PC - i9 10850k @ 5GHz - EVGA XC Ultra 2080ti with Heatkiller 4 - Asrock Z490 Taichi - Corsair H115i - 32GB GSkill Ripjaws V 3600 CL16 OC'd to 3733 - HX850i - Samsung NVME 256GB SSD - Samsung 3.2TB PCIe 8x Enterprise NVMe - Toshiba 3TB 7200RPM HD - Lian Li Air

 

Proxmox Server - i7 8700k @ 4.5Ghz - 32GB EVGA 3000 CL15 OC'd to 3200 - Asus Strix Z370-E Gaming - Oracle F80 800GB Enterprise SSD, LSI SAS running 3 4TB and 2 6TB (Both Raid Z0), Samsung 840Pro 120GB - Phanteks Enthoo Pro

 

Super Server - i9 7980Xe @ 4.5GHz - 64GB 3200MHz Cl16 - Asrock X299 Professional - Nvidia Tesla K20 - Sandisk 512GB Enterprise SATA SSD, 128GB Seagate SATA SSD, 1.5TB WD Green (Over 9 years of power on time) - Phanteks Enthoo Pro 2

 

Laptop - 2019 Macbook Pro 16" - i7 - 16GB - 512GB - 5500M 8GB - Thermal Pads and Graphite Tape modded

 

Smart Phones - iPhone X - 64GB, AT&T, iOS 13.3 iPhone 6 : 16gb, AT&T, iOS 12 iPhone 4 : 16gb, AT&T Go Phone, iOS 7.1.1 Jailbroken. iPhone 3G : 8gb, AT&T Go Phone, iOS 4.2.1 Jailbroken.

 


Hardware.fr finds that by removing the power and temperature limits, the RX 480 will draw 192W in The Witcher 3 :o

http://www.hardware.fr/articles/951-9/consommation-efficacite-energetique.html

Quote

On the other hand, in Uber mode, that is, when we raise the power and fan-speed limits, the GPU climbs to its maximum frequency in every game, and the heaviest games then draw the most. To maintain 1266 MHz in The Witcher 3, the RX 480 needs 192W.
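(For scale, a quick comparison in Python against the card's nominal power inputs, assuming the usual 75W slot plus 75W 6-pin budget:)

# Hardware.fr's "Uber" figure versus the nominal 150 W input budget
# (75 W from the slot plus 75 W from the single 6-pin connector).
slot_spec_w = 75
six_pin_spec_w = 75
uber_draw_w = 192  # The Witcher 3 at a sustained 1266 MHz, limits raised

budget_w = slot_spec_w + six_pin_spec_w
over_w = uber_draw_w - budget_w
print(f"{over_w} W ({over_w / budget_w:.0%}) beyond the combined {budget_w} W budget")
# -> 42 W (28%) beyond the combined 150 W budget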

 


Yeah, so much for being power efficient: slightly higher system power draw than a 1070, stock for stock. So much for 14nm and 2.8x perf/watt.

 

[Image: system power draw comparison chart]


 


I thought that a smaller manufacturing process meant less power usage?

 

I hope this turns out to be a non-issue, as I would really like to not buy another Nvidia card down the road.

Ketchup is better than mustard.

GUI is better than Command Line Interface.

Dubs are better than subs.


8 hours ago, ace_cheaply said:

Where are you getting that it draws 155W over a bus designed for 75?

When they tried overclocking it. To me, it seems like a firmware issue.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


And who was it that told you people that Samsung has no experience with high-power processes, and that GloFo being bad at efficiency on top of that would result in a lackluster product the first time around? That's right: it was me.



33 minutes ago, Trik'Stari said:

I hope this turns out to be a non-issue, as I would really like to not buy another Nvidia card down the road.

It has already been confirmed that retail cards exhibit the same behavior; the non-issue just turned into a possible legal issue, at most.

PCI-SIG could withdraw PCIe certification from AMD; that would mean disaster for AMD in its relations with OEMs and AIB partners.

 

---

 

Quote

Purchased a Sapphire 8GB RX 480 today. After reading up about this issue, I decided to test for myself. I rigged up a riser to be able to measure 12V current with an amp clamp on both the PCIe slot and the 6-pin connector.
This isn't anywhere near scientific, but I think it's accurate enough to confirm the problem. Running stock clocks with stock voltage while mining Ethereum = 83W from the 6-pin connector and 88W from the PCIe slot. That's a violation of both the ATX and PCI specs. I don't particularly mind it violating the ATX spec, as a quality 6-pin connector can provide 200W without issue. The PCIe slot, on the other hand, is an issue. I bought 4 of these cards today, and intend (intended?) to set them up on a Rampage 5 motherboard. I don't think even a top-end motherboard like that will be able to supply 352W to the PCIe slots, even using the 4-pin Molex.

https://www.reddit.com/r/Amd/comments/4qfwd4/rx480_fails_pcie_specification/d4tfjaz
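(The 352W figure in that post is just the measured per-card slot draw scaled to four cards; the same arithmetic in Python, alongside what four slots are specified to deliver:)

# Scale the quoted per-card readings to the four-card Rampage plan.
measured_slot_w = 88      # amp-clamp reading per card, from the quote
measured_six_pin_w = 83   # per-card 6-pin reading, from the quote
slot_spec_w = 75          # per-slot PCIe budget
cards = 4

total_slot_w = measured_slot_w * cards
total_spec_w = slot_spec_w * cards
total_six_pin_w = measured_six_pin_w * cards
print(f"four cards: {total_slot_w} W through the slots "
      f"vs. {total_spec_w} W of per-slot spec budget")
print(f"plus {total_six_pin_w} W through the 6-pin cables")
# -> four cards: 352 W through the slots vs. 300 W of per-slot spec budget
# -> plus 332 W through the 6-pin cables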


1 minute ago, patrickjp93 said:

And who was it that told you people that Samsung has no experience with high-power processes, and that GloFo being bad at efficiency combined with that would result in a lackluster product? That's right: it was me.

...and you are also the one who believed this RX 480 would perform at Fury level. I like you, but give it a rest. :D

