
Vega 56 undervolting can beat GTX 1080 in benchmarks.

Dionyz
Quote

Let us now come to the results:

For the Radeon RX Vega 64, we were able to reduce the voltage from 1,150 or 1,200 mV to 1,110 mV and still landed at 1,538 MHz on the clock. This means a reduction of the voltage by about 8%. At the same time, however, we were able to increase the clock by almost 10% in extreme cases. Thus a corresponding increase in performance can be seen across a large number of games in which the boost clock had not been very high before. We have tested this with several games; the corresponding charts follow.

The undervolting potential of the Radeon RX Vega 56 is somewhat higher. Here we were able to reduce the GPU voltage from 1,200 mV to 1,070 mV (-12%) and maintain the clock at 1,613 MHz, an increase of almost 25% in extreme cases. At times the Radeon RX Vega 56 dropped its clock rate to 1,300 MHz in our tests, so the 1,613 MHz achieved is a very good result. This is also reflected in the reduction of the power consumption by 73 W.
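
For anyone who wants to check the arithmetic, here is a minimal Python sketch using the figures above. Note the ~1,400 MHz Vega 64 baseline is back-calculated from the "almost 10%" claim and is my assumption, not a number from the article:

```python
def pct_change(old: float, new: float) -> float:
    """Percentage change from old to new."""
    return (new - old) / old * 100.0

# Vega 64: 1,200 mV -> 1,110 mV, clock landing at 1,538 MHz
print(f"Vega 64 voltage: {pct_change(1200, 1110):+.1f}%")  # -7.5%, the quoted "8%"
print(f"Vega 64 clock:   {pct_change(1400, 1538):+.1f}%")  # +9.9%, "almost 10%" (1,400 MHz base assumed)

# Vega 56: 1,200 mV -> 1,070 mV, worst-case stock clock 1,300 MHz -> 1,613 MHz
print(f"Vega 56 voltage: {pct_change(1200, 1070):+.1f}%")  # -10.8%, which the article quotes as -12%
print(f"Vega 56 clock:   {pct_change(1300, 1613):+.1f}%")  # +24.1%, "almost 25%"
```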

Saving power and getting more performance at the same time sounds like a good deal.

 

Source: https://www.hardwareluxx.de/index.php/artikel/hardware/grafikkarten/44084-amd-radeon-rx-vega-56-und-vega-64-im-undervolting-test.html


I wonder what the point of that USB monitoring port was when they were just going to overvolt it anyway.

hello!

is it me you're looking for?

ᴾC SᴾeCS ᴰoWᴺ ᴮEᴸoW

Spoiler

Desktop: X99-PC

CPU: i7 5820k

Mobo: X99 Deluxe

Cooler: Dark Rock Pro 3

RAM: 32GB DDR4
GPU: GTX 1080

Storage: 1TB 850 Evo, 1TB HDD, bunch of external hard drives
PSU: EVGA G2 750W

Peripherals: Logitech G502, Ducky One 711

Audio: Xonar U7, O2 amplifier (RIP), HD6XX

Monitors: 4k 24" Dell monitor, 1080p 24" Asus monitor

 

Laptop:

-Overkill Dell XPS

Fully maxed out early 2017 Dell XPS 15, GTX 1050 4GB, 7700HQ, 1TB NVMe SSD, 32GB RAM, 4K display. 97Whr battery :x 
Dell was having a $600 off sale for the fully specced out model, so I decided to get it :P

 

-Crapbook

Fully specced out early 2013 MacBook "pro" with a GT 650M and a constant 105°C CPU temperature (GPU is 80-90°C) when doing anything intensive...

A 2013 laptop with a regular-sized battery still has better battery life than a 2017 laptop with a massive battery! I think this is a testament to Apple's ability at making laptops, or maybe to how little CPU technology has improved even 4+ years later (at least until the recent introduction of 15W 4-core CPUs). Anyway, I'm never going to get a 35W-CPU laptop again unless battery technology becomes ~5x better than it is in 2018.

Apple knows how to make proper consumer-grade laptops (they don't know how to make pro laptops, though). I guess this is mostly related to software power efficiency, but getting a Mac makes perfect sense if you want a portable/powerful laptop that can do anything you want it to with great battery life.

 

 


3 minutes ago, AresKrieger said:

Simple, it isn't reliably stable.

Indeed. It probably has to do with the silicon lottery too, just like overclocking.

CPU: Ryzen 7 5800X | MOBO: Gigabyte B550 Vision D | RAM: Crucial Ballistix RGB 32GB 3600MHz | GPU: Gigabyte RTX 3070 Vision D | PSU: Seasonic Focus+ Gold 750W


I'm going to go with "AMD would have done it" if it was actually a good idea.

 

And just go buy an NV card; then you get good shit out of the box and don't have to fuck around trying to improve on AMD's disaster.

Workstation:  13700K @ 5.5GHz || Gigabyte Z790 Ultra || MSI Gaming Trio 4090 Shunt || TeamGroup DDR5-7800 @ 7000 || Corsair AX1500i@240V || whole-house loop.

LANRig/GuestGamingBox: 9900nonK || Gigabyte Z390 Master || ASUS TUF 3090 650W shunt || Corsair SF600 || CPU+GPU watercooled 280 rad pull only || whole-house loop.

Server Router (Untangle): 13600k @ Stock || ASRock Z690 ITX || All 10Gbe || 2x8GB 3200 || PicoPSU 150W 24pin + AX1200i on CPU|| whole-house loop

Server Compute/Storage: 10850K @ 5.1GHz || Gigabyte Z490 Ultra || EVGA FTW3 3090 1000W || LSI 9280i-24 port || 4TB Samsung 860 Evo, 5x10TB Seagate Enterprise Raid 6, 4x8TB Seagate Archive Backup || whole-house loop.

Laptop: HP Elitebook 840 G8 (Intel 1185G7) + 3080Ti Thunderbolt Dock, Razer Blade Stealth 13" 2017 (Intel 8550U)


7 minutes ago, AnonymousGuy said:

I'm going to go with "AMD would have done it" if it was actually a good idea.

 

And just go buy an NV card; then you get good shit out of the box and don't have to fuck around trying to improve on AMD's disaster.

Agreed; undervolting and overclocking are for people who enjoy doing that as much as using the card. It's called the silicon lottery for a reason.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


40 minutes ago, Dionyz said:

Saving power and getting more performance at the same time sounds like a good deal. I am baffled that AMD didn't do this for all cards.

More than likely the voltages were set higher than necessary to ensure all cards would be stable; you wouldn't be able to undervolt all cards to the same level, just like you can't overclock them all as much.
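
A toy sketch of that guard-band logic (every number below is invented purely for illustration): if each die's minimum stable voltage varies, the factory has to ship a single voltage that covers nearly all of them, so a typical die ends up with undervolting headroom.

```python
import random

random.seed(42)

# Hypothetical spread of minimum stable voltages across 100,000 dies (mV);
# the mean and sigma are made up for illustration only.
chips = sorted(random.gauss(1060, 40) for _ in range(100_000))

factory_v = chips[int(0.999 * len(chips))]  # one voltage that covers 99.9% of dies
median_v = chips[len(chips) // 2]           # what a typical die actually needs

print(f"factory voltage:   {factory_v:.0f} mV")
print(f"typical die needs: {median_v:.0f} mV")
print(f"typical headroom:  {factory_v - median_v:.0f} mV")
```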


14 minutes ago, AnonymousGuy said:

I'm going to go with "AMD would have done it" if it was actually a good idea.

 

And just go buy an NV card; then you get good shit out of the box and don't have to fuck around trying to improve on AMD's disaster.

I mean, beating a GTX 1080 when it was at GTX 1070 level is pretty damn impressive. I quite honestly believe they could do this on a bigger scale. I would say only a few cards wouldn't be able to undervolt.


9 minutes ago, Dionyz said:

I mean, beating a GTX 1080 when it was at GTX 1070 level is pretty damn impressive. I quite honestly believe they could do this on a bigger scale. I would say only a few cards wouldn't be able to undervolt.

Polaris & Vega are functionally factory overclocked, in both voltage & clocks, to keep up with Nvidia. GCN's later updates have been rough for AMD simply because the Maxwell/Pascal range has been an all-timer. (The GTX 970 will end up being one of the best-selling cards of all time, it seems.) So if you get a card that undervolts well, you can get much better performance than you paid for.


51 minutes ago, AnonymousGuy said:

I'm going to go with "AMD would have done it" if it was actually a good idea.

 

And just go buy an NV card; then you get good shit out of the box and don't have to fuck around trying to improve on AMD's disaster.

I agree.

 

I do not understand how they did so well with Ryzen and Threadripper, then completely dropped the fucking ball and started eating glue with Vega. Seriously, someone needs to get fired from that division, because he's a fucking moron.

 

Hell, wasn't "make a card that can beat the 1080 when crossfired, so that two of that card cost less than a 1080" an option? I'd prefer that option over "not really any better performance reliably, for a similar price point".

 

Find a way to price it so three of that card beat a 1080ti but altogether cost less than the 1080ti. That's a better option than this.

Ketchup is better than mustard.

GUI is better than Command Line Interface.

Dubs are better than subs


5 minutes ago, Trik'Stari said:

I agree.

 

I do not understand how they did so well with Ryzen and Threadripper, then completely dropped the fucking ball and started eating glue with Vega. Seriously, someone needs to get fired from that division, because he's a fucking moron.

 

Hell, wasn't "make a card that can beat the 1080 when crossfired, so that two of that card cost less than a 1080" an option? I'd prefer that option over "not really any better performance reliably, for a similar price point".

 

Find a way to price it so three of that card beat a 1080ti but altogether cost less than the 1080ti. That's a better option than this.

Vega is a transitional uArch. It also seems that the drivers are something of a disaster; all of the new stuff is basically turned off. So it was always going to be a rough situation.

 

The other issue is that AMD isn't abandoning the gaming market, but it's pretty clear they're going to take their amazing compute performance and break into Nvidia's Tesla market. That's actually the point of the Vega design right now. It's very much like the Zen package, though they can't stack multiple dies together.

 

It's the reason the Vega SSG is far & away the most interesting part of the product stack. Yes, the $7000 card.


1 hour ago, AnonymousGuy said:

I'm going to go with "AMD would have done it" if it was actually a good idea.

 

And just go buy an NV card; then you get good shit out of the box and don't have to fuck around trying to improve on AMD's disaster.

I was holding out for the Vega 56 launch and really wanted to believe that AMD would bring a great card to the party. Sadly, it seems that I'll be picking up a GTX 1070 instead, as I'm not one of those people who like to fuck around with a card.

 

Had they done this a year ago, my opinion might be a bit different. It really is a shame considering how good the Ryzen CPUs were vs. expectations.


Wish I could afford a Vega 56, even if I wouldn't mess around with the voltages.

The only reason I'm here is that I have homework that I don't want to do

 

PC Specs: CPU: Intel Celeron N3060 | GPU: Intel HD Graphics 400 | RAM: 2GB | Storage: 16GB

 

 

It took me half an hour to find where to change my signature :(


32 minutes ago, Trik'Stari said:

I agree.

 

I do not understand how they did so well with Ryzen and Threadripper, then completely dropped the fucking ball and started eating glue with Vega. Seriously, someone needs to get fired from that division, because he's a fucking moron.

 

Hell, wasn't "make a card that can beat the 1080 when crossfired, so that two of that card cost less than a 1080" an option? I'd prefer that option over "not really any better performance reliably, for a similar price point".

 

Find a way to price it so three of that card beat a 1080ti but altogether cost less than the 1080ti. That's a better option than this.

 

24 minutes ago, Taf the Ghost said:

Vega is a transitional uArch. It also seems that the drivers are something of a disaster; all of the new stuff is basically turned off. So it was always going to be a rough situation.

 

The other issue is that AMD isn't abandoning the gaming market, but it's pretty clear they're going to take their amazing compute performance and break into Nvidia's Tesla market. That's actually the point of the Vega design right now. It's very much like the Zen package, though they can't stack multiple dies together.

 

It's the reason the Vega SSG is far & away the most interesting part of the product stack. Yes, the $7000 card.

 

I still think the problem with Vega is GCN. It started out going great guns, but over time it has become less and less able to keep up. It seems to me to be an architectural foundation that just doesn't lend itself to long-term improvement, with every generation being harder and taking longer to improve upon. AMD can't afford to start again from scratch, but if Navi turns out to be just as bad then they may have no choice.

 

EDIT: talking purely from a gaming perspective here.   Compute is a different kettle of fish.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


1 hour ago, mr moose said:

 

 

I still think the problem with Vega is GCN. It started out going great guns, but over time it has become less and less able to keep up. It seems to me to be an architectural foundation that just doesn't lend itself to long-term improvement, with every generation being harder and taking longer to improve upon. AMD can't afford to start again from scratch, but if Navi turns out to be just as bad then they may have no choice.

 

EDIT: talking purely from a gaming perspective here.   Compute is a different kettle of fish.

Yeah, with Navi having AI on-chip, I would expect Navi to be a monster compute card.

 

The difference between Nvidia & RTG at this stage seems to be that GCN is designed as a scalable architecture that does everything, whereas Nvidia's approach has a far more optimised pipeline depending on the application, i.e. the GTX 970 is all about putting pixels on a screen as quickly as possible.


So some random reviewer managed to get a golden sample chip, overclocked it like crazy, then compared it against a stock Nvidia card?

Yeah... Sounds like a fair comparison. 9_9

 

 

57 minutes ago, Belgarathian said:

Yeah, with Navi having AI on-chip, I would expect Navi to be a monster compute card.

 

The difference between Nvidia & RTG at this stage seems to be that GCN is designed as a scalable architecture that does everything, whereas Nvidia's approach has a far more optimised pipeline depending on the application, i.e. the GTX 970 is all about putting pixels on a screen as quickly as possible.

Navi having "hardware accelerated AI" doesn't actually mean much.

I mean, the Snapdragon 835 has that too, and I am not hearing anyone shouting that it will be a monster compute chip.

It might be great, but it might also turn out to just be mostly marketing buzzwords.


1 minute ago, LAwLz said:

So some random reviewer managed to get a golden sample chip, overclocked it like crazy, then compared it against a stock Nvidia card?

Yeah... Sounds like a fair comparison. 9_9

 

 

Navi having "hardware accelerated AI" doesn't actually mean much.

I mean, the Snapdragon 835 has that too, and I am not hearing anyone shouting that it will be a monster compute chip.

It might be great, but it might also turn out to just be mostly marketing buzzwords.

Tbh, he said that undervolting the chip is very complicated, so I don't think many reviewers tried.


So if you do a questionable and maybe even dangerous undervolt/overclock of your card, which costs as much as a 1080 at this time, it might run a bit faster but still far hotter than a stock 1080?

 

Not impressed in the slightest: AMD enthusiasts are in denial mode. For more on that, see Level1Techs talking about Vega while completely sidestepping and dismissing the stock and pricing issues.

 

The card was not ready, not even with the delays. It was probably a matter of "just fucking release it as is now, or you might end up having to scrap the entire Vega lineup".

 

 

 

-------

Current Rig

-------


Aaaaaaand goodbye Vega stock, forever.

Thank you, miners.

On a mote of dust, suspended in a sunbeam


AMD, where overclocking is spelled undervolting and is used to reduce the internal fusion to sane levels.

Where all you need to do to break even is limit your chip's nutrition (energy) while pushing it just as hard, or HARDER (still overclocking).

 

The end result is a card that is on even ground, while still needing more energy, but less... more... energy.

 

Uhm, no thanks.

Just fire the guys doing Vega, ask what the Ryzen guys did, and copy that for GPUs next time.


Makes sense that undervolting gets better performance: less voltage means less heat, which allows the card to maintain higher clocks for longer.
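
That intuition can be made concrete with the usual dynamic-power approximation, P ≈ C·f·V². A minimal sketch using the article's Vega 56 numbers (treating board power as purely dynamic switching power is a simplification):

```python
def relative_power(v_old: float, v_new: float, f_old: float, f_new: float) -> float:
    """Ratio of new dynamic power to old, assuming P ~ f * V^2."""
    return (f_new / f_old) * (v_new / v_old) ** 2

# Same 1,300 MHz clock, undervolted 1,200 -> 1,070 mV: ~20% less heat.
print(f"undervolt only:         {relative_power(1200, 1070, 1300, 1300):.2f}x")
# Undervolted and boosting to 1,613 MHz: ~24% more clock for roughly the same power.
print(f"undervolt + high boost: {relative_power(1200, 1070, 1300, 1613):.2f}x")
```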

 

I undervolt my Fury to keep temps down; it doesn't help with overclocks, though.

 

----Ryzen R9 5900X----X570 Aorus elite----Vetroo V5----240GB Kingston HyperX 3k----Samsung 250GB EVO840----512GB Kingston Nvme----3TB Seagate----4TB Western Digital Green----8TB Seagate----32GB Patriot Viper 4 3200Mhz CL 16 ----Power Color Red dragon 5700XT----Fractal Design R4 Black Pearl ----Corsair RM850w----


9 hours ago, Trik'Stari said:

 

I do not understand how they did so well with Ryzen and Threadripper, then completely dropped the fucking ball and started eating glue with Vega. Seriously, someone needs to get fired from that division, because he's a fucking moron.

The guy who dreamt up Vega has been removed/fired. However, by the time they brought Koduri back in, it was too late to do a complete redesign, so he salvaged it as best he could. Navi will be Koduri's brainchild alone, so it should be interesting to see what happens.


6 hours ago, LAwLz said:

So some random reviewer managed to get a golden sample chip, overclocked it like crazy, then compared it against a stock Nvidia card?

Yeah... Sounds like a fair comparison. 9_9

Yeah, that's what I thought. There is no way that is consistent. It's like getting a monster 980 Ti at 1,450MHz and comparing it to a 1080 FE that is not boosting properly because of temps. Nowhere near a real use-case comparison.

 

Vega was bad, so bad. Man, I was hoping it would bring 1070 performance at lower prices, so I could MAYBE buy it in my country :/

Ultra is stupid. ALWAYS.


13 minutes ago, ravenshrike said:

The guy who dreamt up Vega has been removed/fired. However, by the time they brought Koduri back in, it was too late to do a complete redesign, so he salvaged it as best he could. Navi will be Koduri's brainchild alone, so it should be interesting to see what happens.

Please... No more hyping up future AMD products.

How many times do people need to get burned before they stop?

 

Even if we say that Raja is a super genius, designing a GPU is a massive undertaking which requires many people. One person getting replaced won't make a company go from making bad cards to making great cards.

