
Vega FE Hybrid Mod - Passing the 1080 but 400W+ Power Draw

3 minutes ago, Hunter259 said:

If it's drivers then that is a switch. From my knowledge, rasterization is done in hardware with no driver interaction. If for some reason you can just switch it on and off in drivers that is quite strange. 

The drivers don't make use of it; it's something you need to implement in them, AFAIK. Right now it's basically running the Fury drivers, which probably can't make use of the rasterization implementation in Vega.

I spent $2500 on building my PC and all i do with it is play no games atm & watch anime at 1080p(finally) watch YT and write essays...  nothing, it just sits there collecting dust...

Builds:

The Toaster Project! Northern Bee!

 

The original LAN PC build log! (Old, dead and replaced by The Toaster Project & 5.0)


"Here is some advice that might have gotten lost somewhere along the way in your life. 

 

#1. Treat others as you would like to be treated.

#2. It's best to keep your mouth shut; and appear to be stupid, rather than open it and remove all doubt.

#3. There is nothing "wrong" with being wrong. Learning from a mistake can be more valuable than not making one in the first place.

 

Follow these simple rules in life, and I promise you, things magically get easier. " - MageTank 31-10-2016

 

 


5 minutes ago, tom_w141 said:

Despite popular opinion, it matters, because I don't like excess heat: I sit right by my PC and am not comfortable running components above 80 deg. On the topic of power, well, I pay the bills, so less power draw please xD (unless performance is literally insane, then I'd overlook it)

For me both are pretty negligible, thus I never cared. I once had 2x 6990s and a 9590, with a 1200W PSU and the heat output to match.

CPU: Amd 7800X3D | GPU: AMD 7900XTX


48 minutes ago, goodtofufriday said:

For me both are pretty negligible, thus I never cared. I once had 2x 6990s and a 9590, with a 1200W PSU and the heat output to match.

You are thinking like a fanboy and not as a businessman. On a residential scale, AMD's extra power consumption may be negligible, but for any enterprise customer it matters a lot. AMD won't be able to compete in those markets, which means they'll generate far less revenue. In addition, their profit margins on Vega will be lower than Nvidia's because the die is larger, which makes it more expensive to manufacture. The end result is less money to improve their products in the future. And that will hurt you as a residential customer, which is the problem right now: Vega can't beat Nvidia's top flagship GPU, just like the Fury X couldn't beat the 980 Ti.
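The die-size half of this argument can be sanity-checked with the standard dies-per-wafer approximation. A quick sketch; the wafer cost and die areas below are hypothetical placeholders, not actual AMD or Nvidia figures:

```python
import math

def dies_per_wafer(wafer_diameter_mm: float, die_area_mm2: float) -> int:
    """Approximate gross dies per wafer using the common estimate
    pi*(d/2)^2 / A  -  pi*d / sqrt(2*A)
    (the second term accounts for partial dies lost at the wafer edge;
    defect yield is ignored here)."""
    d, a = wafer_diameter_mm, die_area_mm2
    return int(math.pi * (d / 2) ** 2 / a - math.pi * d / math.sqrt(2 * a))

# Hypothetical: a 300 mm wafer costing $6000, a larger die vs a smaller one.
wafer_cost = 6000.0
big = dies_per_wafer(300, 480)    # larger die -> fewer chips per wafer
small = dies_per_wafer(300, 315)
print(big, small)
print(round(wafer_cost / big, 2), round(wafer_cost / small, 2))
```

Fewer dies per wafer means a higher cost per chip before defect yield is even considered, which is the margin point being made above.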


2 minutes ago, RagnarokDel said:

What does a 1080 overclocked balls to the wall pull?

1080 Ti is around 330-350W. So 1080 would be the same or less. 


3 minutes ago, bomerr said:

1080 Ti is around 330-350W. So 1080 would be the same or less. 

Quite a bit less. 

http://www.tomshardware.com/reviews/nvidia-geforce-gtx-1080-pascal,4572-10.html

 

http://www.tomshardware.com/reviews/nvidia-geforce-gtx-1080-ti,4972-6.html

 

And no, a 1080 Ti pulls a LOT less than 330W. Overclocked it might hit 300.

Main Gaming PC - i9 10850k @ 5GHz - EVGA XC Ultra 2080ti with Heatkiller 4 - Asrock Z490 Taichi - Corsair H115i - 32GB GSkill Ripjaws V 3600 CL16 OC'd to 3733 - HX850i - Samsung NVME 256GB SSD - Samsung 3.2TB PCIe 8x Enterprise NVMe - Toshiba 3TB 7200RPM HD - Lian Li Air

 

Proxmox Server - i7 8700k @ 4.5Ghz - 32GB EVGA 3000 CL15 OC'd to 3200 - Asus Strix Z370-E Gaming - Oracle F80 800GB Enterprise SSD, LSI SAS running 3 4TB and 2 6TB (Both Raid Z0), Samsung 840Pro 120GB - Phanteks Enthoo Pro

 

Super Server - i9 7980Xe @ 4.5GHz - 64GB 3200MHz Cl16 - Asrock X299 Professional - Nvidia Telsa K20 -Sandisk 512GB Enterprise SATA SSD, 128GB Seagate SATA SSD, 1.5TB WD Green (Over 9 years of power on time) - Phanteks Enthoo Pro 2

 

Laptop - 2019 Macbook Pro 16" - i7 - 16GB - 512GB - 5500M 8GB - Thermal Pads and Graphite Tape modded

 

Smart Phones - iPhone X - 64GB, AT&T, iOS 13.3 iPhone 6 : 16gb, AT&T, iOS 12 iPhone 4 : 16gb, AT&T Go Phone, iOS 7.1.1 Jailbroken. iPhone 3G : 8gb, AT&T Go Phone, iOS 4.2.1 Jailbroken.

 


5 minutes ago, Hunter259 said:

That's the Founders Edition, which has a stock power limit of 250W (300W with the power slider maxed out), but they can't hit that because of overheating. Non-reference cards have higher power limits: 360W for the EVGA FTW3, 330W for the Strix OC, and the Zotac is above 400W IIRC. From my experience, and other users' as well, these cards like to run at around 330W. It's hard to push more watts out of them, but not impossible.
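The slider figures above (250 W stock, 300 W maxed) are just a percentage offset applied to the stock limit. A minimal sketch; the function name is my own:

```python
def board_power_limit(stock_limit_w: float, slider_pct: float) -> float:
    """Effective board power cap given the driver power slider's percent offset."""
    return stock_limit_w * (1 + slider_pct / 100)

# Founders Edition: 250 W stock, +20% slider -> ~300 W cap.
print(board_power_limit(250, 20))
# A hypothetical +44% limit on a non-reference BIOS would give ~360 W:
print(board_power_limit(250, 44))
```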


2 hours ago, Bananasplit_00 said:

Never claimed your card was a potato, but your definition of a potato seems to be a card that is still pretty decent at 1080p, which is quite a bit off from what most other people probably think. A potato would be a 750 Ti or lower to me.

Probably, though even in some games, like Wildlands maxed out at 1080p, my card becomes a potato as well... potatoes everywhere.

CPU: 6700K Case: Corsair Air 740 CPU Cooler: H110i GTX Storage: 2x250gb SSD 960gb SSD PSU: Corsair 1200watt GPU: EVGA 1080ti FTW3 RAM: 16gb DDR4 

Other Stuffs: Red sleeved cables, White LED lighting 2 noctua fans on cpu cooler and Be Quiet PWM fans on case.


2 hours ago, cj09beira said:

My 7870 at 1300MHz and my RX 480 at 1450MHz say hi

Many Vega features are not working and it's still getting 1080 performance. The rasterizer will also make it use less power for the same performance. Why are they disabled? Probably too small a driver team.

Shame, I wanted it to destroy Nvidia at a cheaper price. Either way, that's not going to happen now that mining has made everything more expensive.



11 hours ago, rasmuskrj said:

Why doesn't Linus do something like this? Pretty disappointed in the 'usual' tech channels' coverage of the Vega FE. If you only review the stuff manufacturers send to you, you lose your position as an independent journalist.

 

Gamer's Nexus is the hero we need, apparently.  

Linus has gotten really big and his channel has moved on to more "mainstream" content. This type of hardcore analysis is only interesting to a very small niche of hardcore gamers, such as ourselves, which isn't enough people to support a channel as large as Linus'.


3 hours ago, Ekst4zy said:

I would love to argue that AMD did a good job with Vega, but damn it's hard :( 

Just do what all the AMD fanboys are doing and say that the card is amazing and future driver updates will make it better than Nvidia.

It is impossible to disprove so you can make the claim however much you want, nobody can argue against it.

 

 

2 hours ago, goodtofufriday said:

The AMD FE has many gaming optimizations, such as tile-based rasterization, disabled. This negatively impacts gaming performance by a significant degree. The Quadro doesn't have that limitation.

 

This is part of why most haven't said Vega RX is DOA.

Can you give a few examples of these "many gaming optimizations" that are disabled? I only know of supposedly one (which you mentioned).

The only one you mentioned was tile-based rasterization. How much of a performance increase will that give? You say it's significant, but I cannot find any numbers on it (actually, that's a lie: I have a vague idea of what effect it will have, and it doesn't align with what AMD fanboys on /r/AMD expect). I am pretty sure you also said it would decrease power. How much do you expect it to decrease by, and what do you base that estimate on?


1 hour ago, goodtofufriday said:

I subscribe to the thought that power and thermals don't matter =/ I really don't get why it matters.

Given the decline of LAN gaming as a pastime within the last 5 to 10 years, thermals* and power draw don't matter nearly as much as they used to, since you're no longer in danger of blowing a breaker by putting three or more high-end computers on the same electrical circuit. True, you need a beefier power supply, but anyone building a computer with the highest-end components should be running one anyway, along with a high-end UPS that outputs a pure sine wave.

 

 

*On a properly designed card anyway. Vega Frontier really doesn't qualify.
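The breaker point above is easy to sanity-check. A sketch assuming a 15 A, 120 V North American circuit (both figures are assumptions, as is the per-PC wattage):

```python
def circuit_headroom_w(breaker_amps: float, volts: float,
                       n_pcs: int, watts_per_pc: float) -> float:
    """Watts of headroom left on one circuit; negative means the breaker
    is at risk of tripping."""
    return breaker_amps * volts - n_pcs * watts_per_pc

# Three ~700 W LAN rigs on one 15 A / 120 V (1800 W) circuit:
print(circuit_headroom_w(15, 120, 3, 700))   # -300 -> overloaded
# Two rigs fit with room to spare:
print(circuit_headroom_w(15, 120, 2, 700))   # 400
```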


1 hour ago, bomerr said:

You are thinking like a fanboy and not as a businessman. On a residential scale, AMD's extra power consumption may be negligible, but for any enterprise customer it matters a lot. AMD won't be able to compete in those markets, which means they'll generate far less revenue. In addition, their profit margins on Vega will be lower than Nvidia's because the die is larger, which makes it more expensive to manufacture. The end result is less money to improve their products in the future. And that will hurt you as a residential customer, which is the problem right now: Vega can't beat Nvidia's top flagship GPU, just like the Fury X couldn't beat the 980 Ti.

In those markets Vega competes with the P100, not these small gaming GPUs.


1 hour ago, bomerr said:

You are thinking like a fanboy and not as a businessman. On a residential scale, AMD's extra power consumption may be negligible, but for any enterprise customer it matters a lot. AMD won't be able to compete in those markets, which means they'll generate far less revenue. In addition, their profit margins on Vega will be lower than Nvidia's because the die is larger, which makes it more expensive to manufacture. The end result is less money to improve their products in the future. And that will hurt you as a residential customer, which is the problem right now: Vega can't beat Nvidia's top flagship GPU, just like the Fury X couldn't beat the 980 Ti.

Unless I'm wrong... unless someone buys AMD's products, they won't have the budget to make more or better cards, which would hurt the general consumer whether they buy AMD or Nvidia.

And, at least for gaming, I'm not an enterprise consumer, so why do I care what their prosumer/enterprise cards do?



10 minutes ago, goodtofufriday said:

Unless I'm wrong... unless someone buys AMD's products, they won't have the budget to make more or better cards, which would hurt the general consumer whether they buy AMD or Nvidia.

And, at least for gaming, I'm not an enterprise consumer, so why do I care what their prosumer/enterprise cards do?

Look at the reality today. You CANNOT purchase an AMD card that outperforms an Nvidia card at the high end. That's why you should care. Today AMD and Nvidia only trade blows at the 1060/580 price point. Above that, at the 1070, 1080, and 1080 Ti price points, Nvidia has complete dominance. Eventually that will trickle down until Nvidia has complete dominance of the market even at the 1060 and lower price points. AMD cannot exist in the GPU space long-term with their current product offerings. So if you care about buying AMD cards in the future, then you should care a lot.

 

Also AMD and Nvidia use the exact same architecture for both their consumer and enterprise cards. 


27 minutes ago, LAwLz said:

Just do what all the AMD fanboys are doing and say that the card is amazing and future driver updates will make it better than Nvidia.

It is impossible to disprove so you can make the claim however much you want, nobody can argue against it.

 

 

Can you give a few examples of these "many gaming optimizations" that are disabled? I only know of supposedly one (which you mentioned).

The only one you mentioned was tile-based rasterization. How much of a performance increase will that give? You say it's significant, but I cannot find any numbers on it (actually, that's a lie: I have a vague idea of what effect it will have, and it doesn't align with what AMD fanboys on /r/AMD expect). I am pretty sure you also said it would decrease power. How much do you expect it to decrease by, and what do you base that estimate on?

Don't expect it to decrease power. I actually expect a power increase, especially on the water-cooled product. I'm just personally fine with that and don't really care about more power consumption.

And I overstated "many" gaming optimizations. I only know about the rasterizing, which, if it is non-existent as has been suggested, would hurt performance even more than AMD's poor implementation of rasterizing would.

And the other being GamersNexus' confirmation that gaming mode doesn't currently do anything.



Man I'm so glad AMD got Ryzen because Vega doesn't really look good right now.

 

Even with a better driver, GTX 1080 performance a year too late and at 300W is just horrible no matter how you look at it. It has been clear since the 4K Doom demo that we wouldn't be seeing 1080 Ti performance, so I kind of expected better than GTX 1080 at 225W, but this is worse than expected. Such a shame.

 

I really feel bad about this whole situation. :(

 

I really hope AMD sells lots of cards to professionals (where it actually competes with the Titan Xp) so AMD can actually compete in the future!
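The efficiency complaint in this post is simple arithmetic: at equal performance, perf-per-watt scales inversely with board power. The ~180 W stock GTX 1080 figure below is my assumption for illustration, as is the function name:

```python
def relative_efficiency(power_a_w: float, power_b_w: float) -> float:
    """How many times more perf-per-watt card A delivers than card B,
    assuming both produce the same frame rate."""
    return power_b_w / power_a_w

# ~180 W GTX 1080 (assumed) vs a 300 W Vega at the same performance:
print(round(relative_efficiency(180, 300), 2))   # 1.67
```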

\\ QUIET AUDIO WORKSTATION //

5960X 3.7GHz @ 0.983V / ASUS X99-A USB3.1      

32 GB G.Skill Ripjaws 4 & 2667MHz @ 1.2V

AMD R9 Fury X

256GB SM961 + 1TB Samsung 850 Evo  

Cooler Master Silencio 652S (soon Calyos NSG S0 ^^)              

Noctua NH-D15 / 3x NF-S12A                 

Seasonic PRIME Titanium 750W        

Logitech G810 Orion Spectrum / Logitech G900

2x Samsung S24E650BW 16:10  / Adam A7X / Fractal Axe Fx 2 Mark I

Windows 7 Ultimate

 

4K GAMING/EMULATION RIG

Xeon X5670 4.2Ghz (200BCLK) @ ~1.38V / Asus P6X58D Premium

12GB Corsair Vengeance 1600Mhz

Gainward GTX 1080 Golden Sample

Intel 535 Series 240 GB + San Disk SSD Plus 512GB

Corsair Crystal 570X

Noctua NH-S12 

Be Quiet Dark Rock 11 650W

Logitech K830

Xbox One Wireless Controller

Logitech Z623 Speakers/Subwoofer

Windows 10 Pro


2 minutes ago, bomerr said:

Look at the reality today. You CANNOT purchase an AMD card that outperforms an Nvidia card at the high end. That's why you should care. Today AMD and Nvidia only trade blows at the 1060/580 price point. Above that, at the 1070, 1080, and 1080 Ti price points, Nvidia has complete dominance. Eventually that will trickle down until Nvidia has complete dominance of the market even at the 1060 and lower price points. AMD cannot exist in the GPU space long-term with their current product offerings. So if you care about buying AMD cards in the future, then you should care a lot.

 

Also AMD and Nvidia use the exact same architecture for both their consumer and enterprise cards. 

That's okay. AMD's timeline for high-end cards was 2018; the 2017 cards are supposed to fall between the 1070 and the 1080. So I'll just be patient.

And I do care, which is why I buy their products even if they're inferior. Giving them my money and market share provides them with an increased budget for next year. What we need are more people investing in AMD with their wallets.



3 minutes ago, goodtofufriday said:

That's okay. AMD's timeline for high-end cards was 2018; the 2017 cards are supposed to fall between the 1070 and the 1080. So I'll just be patient.

Based on what data do you predict AMD will be able to deliver these high-end cards? Look at history: AMD hasn't released cards that are competitive with Nvidia's high end since the R9 290X.

3 minutes ago, goodtofufriday said:

And I do care, which is why I buy their products even if they're inferior. Giving them my money and market share provides them with an increased budget for next year. What we need are more people investing in AMD with their wallets.

That's the complete opposite: you are rewarding AMD for developing mediocre cards.


3 minutes ago, goodtofufriday said:

Don't expect it to decrease power. I actually expect a power increase, especially on the water-cooled product. I'm just personally fine with that and don't really care about more power consumption.

And I overstated "many" gaming optimizations. I only know about the rasterizing, which, if it is non-existent as has been suggested, would hurt performance even more than AMD's poor implementation of rasterizing would.

And the other being GamersNexus' confirmation that gaming mode doesn't currently do anything.

It's the rasterizer and the extra geometry output. The rasterizer should reduce power draw, as it allows the GPU to stop rendering hidden triangles sooner. How much, only AMD knows.
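For the curious, a heavily simplified sketch of the tile-binning idea behind a tile-based rasterizer. Real hardware does this in fixed-function logic; the tile size, function names, and data layout here are purely illustrative:

```python
from collections import defaultdict

TILE = 32  # tile size in pixels (real tile sizes are small powers of two)

def bin_triangles(triangles, width, height):
    """Assign each triangle to the screen tiles its bounding box overlaps.

    triangles: list of ((x0, y0), (x1, y1), (x2, y2)) screen-space vertices.
    Returns {(tile_x, tile_y): [triangle indices]}. Shading then runs one
    tile at a time, so depth testing and blending stay in on-chip memory
    instead of round-tripping to VRAM, and covered triangles can be
    rejected earlier -- that is where the power saving comes from.
    """
    bins = defaultdict(list)
    for i, tri in enumerate(triangles):
        xs = [v[0] for v in tri]
        ys = [v[1] for v in tri]
        x0, x1 = max(min(xs), 0), min(max(xs), width - 1)
        y0, y1 = max(min(ys), 0), min(max(ys), height - 1)
        for ty in range(int(y0) // TILE, int(y1) // TILE + 1):
            for tx in range(int(x0) // TILE, int(x1) // TILE + 1):
                bins[(tx, ty)].append(i)
    return dict(bins)

# Two small triangles in different corners of a 256x256 screen:
tris = [((0, 0), (20, 0), (0, 20)), ((100, 100), (120, 100), (100, 120))]
print(bin_triangles(tris, 256, 256))
```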


2 minutes ago, bomerr said:

Based on what data do you predict AMD will be able to deliver these high-end cards? Look at history: AMD hasn't released cards that are competitive with Nvidia's high end since the R9 290X.

That's the complete opposite: you are rewarding AMD for developing mediocre cards.

I'm not predicting anything; that's the timeline AMD has stated, with graphs. AMD's history of not being competitive also only covers the last few years, not the last 15. They can change, as they did with Ryzen.

Rewarding? So exactly how do you expect AMD to get an R&D budget? Magic? Counterfeiting cash? Not paying taxes? I mean, are you being serious here? If no one buys their cards, then AMD goes out of business, and then there's only Nvidia. I'm guessing that's what you want, though.



1 hour ago, goodtofufriday said:

Don't expect it to decrease power. I actually expect a power increase, especially on the water-cooled product.

Why do you expect it to increase power, and why do you expect the water-cooled one to be affected more than the air-cooled one?

 

1 hour ago, goodtofufriday said:

I'm just personally fine with that fact and dont really care about more power consumption.

It's not really a fact if it's just a prediction, right?

 

1 hour ago, goodtofufriday said:

And I overstated "many" gaming optimizations. I only know about the rasterizing, which, if it is non-existent as has been suggested, would hurt performance even more than AMD's poor implementation of rasterizing would.

Well, that didn't answer my question. I asked how big of an improvement you expect and what you base that estimate on. I have seen you mention this over, and over, and over, so I assume you have some actual evidence or sources to back it up. Right?

 

1 hour ago, goodtofufriday said:

And the other being GamersNexus' confirmation that gaming mode doesn't currently do anything.

Are you sure that it actually will do anything? Like I said in some other thread about this, there is no such thing as a "gaming driver" vs a "workstation driver" that alters performance. Workstation drivers are just the same old gaming drivers that have gone through more verification (and are therefore often a few releases behind), and in some cases they might include a special feature like 10-bit color support (which does not increase performance).

I find this entire idea of switching between "game mode" and "workstation mode", where the driver supposedly gets changed to perform better in games or workstation tasks, really silly.


Wait for the gaming version

Don't ask to ask, just ask... please 🤨

sudo chmod -R 000 /*


3 hours ago, bomerr said:

AMD cannot exist in the GPU space long-term with their current product offerings. So if you care about buying AMD cards in the future, then you should care a lot.

 

Also AMD and Nvidia use the exact same architecture for both their consumer and enterprise cards. 

First you say this, then you insinuate he should not "reward" (interpreted as: buy products from) AMD.

2 hours ago, bomerr said:

 

That's the complete opposite: you are rewarding AMD for developing mediocre cards.

 

Suggesting AMD is not going to strive for the best-performing GPU because people are "rewarding" them by buying mediocre cards is disingenuous to AMD's corporate professionalism. There is literally no other way to meaningfully support a company than to buy its products.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


4 hours ago, bomerr said:

You are thinking like a fanboy and not as businessman. On your residential scale, the extra electricity and power consumption of AMD may be negligible but for any enterprise consumer, it matters a lot. AMD won't be able to complete in those markets which means they'll generate far less revenue. In addition, their profit margins on Vega will be lower than Nvidia because the die size is larger which means it's more expensive to manufacture. The end result of this is less money to improve their product in the future. And that will hurt you as a residential customer, which is the problem right now where Vega can't beat Nvidia's top flagship GPU, just like Fury X couldn't beat 980 Ti. 

Nvidia makes far more money selling 1060s than selling to enterprise. Also, Vega FE is not an enterprise card at all; look at Radeon Instinct for that, which does have a Vega card in its line-up, passively cooled and under 300W TDP. Not that it really matters, since no matter how good the performance is, how little the power draw is, or how cool it runs, 90% of datacenters won't be switching to AMD this generation, as it's not possible on the software side. VDI is the only real use case for AMD cards, and in that respect they actually do have superior technology for it: SR-IOV-capable GPUs.

 

If you want to make money, sell as many mid-range cards as you can, and those typically get sold by having the best high-end card, as that gives brand appeal. This whole forum is living proof of how brand appeal works ;). There is nothing wrong with that, but it is a bit silly when people end up buying a worse card at a certain price point because of it.

