Why are so many people saying that the power savings between an RX Vega 64 and a GTX 1080 are massive?

GamingDevilsCC

Assuming the average power cost is 13 cents per kWh, and the PC uses a total of 500 watts, a person gaming 6 hours a day nonstop would be paying $142.35 per year (0.5 kW x 6 h x 365 days x $0.13). Not only that, a GTX 1080 draws about 50 watts less (http://www.anandtech.com/show/10325/the-nvidia-geforce-gtx-1080-and-1070-founders-edition-review/30), so the electricity bills would only differ by about $14.24 per year (assuming it's in the same setup as the Vega 64 PC).

 

Formula, if any of you guys would like to disagree with me: (rate in $/kWh x your PC's wattage / 1000 x hours played per day x days in a year (365))
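The formula above can be sketched in a few lines of code (the rate and wattage here are just the example numbers from this post, not measured values):

```python
def yearly_cost(rate_per_kwh, watts, hours_per_day, days=365):
    """Annual electricity cost: (watts/1000) kW x hours/day x days x rate."""
    return (watts / 1000) * hours_per_day * days * rate_per_kwh

# 500 W PC, 6 h/day at $0.13/kWh
print(round(yearly_cost(0.13, 500, 6), 2))  # ~142.35 per year

# The 50 W gap between the cards, on its own
print(round(yearly_cost(0.13, 50, 6), 2))   # ~14.24 per year
```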

 

I can't seem to find a clear answer on the power draw of the air-cooled RX Vega 64, since TDP does not equal power consumption, so I'm assuming it uses 400 watts.

 

However, I'm not saying that the Vega 64 is better than other graphics cards at its price point, as a GTX 1080 Ti costs roughly the same and is faster. Instead, I'm trying to clear up all this incorrect information about the price difference in power draw.


1 minute ago, RGProductions said:

overclocked these things draw north of 500 watts

A 150-watt increase? Could you send me a link? Also, won't the same people who overclock an RX Vega 64 also overclock a GTX 1080?


1 minute ago, GamingDevilsCC said:

150 watts more? Could you send me a link? Also, won't people also overclock a GTX 1080 then?

People have blown the importance of power draw out of proportion for generations now in order to make Nvidia cards look better.

CPU: I5 4590 Motherboard: ASROCK H97 Pro4 Ram: XPG 16gb v2.0 4x4 kit  GPU: Gigabyte GTX 970 PSU: EVGA 550w Supernova G2 Storage: 128 gb Sandisk SSD + 525gb Mx300 SSD Cooling: Be Quiet! Shadow Rock LP Case: Zalman T2 Sound: Logitech Z506 5.1 Mouse: Razer Deathadder Chroma Keyboard: DBPower LED


6 minutes ago, RGProductions said:

overclocked these things draw north of 500 watts

Gamers Nexus undervolted their Vega 56 by 170 mV and was able to overclock it by +66 MHz at that voltage.

Power draw went down by 71W.

That's with the 50% power limit increase.


1 minute ago, Moress said:

People have blown the importance of power draw out of proportion for generations now in order to make Nvidia cards look better.

It's still annoying being in a room on a summer's day that's being cooked by a power-hungry GPU.


Just now, GamingDevilsCC said:

150 Watts increase? Could you send me a link? Also, wont people also overclock a GTX 1080 then?

Yeah, but a card drawing 500 watts is just stupid; that's more than my whole build draws. The only article I can find is the old one with 400+, but I recall there was one at around 500.

M1 MacBook Air 256/8 | iPhone 13 pro


2 minutes ago, Moress said:

People have blown the importance of power draw out of proportion for generations now in order to make Nvidia cards look better.

Not quite, it's a possible difference of hundreds of $$$ over 3-5 years.

 

 

Want to custom loop?  Ask me more if you are curious

 


For some, the savings won't be much, you're right.  But in some places, electricity is more than 13 cents per kWh, it's more like in the 30s.  And some people play far more than you estimated too.  And, finally, at some point it's not even about the savings in power necessarily, but just the comfort of your room, as a 250 W PC will heat it a lot slower than a 600 W one.

Solve your own audio issues  |  First Steps with RPi 3  |  Humidity & Condensation  |  Sleep & Hibernation  |  Overclocking RAM  |  Making Backups  |  Displays  |  4K / 8K / 16K / etc.  |  Do I need 80+ Platinum?

If you can read this you're using the wrong theme.  You can change it at the bottom.


8 minutes ago, GamingDevilsCC said:

A 150-watt increase? Could you send me a link? Also, won't the same people who overclock an RX Vega 64 also overclock a GTX 1080?

Popped a breaker in my room overclocking my Vega 64. Granted, I had 2 other PCs going and a portable AC. I was pulling 700 W system power at +50%.


1 minute ago, Cookybiscuit said:

Still annoying being in a room on a summer's day that's being cooked by a power hungry GPU.

http://www.anandtech.com/show/11717/the-amd-radeon-rx-vega-64-and-56-review/19

It's a 1-degree difference from a GTX 1080.

 

Also, it turns out the RX Vega 64 draws about 100 watts more, so I'm increasing the yearly power price difference from $14.24 (50 watts) to $28.48 (100 watts).
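The updated figure can be checked with the same formula (a quick sketch; the 100 W gap is the number being discussed here, and note the exact result rounds to $28.47 rather than doubling the already-rounded $14.24):

```python
def yearly_delta(rate_per_kwh, watts_diff, hours_per_day, days=365):
    # Extra yearly cost attributable to the wattage difference alone
    return (watts_diff / 1000) * hours_per_day * days * rate_per_kwh

# 100 W difference, 6 h/day at $0.13/kWh
print(round(yearly_delta(0.13, 100, 6), 2))  # ~28.47 per year
```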


3 minutes ago, Ryan_Vickers said:

For some, the savings won't be much, you're right.  But in some places, electricity is more than 13 cents per kWh, it's more like in the 30s.  And some people play far more than you estimated too.  And, finally, at some point it's not even about the savings in power necessarily, but just the comfort of your room, as a 250 W PC will heat it a lot slower than a 600 W one.

I guess you're right about the power cost, but won't that also raise the GTX 1080's power cost by 2-3 times? (assuming it's 30 cents per kWh) And if people play for more than 6 hours, they can double my results to get a 12-hour figure, or use the provided formula and calculate with their own statistics!

 

Anyway, a GTX 1080 is within 1 degree of an RX Vega 64: http://www.anandtech.com/show/11717/the-amd-radeon-rx-vega-64-and-56-review/19


2 minutes ago, GamingDevilsCC said:

http://www.anandtech.com/show/11717/the-amd-radeon-rx-vega-64-and-56-review/19

It's a 1-degree difference from a GTX 1080.

 

Also, it turns out the RX Vega 64 draws about 100 watts more, so I'm increasing the yearly power price difference from $14.24 (50 watts) to $28.48 (100 watts).

Operating temperature doesn't tell you anything about how much heat the GPU is outputting. If anything, the cooler has to work harder on a card that generates more heat to hold the same temperature, which the fan-noise charts show.

 

Besides that, GPUs will run as fast as they can until temperatures get too hot.


1 minute ago, GamingDevilsCC said:

I guess you're right about the power cost, but won't that also raise the GTX 1080's power cost by 2-3 times? (assuming it's 30 cents per kWh) And if people play for more than 6 hours, they can double my results to get a 12-hour figure, or use the provided formula and calculate with their own statistics!

Yes, but the point is it would triple the difference between them.

1 minute ago, GamingDevilsCC said:

Anyways, a GTX 1080 is a 1 degree difference than an RX VEGA 64 http://www.anandtech.com/show/11717/the-amd-radeon-rx-vega-64-and-56-review/19 

Load temps?  Not relevant :P


6 minutes ago, GamingDevilsCC said:

http://www.anandtech.com/show/11717/the-amd-radeon-rx-vega-64-and-56-review/19

It's a 1-degree difference from a GTX 1080.

 

Also, it turns out the RX Vega 64 draws about 100 watts more, so I'm increasing the yearly power price difference from $14.24 (50 watts) to $28.48 (100 watts).

Tell me, if you take ten hairdryers that all run at the same temperature, do you think the room is going to be hotter if you turn one of them on or all ten?


8 minutes ago, Damascus said:

Not quite, it's a possible difference of hundreds of $$$ over 3-5 years.

 

 

From what I've seen (I'm not an owner of a high-end card), people with high-end cards usually upgrade every 2-3 years. Along with that, over 3 years the power price difference at the average American rate (13 cents per kWh) is $85.44.
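For scale, the multi-year figure can be run through the same formula at a couple of rates (a sketch; the 100 W gap and the 13-cent and 30-cent rates are the numbers already mentioned in this thread):

```python
def cost_over_years(rate_per_kwh, watts, hours_per_day, years, days=365):
    # Total cost of a given wattage delta over several years of daily gaming
    return (watts / 1000) * hours_per_day * days * years * rate_per_kwh

# 100 W difference, 6 h/day, 3 years
print(round(cost_over_years(0.13, 100, 6, 3), 2))  # about $85 at 13 c/kWh
print(round(cost_over_years(0.30, 100, 6, 3), 2))  # about $197 at 30 c/kWh
```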


21 minutes ago, GamingDevilsCC said:

Assuming the average power cost is 13 cents per kWh, and the PC uses a total of 500 watts, a person gaming 6 hours a day nonstop would be paying $142.35 per year (0.5 kW x 6 h x 365 days x $0.13). Not only that, a GTX 1080 draws about 50 watts less (http://www.anandtech.com/show/10325/the-nvidia-geforce-gtx-1080-and-1070-founders-edition-review/30), so the electricity bills would only differ by about $14.24 per year (assuming it's in the same setup as the Vega 64 PC).

 

Formula, if any of you guys would like to disagree with me: (rate in $/kWh x your PC's wattage / 1000 x hours played per day x days in a year (365))

 

I can't seem to find a clear answer on the power draw of the air-cooled RX Vega 64, since TDP does not equal power consumption, so I'm assuming it uses 350 watts.

 

However, I'm not saying that the Vega 64 is better than other graphics cards at its price point, as a GTX 1080 Ti costs roughly the same and is faster. Instead, I'm trying to clear up all this incorrect information about the price difference in power draw.

The thing is, if you're rich enough to afford any of the competing top-of-the-line cards, IT SHOULDN'T FUCKING MATTER. STOP USING POWER DRAW IN GPU VALUE ARGUMENTS


 

Just now, Cookybiscuit said:

Tell me, if you take ten hairdryers that all run at the same temperature, do you think the room is going to be hotter if you turn one of them on or all ten?

Tell me, if you try to debate actual stats with loose analogies, do you think it will work? (These days, it's unfortunately yes.)


Just now, Ryan_Vickers said:

yes but the point is it would triple the difference between them.

Load temps?  Not relevant :P

So the power price difference would be about $85.44. But if a person can spend upwards of $500 on these cards, would they mind it that much? And if they did, wouldn't they also stably undervolt the card?


Just now, H0R53 said:

The thing is, if you're rich enough to afford any of the competing top-of-the-line cards, IT SHOULDN'T FUCKING MATTER. STOP USING POWER DRAW IN GPU VALUE ARGUMENTS

If a person cares about how much heat their system generates, then power draw does factor into the value of the GPU.


1 minute ago, H0R53 said:

The thing is, if you're rich enough to afford any of the competing top-of-the-line cards, IT SHOULDN'T FUCKING MATTER. STOP USING POWER DRAW IN GPU VALUE ARGUMENTS

Based on the part you bolded, you read the OP wrong. The OP is saying that since the Vega 64 performs like a 1080 and is priced like a 1080 Ti, it's a bad value regardless of power draw.


Just now, Moress said:

 

Tell me, if you try to debate actual stats with loose analogies, do you think it will work? (These days, it's unfortunately yes.)

Sorry, but are you actually suggesting that because a GTX 480 runs at 90°C, setting up four of them in SLI won't make your room any hotter, despite the 4x increase in waste heat output? Did you pay absolutely zero attention in school?


2 minutes ago, Cookybiscuit said:

Tell me, if you take ten hairdryers that all run at the same temperature, do you think the room is going to be hotter if you turn one of them on or all ten?

There is no such thing as running an RX Vega 64 in CrossFire with 10 cards.


2 minutes ago, H0R53 said:

The thing is, if you're rich enough to afford any of the competing top-of-the-line cards, IT SHOULDN'T FUCKING MATTER. STOP USING POWER DRAW IN GPU VALUE ARGUMENTS

That's what I'm thinking.


2 minutes ago, Cookybiscuit said:

Sorry but are you actually suggesting that because a GTX480 runs at 90C, if you set up four of them in SLI your room isn't going to get any hotter despite the 4x increase in waste heat output? Did you pay absolutely zero attention in school?

I don't care about Fermi furnaces. I care about the fact that you tried to debate evidence of an exactly 1-degree temperature delta between the cards with an analogy about hairdryers.

