
Nvidia's Domination of the Market - This is Bad

Khajiit Dealer

AMD has to focus on building more energy-efficient GPUs or drop their prices more. When the competition was between Kepler and Hawaii, the 290 consumed roughly 30 W more than the 780. Now there is more than a 100 W difference between the 390 and the competing 970.

Nope, the realistic gaming-load difference is more like 63 W...

(FurMark is not realistic, and the test below uses FurMark. Why is FurMark not realistic? Because it pushes the card right up against its voltage/power limits, so power draw goes up a lot more than it would during a gaming load.)

 

 
Now bugger off with the bullshit, mate. 63 W, from a program made specifically to make GPUs use more power than they should... yeah, great argument, mate.
Let's start rating your car's performance once we remove the rev limiter and the engine just runs at 100% until it seizes up... yeah, because that is what FurMark is.
 

 

A LOT of hypocritical kneejerk fanboys do.

People seem to fail to realize that the difference in power draw between the 970 and the 390 is, per se, ONE OR TWO LIGHT BULBS.

Just go turn off the light in a room that you or anyone else isn't using, and the power difference is practically zero.

But people do not want to admit that, because it defeats their argument.

Hell, even if you built a fully overclocked R9 390X + FX-8350 setup, with 1.6 V RAM and only HDDs, you would still, with all that power draw, save more power by SPENDING ONE MINUTE LESS EVERY TIME YOU TAKE A SHOWER... One minute less of hot water being used is going to save you more in the long run than the most efficient Intel + Nvidia setup you can imagine.

Why is that?

Because heating 20-50 litres of water to 70°C (it has to be kept above about 63°C, or there is a risk of Legionella developing in the pipes and tank) takes more time and energy than most people's PC uses while gaming at 100% load.

Think for a moment: a normal kettle on the kitchen counter draws about 2 kW, and it still takes several minutes to heat 2-3 litres of water...

The average hot-water tank has a 2-3 kW element and needs to heat up AND MAINTAIN approximately 200-300 litres... Every time you open the tap, the mains water that enters the tank is around 4-8°C, maybe a bit higher in summer in hot places... Now think about taking several hundred litres of water and heating all of it up to 70°C when you start out at 4-8°C...
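To put rough numbers on that, here is a back-of-the-envelope sketch in Python (the tank size, temperatures, shower flow rate and GPU figures are illustrative assumptions, not measurements):

# Rough energy comparison: heating tap water vs. a GPU's extra power draw.
# All input values below are assumed examples.
SPECIFIC_HEAT_WATER = 4186  # J per kg per degree C; 1 litre of water is ~1 kg

def heating_energy_kwh(litres, temp_in_c, temp_out_c):
    # Energy needed to heat `litres` of water from temp_in_c to temp_out_c, in kWh.
    joules = litres * SPECIFIC_HEAT_WATER * (temp_out_c - temp_in_c)
    return joules / 3.6e6  # 1 kWh = 3.6 MJ

tank = heating_energy_kwh(250, 6, 70)             # 250 L tank heated from 6 C to 70 C
one_shower_minute = heating_energy_kwh(9, 6, 40)  # ~9 L/min shower flow, heated to 40 C
gpu_extra = 0.060 * 4                             # 60 W extra draw for 4 hours of gaming

print(f"Heat a 250 L tank, 6 C -> 70 C:  {tank:.1f} kWh")              # ~18.6 kWh
print(f"One minute of hot shower water:  {one_shower_minute:.2f} kWh") # ~0.36 kWh
print(f"GPU +60 W for 4 hours of gaming: {gpu_extra:.2f} kWh")         # 0.24 kWh

With those assumed numbers, cutting one minute off a daily shower roughly cancels a 60 W GPU delta over a few hours of gaming, and a single full heat-up of the tank dwarfs both.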

THINK people....

This is where your REAL power drain is.

A GPU using 60 W more power at full load for 2-10 hours IS A BLOODY DROP IN THE OCEAN!!!

Just going from normal thermostats to day/night-cycle thermostats can save you more money per year on electric heating THAN YOUR PC CONSUMES IN THAT YEAR!

I work as an electrician. These are simply facts of life, and it is part of my job to inform people how to save on electricity. Hell, we had 30 minutes to an hour a day in school, for half a year, just learning how to stop wasting energy. And you dimwitted fanboys cry about ONE LIGHT BULB!!!

If you want to make friends with the polar bears, then believe me when I say they don't care about a light bulb; they don't care about ten million light bulbs. The consumption difference between Nvidia and AMD, even on a global scale, is negligible to the point of ridiculousness.

And as for the power bill argument...

You are arguing about 5 dollars a year.

Skip out on three bottles of cola and you have made up the difference....

The heat argument.

Turn off the electric heating in the room you are in while gaming. Doing so will save you MORE money than buying PC parts that output less heat. Way more.

And as long as your PC is running, it is heating your room roughly as "power efficiently" as an electric heater would; its waste heat isn't wasted when the room needs heating anyway.

 

http://energy.gov/articles/askenergysaver-home-water-heating

 

Now take your energy efficiency bullshit somewhere else.

The whole "efficiency" argument was created by Nvidia's marketing department to have a solid argument against AMD.

In truth, you do not save that much money at all. The difference from going to Nvidia is so slim that you would practically not notice it over a whole year. I am talking LESS than 20 bucks a year, and that is if you were using your GPU an absurd amount of time per day (like 8 hours a day, every day, no exceptions)...
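A quick sanity check on that figure, as a rough sketch (the ~63 W gaming-load delta is the number claimed above; the $0.10/kWh rate is an assumed example, so scale it to your local tariff):

# Yearly cost of ~63 W of extra draw for 8 hours a day, every day of the year.
watt_delta = 63            # assumed gaming-load difference in watts
hours_per_day = 8
rate_usd_per_kwh = 0.10    # assumed example tariff; varies a lot by region

kwh_per_year = watt_delta / 1000 * hours_per_day * 365   # ~184 kWh
cost_per_year = kwh_per_year * rate_usd_per_kwh          # ~18 USD
print(f"~{kwh_per_year:.0f} kWh/year -> about ${cost_per_year:.0f}/year")

At higher tariffs the same usage costs proportionally more, which is where the regional differences discussed later in the thread come in.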

What can you get for 20 bucks? A pizza from Papa Johns and some Cola....

 

Buy a little less salt and you can afford to run AMD.



 

 

For your information, I'm using a 290X Twin Frozr. I'm seeing about a $12 difference in my monthly electricity bill compared to my previous GTX 560 Ti. Electricity costs are not the same in every region.


The main problems people I know have with AMD compared to Nvidia are power consumption and heat.

 

Honestly, I think right now AMD needs to work on their software. The power consumption and heat aren't a huge deal. They're certainly worth considering, but since most people have PSUs with double the wattage they actually need, it shouldn't be a barrier for the majority of customers, and heat is handled just fine by aftermarket coolers. The biggest stigma AMD suffers from is bad drivers... I haven't used AMD extensively, so I don't know from first-hand experience, but I've damn sure heard a lot more complaints about driver and firmware issues with their cards (e.g. there was a problem for a while with R9 390/390X cards causing the computer to turn back on immediately after being shut down). Furthermore, Shadowplay is a lot better than AMD's alternative... it's better optimized, and it produces better-quality files with smoother framerates at high bitrates (although AMD's VCE seems to do better at low bitrates).

 

Hardware is what it is. It's not like they haven't known since the 7000 series that power consumption is something Nvidia has an edge on and something they should at least consider. Efficiency is a high priority for both companies, since the lower the power consumption of the architecture, the more shader units and the higher the clock speeds you can squeeze out of the chip. But the last thing any customer wants is to jump through hoops to get their games working properly, or to deal with driver issues... and I'm sure lots of people would like better recording quality, tools and ease of use on the AMD side of things.

 

So yeah... I think AMD has been doing fine in the hardware department. I'm pretty sure they'd see a significant boost to customer satisfaction and loyalty if they made the post-purchase experience better with better drivers/software.


For your information, I'm using a 290X Twin Frozr. I'm seeing about a $12 difference in my monthly electricity bill compared to my previous GTX 560 Ti. Electricity costs are not the same in every region.

I know, but your statement is a generalising one. To argue it properly, you should state explicitly that in your region it may cost more. And depending on your choice of electricity supplier, the difference may be smaller even in your region.

 

However, the fact is that what I said holds true no matter where you live. There are other ways to make up that money than buying inferior hardware. And the ways I outlined not only save you money, they also help the environment as a whole. Water waste is a massive issue globally and should not be scoffed at.

Proper temperature control is also a major point where a lot of money can be saved, globally.



 

Yeah, but $12/month adds up to quite a lot for a single component... even if you save money in other places, that's still money you'd rather not be spending, and it may be enough to justify getting a GTX 970 instead for some people.



I also doubt the $12 per month... because at an 80 W difference you would have to use the card at FULL LOAD for a very long time. At less than 100% load, AMD and Nvidia are actually fairly close in power draw, so let's work out how many hours of 100% load that bill difference implies.

 

The difference in power draw between a 560 Ti and a 290X is about 80 W:

http://anandtech.com/bench/product/1059?vs=1337

 

According to these tables: https://en.wikipedia.org/wiki/Electricity_in_Sri_Lanka

If we place him in the D1 >181 kWh/month tier, he pays about 42 Sri Lankan rupees per kWh.

Also, using http://x-rates.com/calculator/?from=USD&to=LKR&amount=12

We see that 12 USD is roughly 1687 LKR....

 

Using this calculator

http://www.rapidtables.com/calc/electric/watt-to-kwh-calculator.htm

with the following input:

w = 80

Time = 24 hours

 

For EACH DAY, the difference is 1.92 kWh... if we round that up to 2 kWh/day, that is 84 LKR/day.

So 84 LKR/day × 30 days = 2520 LKR a month with 24/7 usage... or about 18 USD.

 

So... 2520 − 1687 = 833 LKR difference per month... divide that by 42 LKR/kWh and you get roughly 20 kWh less than 24/7 usage.

 

For the draw difference between a 560 Ti and a 290X to add up to 1 kWh, you need to run the 290X at full load for roughly 13 hours (1000 Wh ÷ 80 W ≈ 12.5 h)...

Going from there, we can deduce the following

 

20 kWh × 13 hours/kWh ≈ 260 hours

 

One calendar month = 30 × 24 = 720 hours

 

720 − 260 = 460; 460 / 30 ≈ 15.3 HOURS OF 100% LOAD PER DAY

 

So to get "his" cost difference, he must be playing games with the GPU pegged at 100% load for at least 15.3 HOURS PER DAY.
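The same estimate as a quick script, skipping the intermediate rounding (the 80 W delta, 42 LKR/kWh tariff and ~140.6 LKR/USD rate are the figures assumed above); it lands around 16-17 hours a day rather than 15.3, but it's the same ballpark:

# How many hours per day of full GPU load would a $12/month bill difference require?
watt_delta = 80            # 290X vs 560 Ti at full load, per the AnandTech numbers above
tariff_lkr_per_kwh = 42    # assumed D1 >181 kWh/month tier
lkr_per_usd = 140.6        # assumed rate, so that 12 USD ~= 1687 LKR
days_per_month = 30

extra_lkr = 12 * lkr_per_usd                      # ~1687 LKR claimed per month
extra_kwh = extra_lkr / tariff_lkr_per_kwh        # ~40 kWh of extra energy
full_load_hours = extra_kwh * 1000 / watt_delta   # hours at an 80 W delta to burn that
print(f"{extra_kwh:.0f} kWh extra -> {full_load_hours / days_per_month:.1f} h of full load per day")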

 

I call bullshit.

$12 A YEAR? Sure... but a month? NOPE.

 

Unless you are running Folding@home, BOINC or mining cryptocurrency, no normal person spends 15.3 hours EACH DAY, NO EXCEPTION, playing games. The moment you stop playing games and start browsing, most GPUs barely draw any power AT ALL... So every minute spent making a post on these forums instead of gaming puts you further away from ever reaching that $12/month difference.


For your information, I'm using a 290X Twin Frozr. I'm seeing about a $12 difference in my monthly electricity bill compared to my previous GTX 560 Ti. Electricity costs are not the same in every region.

How much does electricity cost there, mate? In $/kWh?

| Intel i7-3770@4.2Ghz | Asus Z77-V | Zotac 980 Ti Amp! Omega | DDR3 1800mhz 4GB x4 | 300GB Intel DC S3500 SSD | 512GB Plextor M5 Pro | 2x 1TB WD Blue HDD |
 | Enermax NAXN82+ 650W 80Plus Bronze | Fiio E07K | Grado SR80i | Cooler Master XB HAF EVO | Logitech G27 | Logitech G600 | CM Storm Quickfire TK | DualShock 4 |



 

I got those numbers from the bills issued to me by the Ceylon Electricity Board over the past few months, and from two years back. With me at home, usage is 240-250 kWh per month. With the 560 Ti it never reached 240 kWh, although it did exceed 200 kWh. Without me, it is 140-155 kWh.

 

WOW, you really calculated the hours I spend on my computer. Did you actually go through my profile, find my country and our electricity tariffs on Wikipedia, and work this out? Must have been hard???

 

Your calculation is not wrong, actually. If I stay home, the computer is usually turned on for 14 hours per day, with a game like The Witcher 3, Far Cry 4 or GTA V running.

 

My apologies if I exaggerated the $12 per month; it must be around $4-10.
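For what it's worth, plugging 14 hours a day into the same assumed figures from the earlier estimate (80 W delta, 42 LKR/kWh, ~140.6 LKR/USD) lands at the top end of that range:

# 14 hours/day at an 80 W full-load delta, same assumed tariff and exchange rate as above.
kwh_per_month = 0.080 * 14 * 30        # = 33.6 kWh of extra energy
lkr_per_month = kwh_per_month * 42     # ~1411 LKR
usd_per_month = lkr_per_month / 140.6  # ~10 USD
print(f"~{usd_per_month:.0f} USD per month")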

