Increased power consumption

Deadpool3049

OK, now I may or may not be barking up the wrong tree here, but have GPU manufacturers got the memo about the cost-of-living increase in electricity and gas costs? It seems to me that you need an even bigger PSU now to run the damn things, which means extra cost on your electricity bill. Also, where are the green credentials in this? And for the naysayers: yes, I am talking about global warming, and yes, it's a real thing. The logic here just doesn't follow.

Well, I guess I agree... If this continues, then in 2-3 years it will be normal for a gaming GPU to pull 500+ watts, and that IMO is not good and should not be allowed. I hope that, at least in the EU, we get some sort of law limiting how much power a personal computer not meant for work-related use is allowed to pull. But I don't think we will see any such law any time soon.

PC Setup: 

HYTE Y60 White/Black + Custom ColdZero ventilation sidepanel

Intel Core i7-10700K + Corsair Hydro Series H100x

G.SKILL TridentZ RGB 32GB (F4-3600C16Q-32GTZR)

ASUS ROG STRIX RTX 3080Ti OC LC

ASUS ROG STRIX Z490-G GAMING (Wi-Fi)

Samsung EVO Plus 1TB

Samsung EVO Plus 1TB

Crucial MX500 2TB

Crucial MX300 1TB

Corsair HX1200i

 

Peripherals: 

Samsung Odyssey Neo G9 G95NC 57"

Samsung Odyssey Neo G7 32"

ASUS ROG Harpe Ace Aim Lab Edition Wireless

ASUS ROG Claymore II Wireless

ASUS ROG Sheath BLK LTD'

Corsair SP2500

Beyerdynamic TYGR 300R + FiiO K7 DAC/AMP

RØDE VideoMic II + Elgato WAVE Mic Arm

 

Racing SIM Setup: 

Sim-Lab GT1 EVO Sim Racing Cockpit + Sim-Lab GT1 EVO Single Screen holder

Svive Racing D1 Seat

Samsung Odyssey G9 49"

Simagic Alpha Mini

Simagic GT4 (Dual Clutch)

CSL Elite Pedals V2

Logitech K400 Plus

3 minutes ago, Deadpool3049 said:

OK, now I may or may not be barking up the wrong tree here, but have GPU manufacturers got the memo about the cost-of-living increase in electricity and gas costs? It seems to me that you need an even bigger PSU now to run the damn things, which means extra cost on your electricity bill. Also, where are the green credentials in this? And for the naysayers: yes, I am talking about global warming, and yes, it's a real thing. The logic here just doesn't follow.

GPUs take years to develop. The current and next generations have been in development for at least four years now, so back then this was not an issue, seeing as people hoarded those cards by the thousands and ran large-scale mining operations.

Personally, I'd wait for more power-efficient cards before upgrading. Our municipal power company just raised the price per kWh to 39.88 cents. AFAIK they will raise it again in March to 56.9 cents, unless some miracle happens.
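To put a rough number on what that price means for a gaming PC, here's a quick back-of-the-envelope calculation (the 450 W draw and 4 hours/day are illustrative assumptions, not figures from the thread):

```python
# Rough monthly electricity cost of a high-end GPU at the quoted price.
# Assumed figures: a 450 W card, 4 hours of gaming per day, 30 days.
watts = 450
hours_per_day = 4
price_per_kwh = 0.3988  # 39.88 cents/kWh, as quoted above

kwh_per_month = watts / 1000 * hours_per_day * 30
cost = kwh_per_month * price_per_kwh
print(f"{kwh_per_month:.1f} kWh/month -> {cost:.2f} per month")
# -> 54.0 kWh/month -> 21.54 per month
```

At the announced 56.9 cents/kWh the same usage would be over 30 a month, just for the GPU.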

6 minutes ago, Applefreak said:

GPUs take years to develop. The current and next generations have been in development for at least four years now, so back then this was not an issue, seeing as people hoarded those cards by the thousands and ran large-scale mining operations.

Personally, I'd wait for more power-efficient cards before upgrading. Our municipal power company just raised the price per kWh to 39.88 cents. AFAIK they will raise it again in March to 56.9 cents, unless some miracle happens.

It's insanity. I got dumb lucky by signing a five-year contract one month before COVID-era energy prices ended. The moment that contract ends, I honestly do not know what I'll do. My energy bill will go up at least 10x.

30 minutes ago, Applefreak said:

GPUs take years to develop. The current and next generations have been in development for at least four years now, so back then this was not an issue, seeing as people hoarded those cards by the thousands and ran large-scale mining operations.

Personally, I'd wait for more power-efficient cards before upgrading. Our municipal power company just raised the price per kWh to 39.88 cents. AFAIK they will raise it again in March to 56.9 cents, unless some miracle happens.

Also, the 2000 series showed how power-efficient a card could be.

 

The 3000 and 4000 series are actually very power efficient too, but NVIDIA redlines them so hard that, for a mere 5% performance drop, the 4090 can consume up to 150 W less.
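That trade-off is easy to quantify. A quick sketch (the 450 W stock board power is an assumed figure; the 5% / 150 W numbers come from the post above):

```python
# Perf/watt at stock vs. power-limited, using the figures above.
# Assumed stock board power of 450 W for the 4090 (illustrative).
stock_watts = 450
stock_perf = 100.0  # normalized performance

limited_watts = stock_watts - 150   # "up to 150 W less"
limited_perf = stock_perf * 0.95    # "a mere 5% performance drop"

print(stock_perf / stock_watts)      # ~0.222 perf per watt at stock
print(limited_perf / limited_watts)  # ~0.317 perf per watt limited, ~42% better
```

So giving up 5% of the frame rate buys a roughly 40% jump in efficiency, which is why the redlined factory defaults look so bad.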

52 minutes ago, Deadpool3049 said:

OK, now I may or may not be barking up the wrong tree here, but have GPU manufacturers got the memo about the cost-of-living increase in electricity and gas costs? It seems to me that you need an even bigger PSU now to run the damn things, which means extra cost on your electricity bill. Also, where are the green credentials in this? And for the naysayers: yes, I am talking about global warming, and yes, it's a real thing. The logic here just doesn't follow.

Pretty sure the users are pretty much half the reason it's like this.

 

49 minutes ago, BetteBalterZen said:

Well, I guess I agree... If this continues, then in 2-3 years it will be normal for a gaming GPU to pull 500+ watts, and that IMO is not good and should not be allowed. I hope that, at least in the EU, we get some sort of law limiting how much power a personal computer not meant for work-related use is allowed to pull. But I don't think we will see any such law any time soon.

Soon, maybe.

u5p9iw71pbk81.jpg

There is approximately 99% chance I edited my post

Refresh before you reply

__________________________________________

ENGLISH IS NOT MY NATIVE LANGUAGE, NOT EVEN 2ND LANGUAGE. PLEASE FORGIVE ME FOR ANY CONFUSION AND/OR MISUNDERSTANDING THAT MAY HAPPEN BECAUSE OF IT.

1 hour ago, Poinkachu said:

Pretty sure the users are pretty much half the reason it's like this.

 

Soon, maybe.

u5p9iw71pbk81.jpg

😨😨😨

 

😋👍

These prices are getting outrageous! A 1.5% increase this quarter.


I'm not actually trying to be as grumpy as it seems.

I will find your mentions of Ikea or Gnome and I will /s post. 

Project Hot Box

CPU 13900K, Motherboard Gigabyte Aorus Elite AX, RAM Corsair Vengeance 4x16GB 5200 MHz, GPU Zotac RTX 4090 Trinity OC, Case Fractal Pop Air XL, Storage Sabrent Rocket Q4 2TB, Corsair Force Series MP510 1920GB NVMe, Corsair Force Series MP510 960GB NVMe, PSU Corsair HX1000i, Cooling Corsair XC8 CPU block, Bykski GPU block, 360mm and 280mm radiators, Displays Odyssey G9, LG 34UC98-W 34-inch, Keyboard Mountain Everest Max, Mouse Mountain Makalu 67, Sound AT2035, Massdrop 6XX headphones, Go XLR

Oppbevaring

CPU i9-9900K, Motherboard ASUS ROG Maximus Code XI, RAM 48GB Corsair Vengeance LPX 3200 MHz (2x16)+(2x8), GPUs ASUS ROG Strix 2070 8GB, PNY 1080, Nvidia 1080, Case Mining Frame, Storage 2x Samsung 860 Evo 500GB, PSU Corsair RM1000x and RM850x, Cooling ASUS ROG Ryuo 240 with Noctua NF-12 fans

 

Why is the 5800x so hot?

 

 

Each generation is more cost-effective than the last, and that includes power consumption: perf/watt keeps going up.

You don't HAVE to buy the fastest card on the market. It's an option, not a requirement. And if all you do is play esports, you probably want to be using an OLED monitor with VRR while setting the frame cap 3 FPS below your monitor's refresh rate. https://blurbusters.com/gsync/gsync101-input-lag-tests-and-settings/15/

Chances are you didn't buy one either; 4090s were in short supply the last I checked.

3900x | 32GB RAM | RTX 2080

1.5TB Optane P4800X | 2TB Micron 1100 SSD | 16TB NAS w/ 10Gbe
QN90A | Polk R200, ELAC OW4.2, PB12-NSD, SB1000, HD800
 

I'm with @cmndr on this one. Yes, the 4090 can suck down up to 600 W of power, but you don't have to buy it. A 1660 Super is still more than enough for 1080p gaming, and it barely uses 125 W.

 With all the Trolls, Try Hards, Noobs and Weirdos around here you'd think i'd find SOMEWHERE to fit in!

Besides what has already been said: GPUs are still a luxury item that you buy either (A) for pleasure or (B) for business. And if it's the latter, you embed the price of having better hardware into the cost your customers pay. That's how business works, and that's also why everything else is more expensive.

^^^^ That's my post ^^^^
<-- This is me --- That's your scrollbar -->
vvvv Who's there? vvvv

There's a catch-22 to all of this: a hotter-running GPU will consume more power. The issue is that the leakage current in LVCMOS technologies is exponentially tied to the junction temperature, and power dissipation is tied to the current squared, so you can imagine how that one goes... Basically, if the chip gets hotter it starts using more power, which means it gets hotter still, which lets through a larger leakage current, meaning it uses even more power, and so on. This goes on until either the heatsink catches up and you reach thermal equilibrium, or a part of the chip slows down due to thermal-protection features. It also means that even a slight reduction in power output yields large gains.
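That feedback loop can be sketched with a toy model. All the constants below are made-up illustrative values, not real silicon or heatsink parameters; the point is only to show the loop settling at a thermal equilibrium:

```python
# Toy model of the leakage/temperature feedback loop described above.
# All constants are illustrative, not real silicon parameters.
import math

P_DYNAMIC = 300.0  # switching power in watts (fixed by the workload)
AMBIENT = 25.0     # ambient temperature, deg C
R_THERMAL = 0.12   # cooler thermal resistance, deg C per watt
LEAK_REF = 20.0    # leakage power at the reference temperature, watts
T_REF = 60.0       # reference junction temperature, deg C
LEAK_SCALE = 30.0  # deg C of temperature rise per e-fold of leakage

temp = AMBIENT
for step in range(100):
    # Leakage grows exponentially with junction temperature...
    leakage = LEAK_REF * math.exp((temp - T_REF) / LEAK_SCALE)
    total_power = P_DYNAMIC + leakage
    # ...and the extra power raises the temperature, closing the loop.
    new_temp = AMBIENT + total_power * R_THERMAL
    if abs(new_temp - temp) < 1e-6:  # thermal equilibrium reached
        break
    temp = new_temp

print(f"equilibrium: {temp:.1f} C, {total_power:.1f} W total")
```

With these numbers the loop converges (the cooler wins); with a weaker cooler or steeper leakage curve the same loop diverges until thermal protection steps in, which is the runaway case the post describes.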

The current price of a 4080 would build you an entire 13900K-based system without a GPU. Let that sink in: the GPU costs more than the rest of a top-tier build. The target market is not worried about saving $5 in electricity if they are paying used-car prices for a fucking GPU.
