
Is 600W for a graphics card actually worth it for the performance buff or is it a cheap cop-out?

Actual_Criminal

This post is just intended to find out people's views (especially high-end users') on potentially using a 600W GPU.

 

Obviously the majority of us have seen the leaks from reputable sources about the upcoming top-end cards having around a 600W TDP. That is double the power of an RTX 3090. If it turns out that the performance uplift is around the typical 20-30% (or even 40%) mark, would you still say it is an 'upgrade'?

 

The way I see it, it's like having a fast car and then inventing a 'better' car that is more powerful and gets you to your destination X% quicker, but is significantly less efficient because it uses double the fuel. Useful for some people, of course, but it feels like a bit of a cheat rather than a proper generational leap.

 

Here in the UK, energy costs have doubled in the last 2 months and are expected to triple by the end of the year, on top of other cost-of-living increases, meaning there is more of an incentive to use less electricity where possible. Also, a lot of existing builds probably won't be able to just slot in a new 600W GPU without also upgrading the PSU, and with disposable income down, inflation up, and a supply crisis with scalper prices, it really does not seem as appetising as the 3000 series.
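To put rough numbers on that, here's a back-of-envelope sketch. The £0.28/kWh unit rate and 4 hours/day of gaming are assumed figures, not from this thread:

```python
def annual_cost_gbp(watts, hours_per_day, price_per_kwh):
    """Yearly electricity cost of a component drawing `watts` under load."""
    kwh_per_year = watts / 1000 * hours_per_day * 365
    return kwh_per_year * price_per_kwh

# Hypothetical figures: 4 h/day of gaming at a £0.28/kWh unit rate
cost_600 = annual_cost_gbp(600, 4, 0.28)  # ~£245/yr
cost_300 = annual_cost_gbp(300, 4, 0.28)  # ~£123/yr
print(f"600 W: £{cost_600:.0f}, 300 W: £{cost_300:.0f}, difference: £{cost_600 - cost_300:.0f}")
```

At double the unit rate the difference doubles too, which is presumably the OP's worry.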

 

What are people's thoughts? 

 

(Of course, this is all speculation about a GPU being 600W. For all we know until it's confirmed, it could well be 600W but provide 100%+ more performance.)

CPU: AMD Ryzen 9 16-core 5950X

CPU Cooler: Arctic Freezer 2 AIO 360mm Radiator

Motherboard: Asus ROG Strix X570-F Gaming

Memory: 32GB (2x16GB) G.Skill Trident Z Royal 3600 MHz CL16

GPU: Nvidia RTX 4080 MSI Ventus 3X 16GB GDDR6X

Storage OS: 500GB Samsung 980 Pro Gen4 M.2 NVMe SSD

Storage Games: 2TB Corsair MP600 Gen4 M.2 NVMe SSD + 2TB Samsung 860 Evo SSD + 500GB Samsung 850 Evo SSD

Storage Misc: 2TB Seagate Barracuda Compute 7200 RPM

PSU: Corsair HX Platinum 1000W 80+

Case: Fractal Design Meshify S2 ATX Mid Tower

Monitor: Dell Alienware AW3423DW 175Hz 1ms 3440x1440 (ultrawide) HDR400 OLED panel 34" + Asus PG258Q 240Hz 1ms 1080p G-Sync TN panel 24.5"


It's an interesting thought.

 

I think it will fall into the category of users who just want raw performance. There are already users who want a 1200W power supply just because, or a 3080 Ti just to play League of Legends or CS:GO, regardless of what they actually use (GPU cores, VRAM, etc.) or whether a lower-end card would give them the same experience.

 

Then again, it will likely just be a phase where everything takes more power. GPUs haven't always been high-power-consuming things (with more power comes more power XD). I'm sure 10 or 20 years ago, no one would have guessed the current wattage usage and its allocation.


17 minutes ago, Actual_Criminal said:

What are people's thoughts? 

as long as they continue to improve efficiency and deliver GPUs at all ranges of wattage, idgaf if they release a 1000W GPU (i welcome it, personally)

options are key

-sigh- feeling like I'm being too negative lately


1000W just to play games is criminal; it should be outlawed. It costs as much as my aircon. What a waste of energy.

Ryzen 5700g @ 4.4ghz all cores | Asrock B550M Steel Legend | 3060 | 2x 16gb Micron E 2666 @ 4200mhz cl16 | 500gb WD SN750 | 12 TB HDD | Deepcool Gammax 400 w/ 2 delta 4000rpm push pull | Antec Neo Eco Zen 500w


8 minutes ago, SupaKomputa said:

1000W just to play games is criminal; it should be outlawed. It costs as much as my aircon. What a waste of energy.

Indeed, and if you include CPU/GPU overclocking, it can be even more.

 

My office/gaming room is so small that I have to run the aircon when playing intensive games, otherwise the room heats up to over 30°C in the winter (which I guess is good and eliminates my need for heating, lol) and 35°C-ish in the summer.

 

My current electricity bill alone, as a single male in a 2-bedroom apartment, is £130 a month, increasing to £180 and likely around £250 by the end of the year.

 

I imagine an average-sized family could have bills in excess of £700 a month by the end of 2022, which is crazy!


10 hours ago, SupaKomputa said:

1000W just to play games is criminal; it should be outlawed. It costs as much as my aircon. What a waste of energy.

Then how much is acceptable? Where do you draw the line? Do games that use more power need to be outlawed?


14 hours ago, Actual_Criminal said:

.

pretty sure my old-ass house can't handle a 600W GPU, and even if it can, it's cutting it close and I'm not gonna try.

 

I might be simplifying here, but a 3090 Ti is ~2x a 1080 Ti at roughly twice the power; call the older card ~300W, and the 600W card would be around 2x the power draw for 2x the performance, which is roughly linear. I'm OK with that as long as the breakers can handle it, since overclocking tends to be in the exponential part of the curve anyway.
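That linear-scaling argument can be sanity-checked numerically. A sketch using the post's rough approximations (the ~300W figure and the 2x multipliers are its estimates, not measured data):

```python
def perf_per_watt(relative_perf, watts):
    """Relative performance divided by board power, for cross-gen comparison."""
    return relative_perf / watts

# Rough figures from the post: older card 1x at ~300 W,
# current gen ~2x at ~450 W, hypothetical 600 W card ~4x
old_gen = perf_per_watt(1.0, 300)
current = perf_per_watt(2.0, 450)
next_gen = perf_per_watt(4.0, 600)
print(round(next_gen / old_gen, 2))  # 2.0: double the perf/W despite double the total draw
```

So even at 600W total draw, perf-per-watt still improves under those assumptions; the objection is about absolute draw, not efficiency.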

 

all that said, 450W stock will be the max for me (and I'll undervolt further).

5950x 1.33v 5.05 4.5 88C 195w ll R20 12k ll drp4 ll x570 dark hero ll gskill 4x8gb 3666 14-14-14-32-320-24-2T (zen trfc)  1.45v 45C 1.15v soc ll 6950xt gaming x trio 325w 60C ll samsung 970 500gb nvme os ll sandisk 4tb ssd ll 6x nf12/14 ippc fans ll tt gt10 case ll evga g2 1300w ll w10 pro ll 34GN850B ll AW3423DW

 

9900k 1.36v 5.1avx 4.9ring 85C 195w (daily) 1.02v 4.3ghz 80w 50C R20 temps score=5500 ll D15 ll Z390 taichi ult 1.60 bios ll gskill 4x8gb 14-14-14-30-280-20 ddr3666bdie 1.45v 45C 1.22sa/1.18 io  ll EVGA 30 non90 tie ftw3 1920//10000 0.85v 300w 71C ll  6x nf14 ippc 2000rpm ll 500gb nvme 970 evo ll l sandisk 4tb sata ssd +4tb exssd backup ll 2x 500gb samsung 970 evo raid 0 llCorsair graphite 780T ll EVGA P2 1200w ll w10p ll NEC PA241w ll pa32ucg-k

 

prebuilt 5800 stock ll 2x8gb ddr4 cl17 3466 ll oem 3080 0.85v 1890//10000 290w 74C ll 27gl850b ll pa272w ll w11

 


4 hours ago, Moonzy said:

Then how much is acceptable? Where do you draw the line? Do games that use more power need to be outlawed?

I personally draw the line at around the 250-300W mark for a GPU. To this day you get a ton of performance for that power.

 

It really feels like a 'push the limits to beat our competitor by the tiniest bit' stretch. Kinda like what Intel and AMD are doing: the 12900K is flat-out better than a 5950X, but it also consumes a good bit more.

 

For me it's also a matter of looking at what laptops can do. A GPU like a mobile 3070, when NOT thermally limited, goes really, really fast on a power budget of about 120W.


2 hours ago, xg32 said:

all that said, 450W stock will be the max for me (and I'll undervolt further).

I agree with this. I think 300W is my preferred option, but I can go up to 450W for the very highest-end GPUs. Even then, I don't want to feel conscious that I'm rinsing my electricity bill every time I go to game. More power will typically mean higher temperatures too.


I feel these high power consumptions are targeting the sort of gamers who formerly ran SLI or Crossfire. Multi-GPU is dead and buried at this point, so scaling up to such high power consumption is probably the only avenue remaining to capture that sort of audience.

My eyes see the past…

My camera lens sees the present…


For the record, I don't really have a problem with a 600W GPU SKU existing. I would have a problem if midrange GPUs suddenly took 300W instead of 150W. I don't really need to care about power costs; we have a really bangin' solar setup that generates enough surplus during the day, so it's not much of a concern. I just don't like the heat output, noise output, or general wastefulness. I suspect most enthusiasts are more willing to put up with all of those than I am. It's one of my chief regrets in 'upgrading' to a 3080 Ti FE from the reference 6800 XT I was using. I don't believe my NCase could handle much more than the 350W mark in GPU wattage in warmer climates either, given how hot the Ti gets.


I'm wondering how much power draw could be reduced by underclocking/undervolting. I've heard of that being done with 3090s by miners seeking maximum power efficiency, and by VFX artists trying to manage heat and power budgets in their rendering machines.

The 4090 might have a lot of, uh, footroom for power savings at minimal performance loss.


On 5/13/2022 at 8:26 PM, Actual_Criminal said:

(Of course, this is just speculation on a GPU being 600W. It could well be 600W but provide 100%+ the performance for all we know until confirmed.)

If the new leaks are true, 600W isn't correct. According to kopite7kimi: RTX 4090, AD102-300, 16128 FP32, 21Gbps 24G GDDR6X, 450W, ~2x 3090.

Which means the 4090 is twice the performance of a 3090 at 450W. The 3090 is at least 350W, so 100W more for twice the performance would be pretty good.
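If those leaked numbers held up, the perf-per-watt math is straightforward. A quick sketch using only the figures quoted above:

```python
# Leaked figures from the post: 4090 ~2x a 3090's performance at 450 W,
# against the 3090's ~350 W board power
perf_gain = 2.0
power_ratio = 450 / 350              # ~1.29x more power drawn
efficiency_gain = perf_gain / power_ratio
print(round(efficiency_gain, 2))     # 1.56: ~56% better perf/W than a 3090
```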

 

So I will definitely be looking out for a replacement for my 3080Ti if the leaks turn out to be true.

Desktop: i9-10850K [Noctua NH-D15 Chromax.Black] | Asus ROG Strix Z490-E | G.Skill Trident Z 2x16GB 3600Mhz 16-16-16-36 | Asus ROG Strix RTX 3080Ti OC | SeaSonic PRIME Ultra Gold 1000W | Samsung 970 Evo Plus 1TB | Samsung 860 Evo 2TB | CoolerMaster MasterCase H500 ARGB | Win 10

Display: Samsung Odyssey G7A (28" 4K 144Hz)

 

Laptop: Lenovo ThinkBook 16p Gen 4 | i7-13700H | 2x8GB 5200Mhz | RTX 4060 | Linux Mint 21.2 Cinnamon


To me it depends on the cooling.

My 3080 Tis/3090s were a pain to cool, and I ended up buying 3 cases and about 20 fans to get them to the same temperatures my 20-series cards ran at.

They use between 350 and 400 watts.

 

My coolest GPU is my FTW3 Ultra 3090 Ti, which uses 450 watts. It is the only card I consider properly engineered.

Right now it is using 10 watts less at idle than my FTW3 Ultra 3080 Ti. If it is not being pushed, it uses less power.

 

So if a 600-watt card could bench in the 60s (°C) at stock like my 3090 Ti does, I would consider it.

3 hours ago, Montana One-Six said:

If the new leaks are true, 600W isn't correct. According to kopite7kimi: RTX 4090, AD102-300, 16128 FP32, 21Gbps 24G GDDR6X, 450W, ~2x 3090.

Which means the 4090 is twice the performance of a 3090 at 450W. The 3090 is at least 350W, so 100W more for twice the performance would be pretty good.

 

So I will definitely be looking out for a replacement for my 3080Ti if the leaks turn out to be true.

I don't need the performance now but it would be fun to get one. 

 

RIG#1 CPU: AMD, R 7 5800x3D| Motherboard: X570 AORUS Master | RAM: Corsair Vengeance RGB Pro 32GB DDR4 3200 | GPU: EVGA FTW3 ULTRA  RTX 3090 ti | PSU: EVGA 1000 G+ | Case: Lian Li O11 Dynamic | Cooler: EK 360mm AIO | SSD#1: Corsair MP600 1TB | SSD#2: Crucial MX500 2.5" 2TB | Monitor: ASUS ROG Swift PG42UQ

 

RIG#2 CPU: Intel i9 11900k | Motherboard: Z590 AORUS Master | RAM: Corsair Vengeance RGB Pro 32GB DDR4 3600 | GPU: EVGA FTW3 ULTRA  RTX 3090 ti | PSU: EVGA 1300 G+ | Case: Lian Li O11 Dynamic EVO | Cooler: Noctua NH-D15 | SSD#1: SSD#1: Corsair MP600 1TB | SSD#2: Crucial MX300 2.5" 1TB | Monitor: LG 55" 4k C1 OLED TV

 

RIG#3 CPU: Intel i9 10900kf | Motherboard: Z490 AORUS Master | RAM: Corsair Vengeance RGB Pro 32GB DDR4 4000 | GPU: MSI Gaming X Trio 3090 | PSU: EVGA 1000 G+ | Case: Lian Li O11 Dynamic | Cooler: EK 360mm AIO | SSD#1: Crucial P1 1TB | SSD#2: Crucial MX500 2.5" 1TB | Monitor: LG 55" 4k B9 OLED TV

 

RIG#4 CPU: Intel i9 13900k | Motherboard: AORUS Z790 Master | RAM: Corsair Dominator RGB 32GB DDR5 6200 | GPU: Zotac Amp Extreme 4090  | PSU: EVGA 1000 G+ | Case: Streacom BC1.1S | Cooler: EK 360mm AIO | SSD: Corsair MP600 1TB  | SSD#2: Crucial MX500 2.5" 1TB | Monitor: LG 55" 4k B9 OLED TV


Just game in the winter to offset the cost of heating... I kid, but it's kinda true. You could also look at other ways to save power to keep your bill about the same.

 

It would be nice if we could have per-game settings, like undervolting when you only need 60 fps in Minecraft instead of 300 fps.

 

Possibly mine to hopefully make some money to offset the power cost...

 

J2Cents made some good points about a house only being able to do 15 amps, and upgrading to 20 amps would be costly...

I have dyslexia, plz be kind to me. Don't like my post? Don't read it or respond, thx.

Also, I edit posts a lot because... you know why...

Thrasher_565 hub links build logs

Corsair Lian Li Bykski Barrow thermaltake nzxt aquacomputer 5v argb pin out guide + argb info

5v device to 12v mb header

Odds and Sods Argb Rgb Links

 


On 5/15/2022 at 2:12 PM, jimm_eh said:

I'm wondering how much power draw could be reduced by underclocking/undervolting. I've heard of that being done with 3090s by miners seeking maximum power efficiency, and by VFX artists trying to manage heat and power budgets in their rendering machines.

The 4090 might have a lot of, uh, footroom for power savings at minimal performance loss.

i run a 260W card at 210W and a 350W card at 280W. Assuming the same ~20% undervolt that barely hits performance (excluding OC), it's fair to assume 450W cards can be run at 350-360W. AD102 is rumored to be just a node change and more cores over GA102 with a power bump, and the cards don't scale well past 2.1GHz.
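Both of those data points (260 W → 210 W, 350 W → 280 W) are roughly a 20% cut, so the 450 W projection follows directly. A small sketch of that estimate; the function name is hypothetical and the ~4% performance cost is an assumed figure, not from this thread:

```python
def undervolt_estimate(stock_watts, power_cut=0.20, perf_cost=0.04):
    """Projected board power and perf/W gain for an assumed undervolt."""
    new_watts = stock_watts * (1 - power_cut)
    ppw_gain = (1 - perf_cost) / (1 - power_cut)  # perf/W relative to stock
    return new_watts, ppw_gain

watts, gain = undervolt_estimate(450)
print(watts, round(gain, 2))  # 360.0 1.2 -- a 450 W card run at ~360 W, ~1.2x perf/W
```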

