At what point is enough enough?

Brian McKee

With the 4000 series cards around the corner and rumored wattages of over 400 watts, at what point do we just find this completely unacceptable? I've owned microwave ovens close to this wattage.

 

Ampere has already been compared to the infamous Fermi line of cards, but what people fail to remember is that Fermi topped out at 250 watts. Looking at the old reviews (https://www.anandtech.com/show/4008/nvidias-geforce-gtx-580/17), the TOTAL SYSTEM power draw with a 480 was around 410-420 watts, not just a single component. One of the most insane GPUs ever released, the R9 295X2, a card so hot it had to ship with a closed-loop water block and was widely considered impractical to use, drew 500 watts, and that was two 290Xs stapled together.
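For a sense of scale, those wall numbers can be split apart with quick arithmetic. A rough sketch in Python, assuming the GTX 480's 250W board TDP and taking the midpoint of the 410-420W wall figure cited above (and ignoring PSU efficiency losses for simplicity):

```python
# Split the cited total-system wall draw into GPU vs. everything else.
# Figures come from the post above (AnandTech GTX 480-era review),
# not from new measurements; PSU losses are ignored for simplicity.
total_system_w = 415   # midpoint of the ~410-420 W wall draw
gtx480_tdp_w = 250     # GTX 480 board power (Fermi's top end)

rest_of_system_w = total_system_w - gtx480_tdp_w
gpu_share = gtx480_tdp_w / total_system_w

print(f"Rest of system: ~{rest_of_system_w} W")
print(f"GPU share of total draw: {gpu_share:.0%}")
```

In other words, even that era's most notorious card was only about 60% of what the whole system pulled at the wall.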

 

Nvidia has been getting away with murder for a few years now, with no sign of slowing down, producing some of the worst cards ever made. At what point is enough enough? Or are we accepting literal space heaters so Nvidia can maintain its "rightful" place on top of the performance charts?


There is no such thing. I don't get this "cards are pulling too much power" argument. Demand for more powerful cards is increasing, yet it's a problem when cards pull 500 watts? The consumer is the one who has to decide whether they actually need a 40-teraflop card and the power consumption that comes with it.

mY sYsTeM iS Not pErfoRmInG aS gOOd As I sAW oN yOuTuBe. WhA t IS a GoOd FaN CuRVe??!!? wHat aRe tEh GoOd OvERclok SeTTinGS FoR My CaRd??  HoW CaN I foRcE my GpU to uSe 1o0%? BuT WiLL i HaVE Bo0tllEnEcKs? RyZEN dOeS NoT peRfORm BetTer wItH HiGhER sPEED RaM!!dId i WiN teH SiLiCON LotTerrYyOu ShoUlD dEsHrOuD uR GPUmy SYstEm iS UNDerPerforMiNg iN WarzONEcan mY Pc Run WiNdOwS 11 ?woUld BaKInG MY GRaPHics card fIX it? MultimETeR TeSTiNG!! aMd'S GpU DrIvErS aRe as goOD aS NviDia's YOU SHoUlD oVERCloCk yOUR ramS To 5000C18

 


6 minutes ago, Levent said:

There is no such thing. I don't get this "cards are pulling too much power" argument. Demand for more powerful cards is increasing, yet it's a problem when cards pull 500 watts? The consumer is the one who has to decide whether they actually need a 40-teraflop card and the power consumption that comes with it.

Because power consumption matters. You wouldn't accept a 400-watt CPU for 50% gains, would you? It's hard to call it progress when the power targets are literally doubling. Nvidia would have been buried if Maxwell/Pascal had been like this after the very unexceptional Fermi and Kepler.


10 minutes ago, Brian McKee said:

Because power consumption matters. You wouldn't accept a 400-watt CPU for 50% gains, would you?

Are we now talking about CPUs? Because that is not the case if you look at AMD CPUs. To add to that shitty argument you've got going there: the GTX 580 has a TDP of 244W and FP32 performance of 1.5 tflops. The reference RTX 3070 has a TDP of 220W and 20 tflops of FP32 performance.

 

If power consumption matters, choose your components accordingly.


 


8 minutes ago, Brian McKee said:

Because power consumption matters. You wouldn't accept a 400-watt CPU for 50% gains, would you? It's hard to call it progress when the power targets are literally doubling. Nvidia would have been buried if Maxwell/Pascal had been like this after the very unexceptional Fermi and Kepler.

Well, to be fair, that's literally what you do when you OC a chip. People had no problem taking a 65W R5 1600 and putting 170W into it to hit 4.2GHz.
 

Same goes for the GPUs you're whining about. They're WAY WAY WAY more power efficient than their predecessors; they are simply also capable of using more power. A 3090 Ti CAN run at 100W, and it would still outperform a 970 using 150W any day of the week by a mile. The thing is, PSU and AIB manufacturers are capable of supporting higher power usage, so GPUs are gonna take advantage of that to get as much performance as they can.

I edit the shit out of my posts.  Refresh before you respond.


18 minutes ago, Levent said:

Are we now talking about CPUs?

Does power consumption matter for one component but not the other? Just wondering.

 

Quote

To add to that shitty argument you've got going there: the GTX 580 has a TDP of 244W and FP32 performance of 1.5 tflops. The reference RTX 3070 has a TDP of 220W and 20 tflops of FP32 performance.

 

I mean, let's ignore that comparing tflops across architectures isn't good practice and look at the fact that the 1080 Ti somehow maintained the same power target as the 480 and was 500% faster on average. And it has 11 teraflops. It's almost like changes in architecture and node should allow for substantially more performance at the same power target.

 

The 3070, despite having double the teraflops of the 1080 Ti, is obviously not double the performance. Not even close.

 

And for fun, compare similar generational gaps from the past: the 1070 versus the 780 Ti shows a similar performance difference, but the 1070 was a 150-watt card, not over 200.


11 minutes ago, Queen Chrysalis said:

Well, to be fair, that's literally what you do when you OC a chip. People had no problem taking a 65W R5 1600 and putting 170W into it to hit 4.2GHz.

I'm going to wager that 1-5% of owners did that to their 1600s. Most people leave their stuff stock.


Just now, Brian McKee said:

Does power consumption matter for one component but not the other? Just wondering.

 

 

I mean, let's ignore comparing tflops across architectures and look at the fact that the 1080 Ti somehow maintained the same power target as the 480 and was 500% faster on average. And it has 11 teraflops. It's almost like changes in architecture and node should allow for substantially more performance at the same power target.

Power consumption is power consumption; I fail to understand how it wouldn't matter everywhere.

 

RX 480 has FP32 performance of 5.8 tflops and a TDP of 150W.

GTX 480 has FP32 performance of 1.34 tflops and a TDP of 250W.

1080 Ti has FP32 performance of 11.3 tflops and a TDP of 250W.

1080 has FP32 performance of 8.8 tflops and a TDP of 180W.

Your numbers are just wrong.
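Reducing those TDP and tflops figures to a single perf-per-watt number makes the comparison concrete. A quick sketch using the board TDPs and theoretical FP32 rates listed above (paper specs, not measured gaming draw):

```python
# Theoretical FP32 throughput per watt for the cards listed above.
# (tflops, TDP in watts) pairs are taken from the post, not re-verified.
cards = {
    "GTX 480":     (1.34, 250),
    "RX 480":      (5.8, 150),
    "GTX 1080":    (8.8, 180),
    "GTX 1080 Ti": (11.3, 250),
}

for name, (tflops, tdp_w) in cards.items():
    gflops_per_watt = tflops * 1000 / tdp_w
    print(f"{name:>12}: {gflops_per_watt:5.1f} GFLOPS/W")
```

By this (admittedly crude) metric, Pascal was roughly 8-9x more efficient than the GTX 480 at the same or lower power.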

 

Changes in architecture and nodes can allow more performance at the same power levels. People don't upgrade for efficiency, they upgrade for performance gains. If you get the most expensive flagship device, you will be sacrificing power consumption. As I said:

22 minutes ago, Levent said:

If power consumption matters, choose your components accordingly.

 

 


 


The really high power draw is only on the top-of-the-line products, so Nvidia can say they have the fastest GPU. 99.5% of systems will not be running 400+W GPUs, and the 0.5% that are frankly don't really care. That's like saying Chevrolet cars have terrible fuel economy because the new Corvette only gets 19 mpg.


*shakes fist at drivers on motorway*

 

Who the hell NEEDS 700 hp?!  

Why do we make cars with top speeds OVER the maximum highway speed?

Why do we have motorcycles so fast that regulators required speed limiters? 

 

Because they can.

 

If I let my 3080 FTW3 run full out, it will pull 400W+. Do I need that? Nope. Do I run that way? Sometimes. Most times I am locked at 144fps though, so it's realistically pulling about 220-240W.

 

People love to have the fastest/most powerful thingamajig. It happens. And as long as people will buy it, manufacturers will make it.


Nobody really cares about power draw. Like it or not, that's the truth.

 

What really made the Fermi cards infamous was heat. Same deal with some AMD cards from a few generations ago: the talking point was always "runs hot and power hungry," but people wouldn't have cared about the power draw if it weren't for the heat. If temps are kept under control and the performance gains are there, people will accept higher power draw without qualms.

 

Some PC enthusiasts actually not-so-secretly like their components being power hungry because they think it's synonymous with performance. I remember trying in vain to tell people they didn't need a 1200W PSU for their systems with a single 1080 Ti and getting an outright hostile response because I was trying to shrink their e-peen.

Corps aren't your friends. "Bottleneck calculators" are BS. Only suckers buy based on brand. It's your PC, do what makes you happy.  If your build meets your needs, you don't need anyone else to "rate" it for you. And talking about being part of a "master race" is cringe. Watch this space for further truths people need to hear.

 

Ryzen 7 5800X3D | ASRock X570 PG Velocita | PowerColor Red Devil RX 6900 XT | 4x8GB Crucial Ballistix 3600mt/s CL16


37 minutes ago, Middcore said:

What really made the Fermi cards infamous was heat. Same deal with some AMD cards from a few generations ago: the talking point was always "runs hot and power hungry," but people wouldn't have cared about the power draw if it weren't for the heat. If temps are kept under control and the performance gains are there, people will accept higher power draw without qualms.

Why exactly? The heat numbers have more to do with the thermal design of the coolers and less with the chip itself. Put a blower-style cooler on a 3080 and it'll hit over 100 degrees. But high power is high power: 100% of it will be pumped out as waste heat into your room.

 

I also don't exactly buy this. People weren't exactly clamoring to buy stuff like the GTX 790 or R9 295X even when support for dual GPUs was reasonable. I think there has absolutely been an attitude shift.


2 minutes ago, Brian McKee said:

Why exactly? The heat has more to do with the thermal design of the coolers and less with the chip itself. Put a blower-style cooler on a 3080 and it'll hit over 100 degrees.

 

I also don't exactly buy this. People weren't exactly clamoring to buy stuff like the GTX 790 or R9 295X even when support for dual GPUs was reasonable. I think there has absolutely been an attitude shift.

The GTX 790 doesn't exist. People weren't buying dual-GPU cards because CrossFire and SLI introduced many other issues on the software side (lack of 100% scaling, micro-stutters, driver overhead, etc.). People bought GTX 480s, GTX 580s, GTX 680s, and HD 4870s because they performed better than the previous generation.

 

The only attitude shift you can talk about here is toward releasing cards with sufficient cooling (and even that is not always the case).

 


 


24 minutes ago, Levent said:

The GTX 790 doesn't exist.

 

Meant the 690, my bad.

 

Quote

People weren't buying dual-GPU cards because CrossFire and SLI introduced many other issues on the software side (lack of 100% scaling, micro-stutters, driver overhead, etc.). People bought GTX 480s, GTX 580s, GTX 680s, and HD 4870s because they performed better than the previous generation.

They were still the fastest-in-slot options. Plus, there were plenty of games at the time that it worked well, or at least acceptably, with. I ran dual 6850s years ago as it was the only economical way for me to keep up. I wasn't exactly happy with the heat even that setup blasted into my room, but I digress.

 

Quote

 

The only attitude shift you can talk about here is toward releasing cards with sufficient cooling (and even that is not always the case).

I just think people are more forgiving of Nvidia. Ampere is just such a regression compared to the titans of architecture that were Maxwell and Pascal.


55 minutes ago, Brian McKee said:

I just think people are more forgiving of Nvidia.

In a way, but it's not so much that they "forgive" Nvidia as that they were disingenuous in ever pretending to care.

 

Quote

Ampere is just such a regression compared to the titans of architecture that were Maxwell and Pascal.

Nobody cares about how efficient and clever the architecture is. GPUs are a means to an end. If it pushes better visuals and more fps to displays, and doesn't crash, people will like it. They don't have a reason to care how the sausage gets made.



3 hours ago, Brian McKee said:

With the 4000 series cards around the corner and rumored wattages of over 400 watts, at what point do we just find this completely unacceptable? I've owned microwave ovens close to this wattage.

I already have a card that uses over 400 watts and it is the coolest card I own. 

My EVGA FTW3 Ultra uses 450 watts stock and games at around 60c.

My 3080 Tis/3090s game in the 70s. Its VRAM only hits 60c under load, where my other cards hit 90c.

 

If the 3090 Ti was a cooling test for the 40 series, I know EVGA got it right.

3 hours ago, Brian McKee said:

Nvidia has been getting away with murder for a few years now, with no sign of slowing down, producing some of the worst cards ever made.

I have been happy with my Nvidia cards, and my only complaints are usually about the amount of VRAM.

 

I also know that a lot of research is needed to not end up with a hot card, so there are makes and models I avoid.

With the 30 series I have had to take what I could get, so I ended up with some cards that are hotter than I like.

3 hours ago, Brian McKee said:

At what point is enough enough? Or are we accepting literal space heaters so Nvidia can maintain its "rightful" place on top of the performance charts?

The only time computers were space heaters for me was in the late 80s and early 90s. It has not been an issue since.

 

The computers in the bedrooms where I live now got a little toasty, since they were on floors, in corners, but a ceiling fan fixed that.

 

Right now I am in between a 450-watt 3090 Ti and a 400-watt 3080 Ti. Even if they are going full out, I will not feel the heat, since the cases and the area around them are designed that way.

 

Since I live in a tropical climate and like to stay cool, there may come a time when the power draw of a computer matters, but it is not now.

 

RIG#1 CPU: AMD, R 7 5800x3D| Motherboard: X570 AORUS Master | RAM: Corsair Vengeance RGB Pro 32GB DDR4 3200 | GPU: EVGA FTW3 ULTRA  RTX 3090 ti | PSU: EVGA 1000 G+ | Case: Lian Li O11 Dynamic | Cooler: EK 360mm AIO | SSD#1: Corsair MP600 1TB | SSD#2: Crucial MX500 2.5" 2TB | Monitor: ASUS ROG Swift PG42UQ

 

RIG#2 CPU: Intel i9 11900k | Motherboard: Z590 AORUS Master | RAM: Corsair Vengeance RGB Pro 32GB DDR4 3600 | GPU: EVGA FTW3 ULTRA  RTX 3090 ti | PSU: EVGA 1300 G+ | Case: Lian Li O11 Dynamic EVO | Cooler: Noctua NH-D15 | SSD#1: Corsair MP600 1TB | SSD#2: Crucial MX300 2.5" 1TB | Monitor: LG 55" 4k C1 OLED TV

 

RIG#3 CPU: Intel i9 10900kf | Motherboard: Z490 AORUS Master | RAM: Corsair Vengeance RGB Pro 32GB DDR4 4000 | GPU: MSI Gaming X Trio 3090 | PSU: EVGA 1000 G+ | Case: Lian Li O11 Dynamic | Cooler: EK 360mm AIO | SSD#1: Crucial P1 1TB | SSD#2: Crucial MX500 2.5" 1TB | Monitor: LG 55" 4k B9 OLED TV

 

RIG#4 CPU: Intel i9 13900k | Motherboard: AORUS Z790 Master | RAM: Corsair Dominator RGB 32GB DDR5 6200 | GPU: Zotac Amp Extreme 4090  | PSU: EVGA 1000 G+ | Case: Streacom BC1.1S | Cooler: EK 360mm AIO | SSD: Corsair MP600 1TB  | SSD#2: Crucial MX500 2.5" 1TB | Monitor: LG 55" 4k B9 OLED TV


4 minutes ago, jones177 said:

I also know that a lot of research is needed to not end up with a hot card, so there are makes and models I avoid.

With the 30 series I have had to take what I could get, so I ended up with some cards that are hotter than I like.

The only time computers were space heaters for me was in the late 80s and early 90s. It has not been an issue since.

 

The computers in the bedrooms where I live now got a little toasty, since they were on floors, in corners, but a ceiling fan fixed that.

 

Right now I am in between a 450-watt 3090 Ti and a 400-watt 3080 Ti. Even if they are going full out, I will not feel the heat, since the cases and the area around them are designed that way.

 

Since I live in a tropical climate and like to stay cool, there may come a time when the power draw of a computer matters, but it is not now.

 

That's just not how it works. Power is power no matter how they cool it inside the computer. 400 watts being used by your card is another 400 watts being output into your room as heat at a 100% conversion ratio. I live in a very hot climate myself and I simply cannot have something like that in my workspace, and we don't have per-room HVAC controls.
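The arithmetic behind this is simple: essentially every watt a PC draws ends up as heat in the room, so a gaming load can be compared directly against a space heater. A rough sketch (the 200W rest-of-system figure is an illustrative assumption, not a measurement):

```python
# Convert a gaming PC's electrical draw into heater-equivalent output.
# All drawn power ends up as heat in the room; 1 W = 3.412 BTU/hr.
BTU_PER_HR_PER_WATT = 3.412

gpu_w = 400             # the GPU draw being debated above
rest_of_system_w = 200  # assumed CPU/board/PSU losses (illustrative)

total_w = gpu_w + rest_of_system_w
btu_per_hr = total_w * BTU_PER_HR_PER_WATT

print(f"Heat dumped into the room: {total_w} W = {btu_per_hr:.0f} BTU/hr")
```

For comparison, a typical household space heater's low setting is around 750W, so a system like this really is in the same class while gaming.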


7 minutes ago, Brian McKee said:

That's just not how it works. Power is power no matter how they cool it inside the computer. 400 watts being used by your card is another 400 watts being output into your room as heat at a 100% conversion ratio. I live in a very hot climate myself and I simply cannot have something like that in my workspace, and we don't have per-room HVAC controls.

The rooms that my computers are in are at 21c no matter how many watts they are using.

The heat is not an issue because it is managed.  



In my opinion, the enough-is-enough point was already reached with the RTX 3000 series, with 3-slot, 3-fan coolers being the minimum to keep anything midrange and above cool. The RTX 4000 series might be even more insane; rumor is you'll need a new PSU unless you already have a 1000W one. I think the limiting factor is going to be keeping the card cool, and power-hungry GPUs are going to need a well-ventilated case.

Although most people probably don't care, as an enormous GPU with a 1600W PSU is something gamers can brag about. If I plan on upgrading I might vote with my wallet and go with an AMD card; the RX 7000 series is rumored to be much more efficient. But the GPU market is almost a monopoly and people will still buy Nvidia cards, even though they screwed over customers with all the crypto crap and overpriced cards.


I think the limiting factor is going to be when people start tripping breakers in their houses, lol. Until then, I'll take as much power as they can give, as long as we can cool it.
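That breaker point is closer than it sounds. A rough sketch, assuming a North American 120V/15A circuit and the common 80% rule of thumb for continuous loads (the component wattages are illustrative, and anything else on the same circuit eats into the headroom):

```python
# Estimate how much continuous headroom a gaming PC leaves on a
# typical household circuit. Assumes 120 V / 15 A (North America)
# and the common 80% guideline for continuous loads.
VOLTS = 120
BREAKER_AMPS = 15
CONTINUOUS_FRACTION = 0.8

usable_w = VOLTS * BREAKER_AMPS * CONTINUOUS_FRACTION
system_w = 450 + 250 + 150  # GPU + CPU + rest of system (illustrative)

print(f"Usable continuous wattage: {usable_w:.0f} W")
print(f"Example system draw:       {system_w} W")
print(f"Headroom left:             {usable_w - system_w:.0f} W")
```

Add a monitor, lights, and anything else sharing that circuit, and a high-end rig plus another rumored power bump gets uncomfortably close.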


6 hours ago, Blademaster91 said:

Although most people probably don't care, as an enormous GPU with a 1600W PSU is something gamers can brag about. If I plan on upgrading I might vote with my wallet and go with an AMD card; the RX 7000 series is rumored to be much more efficient. But the GPU market is almost a monopoly and people will still buy Nvidia cards, even though they screwed over customers with all the crypto crap and overpriced cards.

Nvidia would certainly want it to be a monopoly. If AMD does pull off the same performance at a lower power target, I wonder what the excuse will be for people still using Nvidia in the enthusiast space. I'm assuming RT or DLSS or something.

