
Tongue-tied Ti(e) - Nvidia Allegedly Orders Partners to Halt RTX 3090 Ti Production

1 minute ago, Moonzy said:

Why? Why should the limit of high end be 250W?

 

I'm all for efficiency, but why not make a 500W GPU that performs twice as fast?

That is because you can compare the architectural improvements and efficiency at the same power consumption. Power usage shouldn't be increasing every generation, or we would have a problem.


15 minutes ago, leadeater said:

True, however the most vocal seem to be those that did previously buy a high-end GPU and now can no longer afford, or are unwilling to pay, the current market price. So while this is most definitely the situation, I don't see it changing, and these vocal ones realistically only have one choice: accept and move on. The moving on may have more than one path, but acceptance must come first.

*raises hand*

 

I do not expect to see launch 3080 pricing again, and have settled on a 3070 for now. There may be some easing of pressure throughout the year and a 3080-class GPU might come back within my biting point, or we'll repeat this all over again with the 40 series.


6 minutes ago, HorseBattery said:

That is because you can compare the architectural improvements and efficiency at the same power consumption.

Why do you need them to be at the same power consumption to measure efficiency?

 

6 minutes ago, HorseBattery said:

Power usage shouldn't be increasing every generation, or we would have a problem.

I would say if they're raising the power usage of their top-end GPU without sacrificing efficiency, that's fine, as long as there are still low-power options available

 

If someone is willing to deal with a 1000W GPU, why not let them?


Just now, Moonzy said:

Why do you need them to be at the same power consumption to measure efficiency?

 

I would say if they're raising the power usage of their top-end GPU without sacrificing efficiency of course, that's fine, as long as there are still low-power options available

That is by definition how you measure efficiency. GPUs and CPUs do more work per Watt each generation. Just throwing more power at the problem isn't an improvement and can lead to a decrease in efficiency due to higher temps and increased voltage requirements.

 

As for the low power consumption options... They just don't really exist right now. The entire stack has been moved up and the low end was simply cut. This is because of the chip shortage leading to only higher end chips being manufactured.

 


2 minutes ago, HorseBattery said:

That is by definition how you measure efficiency. GPUs and CPUs do more work per Watt each generation. Just throwing more power at the problem isn't an improvement and can lead to a decrease in efficiency due to higher temps and increased voltage requirements.

 

As for the low power consumption options... They just don't really exist right now. The entire stack has been moved up and the low end was simply cut. This is because of the chip shortage leading to only higher end chips being manufactured.

 

Getting more out of said chip is up to us, the consumers, and obviously it's wanted.

Why would you set limits?


2 minutes ago, pas008 said:

Getting more out of said chip is up to us, the consumers, and obviously it's wanted.

Why would you set limits?

 

Because, as Kisai mentioned, the actual bottleneck is the screen resolution. The vast majority of people are on 1080p and 1440p. Going beyond that is largely unnecessary, especially with DLSS/RSR/XeSS.

 


16 minutes ago, HorseBattery said:

That is by definition how you measure efficiency

...?

So, say you want to compare the efficiency of a 3070 and 3080, you either crank the power limit of the 3070 way up or tune the 3080 way down?

 

That's not how calculating efficiency works: you take the work done, divide it by the energy consumed, and you get work per unit of energy. That's efficiency.

 

If I limit the power of a 3080 to a 3070's, the 3080 will probably win in terms of efficiency by a ton, due to how the power-efficiency curve works (it's not linear).

So no, you don't need them to be at the same power consumption to measure efficiency.
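
To make the arithmetic concrete, here's a quick sketch with made-up numbers (the fps and power figures are hypothetical, not real benchmark data):

```python
# Efficiency = work done / energy used, measured at whatever power
# each card actually draws. All figures below are invented.

gpu_a = {"name": "GPU A", "fps": 100.0, "power_w": 220.0}
gpu_b = {"name": "GPU B", "fps": 140.0, "power_w": 320.0}

for gpu in (gpu_a, gpu_b):
    # fps divided by watts = frames per joule, since a watt is a
    # joule per second and the seconds cancel out.
    eff = gpu["fps"] / gpu["power_w"]
    print(f"{gpu['name']}: {eff:.3f} frames per joule")

# GPU A: ~0.455 frames per joule, GPU B: ~0.438. The lower-power card
# happens to be more efficient here, no equal-power testing required.
```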

 

16 minutes ago, HorseBattery said:

As for the low power consumption options... They just don't really exist right now. The entire stack has been moved up and the low end was simply cut. This is because of the chip shortage leading to only higher end chips being manufactured.

I'd say the performance per watt of Ampere is on par with Turing, if not slightly better at stock settings (depending on workload).

You can set a power limit on modern GPUs; I set mine to 80% for this reason.

Some people run theirs at 50-60% too.


9 minutes ago, HorseBattery said:

 

Because, as Kisai mentioned, the actual bottleneck is the screen resolution. The vast majority of people are on 1080p and 1440p. Going beyond that is largely unnecessary, especially with DLSS/RSR/XeSS.

 

OK, so you should only do what is needed?

So you have economy cars with 4 cylinders only?

You only eat 1 Dorito?

With Kisai's statement that the x60 tier is perfect, how do 1060 owners feel now? lol

I think you forget system requirements keep increasing year after year.


2 minutes ago, Moonzy said:

If I limit the power of a 3080 to a 3070's, the 3080 will probably win in terms of efficiency by a ton, due to how the power-efficiency curve works (it's not linear).

So no, you don't need them to be at the same power consumption to measure efficiency.

The efficiency curve is exactly why you have to be careful how you compare efficiency across products, unless you're only looking at single operating points for each.

 

Say GPU2 has 2x the performance at 2x the power consumption of GPU1. At that point you could say they're the same efficiency. A more common comparison is at either equal power, or equal performance. In either of those scenarios, GPU2 would likely win.
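
A toy model shows it. Assume, purely for illustration, that performance scales with the square root of power, and that GPU2 is tuned so it gives exactly 2x performance at 2x power:

```python
# Illustrative only: the sqrt performance-vs-power curve is an
# assumption chosen for the example, not measured data.

def perf(power_w: float, scale: float) -> float:
    return scale * power_w ** 0.5

S1, S2 = 10.0, 14.14  # scales chosen so GPU2 = 2x perf at 2x power

# Stock operating points: GPU1 at 200 W, GPU2 at 400 W
print(perf(200, S1), perf(400, S2))  # ~141 vs ~283: equal perf/W

# Equal power: cap GPU2 at GPU1's 200 W
print(perf(200, S2))                 # ~200, comfortably ahead of ~141

# Equal performance: power GPU2 needs to match GPU1's ~141
print((perf(200, S1) / S2) ** 2)     # ~100 W, half of GPU1's draw
```

Same efficiency at stock, yet GPU2 wins both the equal-power and the equal-performance comparison, which is exactly the curve effect at work.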

 

2 minutes ago, Moonzy said:

I'd say the performance per watt of Ampere is on par with Turing, if not slightly better (depending on workload)

Depends on use case. I've not bothered looking at it for gaming, but for multiple compute uses the gap is pretty big. My 2070 beats 1080 Ti. 3070 beats 2080 Ti. And in this use case, beat means higher performance and lower power at the same time.


30 minutes ago, Moonzy said:

Why? Why should the limit of high end be 250W?

 

I'm all for efficiency, but why not make a 500W GPU that performs twice as fast?

Why would you want a PC that doubles as a space heater? 500W for the GPU alone is ridiculous; add in about 150W for the CPU and it's literally a space heater. IMO you shouldn't reward Nvidia for ignoring efficiency and chasing 4K UlTiMaTe GaMiNG instead, a niche that the majority of gamers don't care about.


14 minutes ago, Blademaster91 said:

IMO you shouldn't reward Nvidia for ignoring efficiency

I'm not

50 minutes ago, Moonzy said:

I'm all for efficiency, but why not make a 500W GPU that performs twice as fast?

And I literally reduce my own GPU's power limit to capitalize on the sweet exponential increase in efficiency (and lower temps and longer lifespan*)

*Not really, but the chances of it dying in a given timeframe are reduced exponentially as well

 

I hate how companies are literally pumping 20% more power into a chip and only getting 5% more perf; that's why I undervolt or lower the power limit of almost everything I own, from CPU to GPU
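
Back-of-envelope on that last point (the 20% and 5% figures are the ones from this post, nothing measured):

```python
# Factory tuning: +20% power for +5% performance, expressed as perf/W.
stock_perf, stock_power = 1.00, 1.00
oc_perf, oc_power = 1.05, 1.20

change = (oc_perf / oc_power) / (stock_perf / stock_power) - 1
print(f"perf/W change: {change:.1%}")  # -12.5%
```

That last 5% of performance costs an eighth of the card's efficiency, which is exactly what undervolting claws back.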

 

14 minutes ago, Blademaster91 said:

Why would you want a PC that doubles as a space heater? 500W for the GPU alone is ridiculous; add in about 150W for the CPU and it's literally a space heater

The same question can be asked, why would anyone want anything at all?

 

My horse isn't high enough to say what's necessary or not for everyone on mother earth, but I do believe that options are important

 

10 minutes ago, leadeater said:

Please give me back a usefulness to my 1600W PSU lol

I have two 1600W PSUs and my PC is in another room; bring it on, Nvidia


3 minutes ago, leadeater said:

Please give me back a usefulness to my 1600W PSU lol

 

Peak PSU efficiency is at about 50% power draw. I only have a 650W PSU, because my system only draws about 300W under load.


2 minutes ago, HorseBattery said:

Peak PSU efficiency is at about 50% power draw. I only have a 650W PSU, because my system only draws about 300W under load.

It's actually not that simple to just say 50%, and not every PSU peaks there anyway. Also, the difference between efficiency at the peak, wherever it is, and at ~80% of rated capacity is hardly anything, especially on very good PSUs, which is what pretty much anyone with a 1600W PSU has.

 

[Attached image: efficiency-vs-load curve for the PSU being discussed]

This is a "bad" PSU; try and tell me a ~4% difference really matters at all, and the figure is smaller on better PSUs.


1 hour ago, leadeater said:

It's based on the price being how it is and how it will be for a long time, and IF it goes down, it won't be by nearly as much as I bet you are thinking or hoping.

It's based on how the market reacts to it; buying it only encourages companies to keep those prices.

1 hour ago, leadeater said:

It's not FOMO at all; it's literally pointing out that if you wait forever, you wait forever and get nothing. If you want something and can get it but refuse to buy it because it's too much, that is your choice; going ahead and buying it is not rooted in FOMO at all.

Then if it's a want and not something you need, and you're afraid it might be more expensive later, then it's definitely FOMO.

1 hour ago, leadeater said:

Do whatever you want, but I'll more than likely see you in 2 years with still no current-generation GPU because the pricing is the same as it is now; if you've been waiting 2-3 years already, then all you've done is extend your absence of having a GPU by a further 2 years, or relented and purchased a previous-generation option.

If you've looked at the Steam survey you'll see the most common card is still the GTX 1060; most people aren't on current-gen GPUs, and I don't expect that to change as long as the whole market is inflated past what most people can pay.

I'll upgrade eventually. I skipped the RTX 20 series because it wasn't worth it over the 10 series, and with the RTX 30 series going to miners I don't expect to get an RTX 3060 Ti or RTX 3070, so at this point I don't even care anymore lol.

1 hour ago, leadeater said:

Basically you haven't achieved anything other than continuing to be annoyed at the situation, and I'd ask where exactly that gets you overall?

I would have to guess that a lot of people are annoyed at the situation, but most simply cannot afford the hobby when a GPU costs what a whole midrange system should cost.

1 hour ago, leadeater said:

Not buying in the hopes and prayers that pricing will go down and you'll regret spending the money is literally also FOMO, fear of missing out on the lower price.

That isn't FOMO; again, it's simply not having the budget for it. And, though I think it's a minority, some refuse to pay up for moral reasons, not wanting to support a market with no affordable mid-range options. I think the enthusiast gamers with piles of money fail to realize that most people usually build a PC in the $600-1000 range, and there isn't any way to do that anymore unless you get really lucky.

It isn't FOMO when I've accepted that Nvidia and AMD don't care about mid-range, and that I can keep the card I have, turn a few settings down, and not worry about the latest games.

1 hour ago, Moonzy said:

pretty much why nvidia stopped giving MSRP, let market dictate the price

Except scalping isn't a market, though Nvidia seems to think so. IMO they're becoming the Apple of PC hardware, pricing things at whatever they want while people still line up and preorder the second it's released.

1 hour ago, Moonzy said:

ditto, inflation is OP for the past couple years

Inflation is caused by several factors, but GPU pricing hasn't really followed inflation; prices went way up after the GTX 10 series.


6 minutes ago, leadeater said:

It's actually not that simple to just say 50%, and not every PSU peaks there anyway. Also, the difference between efficiency at the peak, wherever it is, and at ~80% of rated capacity is hardly anything, especially on very good PSUs, which is what pretty much anyone with a 1600W PSU has.

 

This is a "bad" PSU; try and tell me a ~4% difference really matters at all, and the figure is smaller on better PSUs.

 

The main benefit of the better power supplies is less cooling required and less noise. That 4% relative to the better PSU will translate into something like 30% more heat, and the higher up the curve you go, the more heat each percentage point represents.

 

But really it is just a handy rule of thumb that allows overhead for power spikes and a bit of room for upgrades too. I could use a smaller PSU and it would be fine.
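
Rough numbers behind the heat claim above (the efficiencies are assumed for illustration; pick your own from a review):

```python
# Waste heat = input power - output power = load * (1/efficiency - 1).
# Assumed: 500 W DC load, 80% vs 84% efficient units (hypothetical).

load_w = 500.0

def waste_heat(load_w: float, efficiency: float) -> float:
    return load_w / efficiency - load_w

worse, better = waste_heat(load_w, 0.80), waste_heat(load_w, 0.84)
print(f"worse: {worse:.0f} W, better: {better:.0f} W")  # 125 W vs 95 W
print(f"relative increase: {worse / better - 1:.0%}")   # ~31% more heat
```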


15 minutes ago, HorseBattery said:

Peak PSU efficiency is at about 50% power draw. I only have a 650W PSU, because my system only draws about 300W under load.

My AX1600i runs at 94% efficiency at around 1500W sustained; I doubt running it at 50% would make much of a difference, really

 

9 minutes ago, leadeater said:

the figure is smaller on better PSUs.

You're not looking at the other solution

3000W PSU

 

the 110v gang can step aside /s


25 minutes ago, Blademaster91 said:

That isn't FOMO; again, it's simply not having the budget for it

Not if you are specifically not buying it because you refuse to pay that much, which was the point, and that is FOMO far more than my example.

 

25 minutes ago, Blademaster91 said:

Then if it's a want and not something you need, and you're afraid it might be more expensive later, then it's definitely FOMO.

If that's how you want to see it then go ahead. I see it as explicit knowledge that what I purchased would not be at that price again for a long time, if ever. I could have been salty about the situation and refused to buy it, or I could have simply bought it because it was a good deal; I did the latter. That's not FOMO, that's just taking the opportunity to accept a good deal.

 

I didn't have to buy it, just as much as you don't want to pay the current market price. There is no reason at all to try and shoehorn FOMO into this, and the biggest offenders are the ones getting all annoyed at the pricing who, essentially, proudly proclaim they refuse to buy anything even when they can, while denying the reality of the situation.


14 minutes ago, Moonzy said:

And I literally reduce my own GPU's power limit to capitalize on the sweet exponential increase in efficiency (and lower temps and longer lifespan*)

*Not really, but the chances of it dying in a given timeframe are reduced exponentially as well

 

I hate how companies are literally pumping 20% more power into a chip and only getting 5% more perf; that's why I undervolt or lower the power limit of almost everything I own, from CPU to GPU

I'm saying you shouldn't really need to undervolt a GPU yourself to get peak efficiency, even though that's been the case with the high-end RTX 30 series cards if you don't want to risk the card failing sooner.

14 minutes ago, Moonzy said:

The same question can be asked, why would anyone want anything at all?

 

My horse isn't high enough to say what's necessary or not for everyone on mother earth, but I do believe that options are important

I don't mean to be on a high horse if it sounds that way; I just don't see the point of a Ti card when the non-Ti 3090 already consumes so much power that some of them reach over 100°C at stock, on a card with a massive cooler. And Nvidia keeps adding SKUs while the 3090 was already difficult enough to find in stock.


2 minutes ago, Moonzy said:

You're not looking at the other solution

3000W PSU

 

the 110v gang can step aside /s

3000W is out of spec for 240V AC; the maximum rated continuous power on a 10A 240V socket is "only" 1600W, with 2200W for high-heat-rated sockets and cables. You have to go up to 15A to use a 3000W PSU; luckily I have two 15A sockets for my UPS 😉


1 minute ago, leadeater said:

3000W is out of spec for 240V AC; the maximum rated continuous power on a 10A 240V socket is "only" 1600W, with 2200W for high-heat-rated sockets and cables. You have to go up to 15A to use a 3000W PSU; luckily I have two 15A sockets for my UPS 😉

Come to the UK. 240V 13A as standard. That'll just get you over 3kW. Just don't look at the power bill too often if you make use of it all the time. I don't think I've ever owned a single PC (even for benching) that took over 1kW under load, unless you include an external water chiller in the equation.


1 minute ago, Blademaster91 said:

I'm saying you shouldn't really need to undervolt a GPU yourself to get peak efficiency

Well, that's what happens when companies squeeze every last drop of performance out of the chip when competition is fierce

 

Nvidia and Intel had good OC headroom back when AMD was hibernating

 

2 minutes ago, Blademaster91 said:

And Nvidia keeps adding SKUs while the 3090 was already difficult enough to find in stock.

I'd argue that if even the 3090 is sold out, that probably means there are more people willing to buy an even higher-end card than before

 

4 minutes ago, Blademaster91 said:

when the non-Ti 3090 already consumes so much power that some of them reach over 100°C at stock, on a card with a massive cooler

Possibly why it's been halted.

GDDR6X chips are nasty little buggers; manufacturers probably never really stress-tested them before releasing them into the wild, which is why they're pretty much all 100°C+ at stock settings.

Or they assumed that no one cares about memory temps and only cares about core temps.


7 minutes ago, HorseBattery said:

The main benefit of the better power supplies is less cooling required and less noise. That 4% relative to the better PSU will translate into something like 30% more heat, and the higher up the curve you go, the more heat each percentage point represents.

It still makes next to no difference; between any good Gold-rated or better PSUs, regardless of maximum rated capacity, the difference in heat generated is only going to be around 10W. A large-capacity PSU also has better passive cooling anyway, so it's not really comparable.

 

A good modern 1600W PSU won't even turn its fan on until like 800W-1000W load, while a 650W unit with the exact same efficiency rating will have its fan turn on at 325W-350W. Which PSU is quieter at 500W?


3 minutes ago, porina said:

Come to the UK. 240V 13A as standard. That'll just get you over 3kW. Just don't look at the power bill too often if you make use of it all the time. I don't think I've ever owned a single PC (even for benching) that took over 1kW under load, unless you include an external water chiller in the equation.

Nice, though you'll still need the heat-rated sockets and cables; residential power outlets aren't actually rated for 100% duty 24/7, funnily enough. I think it's around 80%, at least for the AU/NZ spec, so with a little extra safety margin that's why ~1600W is the most you see from the gamer PSU brands. The maximum you can buy from any server vendor that uses 10A IEC is 2200W, just above 90%.
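
The socket math, with the ~80% continuous derating as an assumption (check your local wiring rules):

```python
# Nameplate watts = volts * amps; continuous capacity derated to ~80%
# (assumed figure for AU/NZ-style residential circuits).

VOLTS = 240.0
for amps, label in [(10, "10A socket"), (13, "UK 13A socket"), (15, "15A socket")]:
    nameplate = VOLTS * amps
    continuous = nameplate * 0.8
    print(f"{label}: {nameplate:.0f} W nameplate, ~{continuous:.0f} W continuous")

# 10A: 2400 W nameplate, ~1920 W continuous -> gamer PSUs stop near 1600 W
# 13A: 3120 W nameplate -> porina's "just over 3 kW"
# 15A: 3600 W nameplate -> room for a 3000 W PSU
```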

