
Nvidia Expects GPU Shortages to Begin Receding in Mid-2022, Amid a Three-Year High in GPU Shipments

Lightwreather

Summary

As initially reported by PCMag, Nvidia believes it will counteract the GPU shortage to a large degree by mid-2022.

 

Quotes

Quote

Nvidia's Chief Financial Officer Colette Kress commented at the UBS Global TMT conference yesterday that supply will begin to improve in the second half of 2022.

"The company as a whole will take the appropriate work to continue to procure more supply," Kress said. She also said that Nvidia continues to increase supply during the shortage, "We've been able to grow quite well during this year, each quarter, sequentially growing. And we do continue to plan to do that for Q4."

"So, we believe we will be in a better situation in terms of supply when we look at the second half of next year," Kress said.

Kress said Nvidia has poured billions of dollars into long-term manufacturing capacity commitments. Unfortunately, she didn't say which fabs Nvidia made those deals with.

That said, Kress also noted that it's unclear when the graphics card market will reach a stabilization point, where supply fully meets demand. She said she'll have to wait until after the holidays to see how inventory fares, so even Nvidia won't really know whether the market is stabilizing until early next year.

 

My thoughts

This is great news. We might finally see an end to the "Great GPU Shortage of the 21st century," as I've dubbed it. However, this comment by the CFO indicates that we might be waiting a bit longer than the end of July next year (unlike what the title implies). Sure, we will have better supply in the second half of the year (which could be literally any month in that period), but if demand isn't fully met, then it's still eh..... Here's to hoping that we can get GPUs at or below MSRP before June.

Also, in somewhat related news, we've reached a three-year high in terms of GPUs sold.

Sources

Tom's Hardware

PCMag (UK)

"The most important step a man can take. It’s not the first one, is it?
It’s the next one. Always the next step, Dalinar."
–Chapter 118, Oathbringer, Stormlight Archive #3 by Brandon Sanderson

 

 

Older stuff:

Spoiler

"A high ideal missed by a little, is far better than low ideal that is achievable, yet far less effective"

 

If you think I'm wrong, correct me. If I've offended you in some way tell me what it is and how I can correct it. I want to learn, and along the way one can make mistakes; Being wrong helps you learn what's right.

 


That's hilarious. The average price of GPUs still hasn't dropped even a bit, so I'm skeptical that by mid-2022, essentially in 5-6 months, we'll see cards dropping in price. Maybe if new GPUs get announced by then, since by mid-2022 it'll be nearly 2 years since the release of the 30 series, people won't be inclined to buy GPUs so stock recovers somewhat?


Lol, I'll believe it when I see it. A year ago they also said it was going to last until summer, then they said Q4, now they're saying well into 2022; next it's going to be "until further notice"...

If someone did not use reason to reach their conclusion in the first place, you cannot use reason to convince them otherwise.


Sooooo... the supply will get better just as they prepare to launch the 4000 series cards? So, just in time for them to become "obsolete"?

 

Truly, Jensen's magnanimity knows no bounds /s

 

Okay, so any improvement in the situation is a win at this point, but the timing would certainly be... a bit of a coincidence.


1 minute ago, Rauten said:

Sooooo... the supply will get better just as they prepare to launch the 4000 series cards? So, just in time for them to become "obsolete"?

 

Truly, Jensen's magnanimity knows no bounds /s

 

Okay, so any improvement in the situation is a win at this point, but the timing would certainly be... a bit of a coincidence.

Oh dear.

I completely forgot that a 4000 series might even exist around that time........

:facepalm:


 


And yet I guarantee they will hold onto their new pricing tiers, introduced with the 20 series and cemented in with the 30 series, even once availability is back to normal. Even if retailers go back down to MSRP, the MSRPs set by Nvidia and AMD were raised in 2018.

 

MSRP:

x60 cards that used to be ~$200-250, since the 20 series are now $300-330

x70 cards that used to be ~$330-380, since the 20 series are now ~$500

x80ti cards that used to be ~$650-700, since the 20 series are now $1000+
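Using the ranges above (the poster's figures, not official Nvidia pricing), the bump works out to roughly +40% on the x60/x70 tiers and close to +50% on the x80 Ti; a quick midpoint-to-midpoint sketch:

```python
# Rough arithmetic on the quoted MSRP ranges: compare the midpoint of each
# pre-20-series range against the midpoint of the post-20-series range.
# All figures come from the post above, not from Nvidia.

TIERS = {
    "x60":   ((200, 250), (300, 330)),
    "x70":   ((330, 380), (500, 500)),
    "x80ti": ((650, 700), (1000, 1000)),
}

def midpoint(lo_hi):
    lo, hi = lo_hi
    return (lo + hi) / 2

for tier, (old, new) in TIERS.items():
    pct = (midpoint(new) / midpoint(old) - 1) * 100
    # x60: ~+40%, x70: ~+41%, x80ti: ~+48%
    print(f"{tier}: ${midpoint(old):.0f} -> ${midpoint(new):.0f} (+{pct:.0f}%)")
```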

 

 

CPU: Intel i7 3930k w/OC & EK Supremacy EVO Block | Motherboard: Asus P9x79 Pro  | RAM: G.Skill 4x4 1866 CL9 | PSU: Seasonic Platinum 1000w Corsair RM 750w Gold (2021)|

VDU: Panasonic 42" Plasma | GPU: Gigabyte 1080ti Gaming OC & Barrow Block (RIP)...GTX 980ti | Sound: Asus Xonar D2X - Z5500 -FiiO X3K DAP/DAC - ATH-M50S | Case: Phantek Enthoo Primo White |

Storage: Samsung 850 Pro 1TB SSD + WD Blue 1TB SSD | Cooling: XSPC D5 Photon 270 Res & Pump | 2x XSPC AX240 White Rads | NexXxos Monsta 80x240 Rad P/P | NF-A12x25 fans |


4 minutes ago, J-from-Nucleon said:

Oh dear.

I completely forgot that a 4000 series might even exist around that time........

:facepalm:

I think they recently semi-confirmed their plans for it? Late 2022, so probably around November again.

GN talked about it in their last Hardware News recap.


Would be something if the 4000 series ended up being clock-bumped rebrands at MSRPs reflecting current market rates.  Not as though Nvidia hadn’t done this before. The 700 series was mostly rebrands. 

My eyes see the past…

My camera lens sees the present…


28 minutes ago, Zodiark1593 said:

Would be something if the 4000 series ended up being clock-bumped rebrands at MSRPs reflecting current market rates.  Not as though Nvidia hadn’t done this before. The 700 series was mostly rebrands. 

No.

Nvidia has Lovelace; it's probably already working on production in some form. It would be more of an issue for them to delay the architecture, and it would piss off users if the "new" generation is only 5% faster for the same price.

I could use some help with this!

please, pm me if you would like to contribute to my gpu bios database (includes overclocking bios, stock bios, and upgrades to gpus via modding)

Bios database

My beautiful, but not that powerful, main PC:

prior build:

Spoiler

 

 


"Counteract the GPU shortage to a large degree" by selling GPUs nobody would want to buy at that price, leaving those who wanted the performance of the older generations in the dust and still short of "consumer" GPUs.

Also, with how quickly Nvidia is moving on to 4000 cards, and maybe shifting production to certain 2000 cards, the recent cards just stay sold out unless you want to pay a lot or want a 3090.


41 minutes ago, Zodiark1593 said:

Would be something if the 4000 series ended up being clock-bumped rebrands at MSRPs reflecting current market rates.  Not as though Nvidia hadn’t done this before. The 700 series was mostly rebrands. 

It won't be Nvidia in charge of the real pricing (as opposed to MSRP). Right now, GPUs are priced based on their crypto ROI. So if the 4000 series offered a hypothetical 2x in performance over the 3000 series, then expect a proportional 2x in scalping price relative to the 3000 series. And that's all assuming current cryptocurrency valuations don't fall any further.
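The "priced by crypto ROI" argument is basically payback-period arithmetic: if miners price a card by how fast it pays for itself, the street price scales roughly with net mining revenue. A minimal sketch, with all dollar figures and the payback period invented purely for illustration:

```python
# Hypothetical ROI-driven pricing sketch. A miner willing to wait
# `target_payback_days` to break even will pay up to daily profit times
# that many days. Doubling net revenue then doubles the implied price.

def implied_price(daily_revenue_usd: float, daily_power_cost_usd: float,
                  target_payback_days: float) -> float:
    """Price at which the card breaks even in `target_payback_days`."""
    daily_profit = daily_revenue_usd - daily_power_cost_usd
    return daily_profit * target_payback_days

# Made-up numbers: current-gen card netting $4/day, 250-day payback target.
gen_now = implied_price(5.0, 1.0, 250)    # 1000.0
# Hypothetical next-gen card with ~2x hashrate and a bit more power draw.
gen_next = implied_price(10.0, 1.5, 250)  # 2125.0

print(gen_now, gen_next, gen_next / gen_now)
```

Note the ratio comes out slightly above 2x here because power draw doesn't scale as fast as revenue in this toy example.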


2 hours ago, AndreiArgeanu said:

That's hilarious. The average price of GPUs still hasn't dropped even a bit, so I'm skeptical that by mid-2022, essentially in 5-6 months, we'll see cards dropping in price.

Why? It happens regularly. Just a few years ago there was a GPU shortage, and a few months later an abundance of GPUs so big that Nvidia was struggling to manage the huge surplus of cards it had and couldn't sell.


It might change, as it has multiple times before, but the flagship GPU crypto, Ethereum, is aiming to go PoS (proof-of-stake) mid-2022. If that significantly reduces mining interest, supply will ease anyway. In the GPU space, is anything else even close to it in profitability? Miners in low-power-cost areas might switch to alternatives, but it might force higher-power-cost miners to drop out or move on.

 

We will also have Intel as a 3rd player in the dGPU market by that point. So there's a lot of unknowns.

 

If we assume that we swing from a shortage to a surplus of GPUs on the market, pricing will drop for sure, but I still wouldn't expect them to return to Ampere launch levels. The world has changed since then. However there is still scope for much better pricing than today.

Gaming system: R7 7800X3D, Asus ROG Strix B650E-F Gaming Wifi, Thermalright Phantom Spirit 120 SE ARGB, Corsair Vengeance 2x 32GB 6000C30, RTX 4070, MSI MPG A850G, Fractal Design North, Samsung 990 Pro 2TB, Alienware AW3225QF (32" 240 Hz OLED)
Productivity system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, 64GB ram (mixed), RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, iiyama ProLite XU2793QSU-B6 (27" 1440p 100 Hz)
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible


The pandemic and the mining surge since early 2020 have kind of settled in as the norm. If Nvidia were to try to normalize the MSRP trend, they would need to supply the barren shelves of a lot of retailers and websites. Such is the state that even used, mined RX 570/470s are selling like gold, and a secondary scalper market has sprung up selling them near or above original MSRP in some parts of the world.

If supply isn't restored, the lowest common denominator of available cards will continue to be those RX 400/500 series and GTX 900 series parts. Developers would then need to accommodate that vast majority, and we are less likely to see major game improvements (if we're talking about gaming). We are into year two of the console cycle and are still waiting for announcements of games that blow our minds the way the Uncharted 4 trailers and gameplay did in 2014 and early 2015. God of War Ragnarok and Horizon Forbidden West still seem like polished PS4 games.


On 12/8/2021 at 2:19 AM, SolarNova said:

And yet I guarantee they will hold onto their new pricing tiers, introduced with the 20 series and cemented in with the 30 series, even once availability is back to normal. Even if retailers go back down to MSRP, the MSRPs set by Nvidia and AMD were raised in 2018.

 

MSRP:

x60 cards that used to be ~$200-250, since the 20 series are now $300-330

x70 cards that used to be ~$330-380, since the 20 series are now ~$500

x80ti cards that used to be ~$650-700, since the 20 series are now $1000+

 

 

I think it's fairly widely believed that the 4000 series will see a (significant) bump in MSRP, according to tech rumors, so holding onto those previous MSRPs would be a win in my books, especially if we are able to buy cards close to MSRP.


On 12/7/2021 at 7:26 AM, AndreiArgeanu said:

That's hilarious. The average price of GPUs still hasn't dropped even a bit, so I'm skeptical that by mid-2022, essentially in 5-6 months, we'll see cards dropping in price. Maybe if new GPUs get announced by then, since by mid-2022 it'll be nearly 2 years since the release of the 30 series, people won't be inclined to buy GPUs so stock recovers somewhat?

 

That's because the logistics problems, scalpers, and crypto-miners made a perfect storm that drove prices to ridiculous levels. Nobody should be paying more than 10% over MSRP, never mind a 300% markup.

 

So do we expect Nvidia to make the same mistake they made with the 20 series, where they produced a lot more cards at a higher price and the bottom fell out of the market when the crypto-miners didn't buy them?

 

Chances are no. I think what we will see is Nvidia producing more high-end, Founders Edition parts from the beginning, and probably cutting the 4050 out of the product lineup entirely.

 

There's not much point producing GPU parts at low performance tiers when iGPUs have pretty much hit the same performance tier. Current 12th-gen G7 80EU Xe iGPU parts match the performance of a 1050 Ti. So look at where the ball is going: by the time a 4050 comes out, Intel's 13th- or 14th-gen parts will be out, and likewise their dedicated GPU cards.

 

Even for CUDA-only use, the x50-tier parts have been nothing short of useless, and their only benefit over the iGPU has been that they can run the CUDA program at all. But why would you opt for an x50 part that takes 2 minutes to execute the program when you can get an x70/x80 part that takes 2 seconds?

 

 


At this point, we're a Twitter post away from a GPU surplus, so who knows.

 

However, the manufacturers might have data to back the theory that they'll be catching up.

 

Game development is slow too, your 2080 is still useable and any 30 series card will be useable for several years. 


1 minute ago, Heliian said:

Game development is slow too, your 2080 is still useable and any 30 series card will be useable for several years. 

Until they don't. Everything is going RTX and 8K to 12K textures, with new power demands and systems slapped onto newer cards while the older ones are discontinued.

We just have to see how they play their cards and how long this sort of hardware lasts. Game development isn't exactly slow either.

But like you say, while it might still be "usable" for some years, it can quickly fall behind the performance one wanted out of a higher-end card.


8 minutes ago, Quackers101 said:

Until they don't. Everything is going RTX and 8K to 12K textures, with new power demands and systems slapped onto newer cards while the older ones are discontinued.

We just have to see how they play their cards and how long this sort of hardware lasts. Game development isn't exactly slow either.

But like you say, while it might still be "usable" for some years, it can quickly fall behind the performance one wanted out of a higher-end card.

They're going to need to make the systems more efficient eventually.  

 

I'm down with RTX, but 8K and 12K are commercial-level shit at this point and won't be mainstream until 4K is mainstream and outdated.


3 minutes ago, Heliian said:

They're going to need to make the systems more efficient eventually.  

 

I'm down with RTX, but 8K and 12K are commercial-level shit at this point and won't be mainstream until 4K is mainstream and outdated.

 

Nah. At this point 4K gaming is going to become the standard as 4K monitors end up being the only thing you can buy. It's been a bit of a chicken-and-egg problem for a while: you could use a 4K monitor with a GTX 760, but you wouldn't be able to play anything on it except in a 1080p window. The GTX 1080 works fine at 4K, but most games released after the GTX 1080, when run at 4K, only hit 4Kp30/4Kp48, not 4Kp60. And if you try to play something that has an actual high-resolution texture pack, like Final Fantasy XV, it just falls over.

So if a GTX 1080 and an RTX 3060 are roughly the same performance, then that means we're pretty close to the point where 4K monitors should also be standard.

 

 


2 minutes ago, Kisai said:

pretty close to the point that 4K monitors should also be standard.

It should be, but adoption isn't widespread yet.


On 12/7/2021 at 5:12 PM, Zodiark1593 said:

Would be something if the 4000 series ended up being clock-bumped rebrands at MSRPs reflecting current market rates.  Not as though Nvidia hadn’t done this before. The 700 series was mostly rebrands. 

Correct me if I'm wrong, but that doesn't seem like the reasonable thing to do, imo.

 

 

If they can put something new out that gets them more dies from the same wafer at the same or better performance, and sell it for the same or a higher price because it will sell out instantly anyway, then they only lose out if they don't take advantage of this.


14 minutes ago, WereCat said:

If they can put something new out that gets them more dies from the same wafer at the same or better performance, and sell it for the same or a higher price because it will sell out instantly anyway, then they only lose out if they don't take advantage of this.

Agreed; like they said on the WAN Show, there's nothing wrong with Nvidia pushing forward, hopefully with new and fun tech.

It's just that between all the **** Nvidia has done the last 2 years or so, from creating mining-only cards, to seemingly holding some GPUs back, to various new releases when they can't even supply their previous ones... then again, I guess that's just team green for us.


Recent LTT vids on the RTX 2060 got my hopes up.

 

I think Nvidia is going to ramp up old but fairly capable GPUs that are, at the same time, not very profitable to mine, like the 2060.


1 hour ago, Kisai said:

There's not much point producing GPU parts at low performance tiers when iGPUs have pretty much hit the same performance tier. Current 12th-gen G7 80EU Xe iGPU parts match the performance of a 1050 Ti. So look at where the ball is going: by the time a 4050 comes out, Intel's 13th- or 14th-gen parts will be out, and likewise their dedicated GPU cards.

There's still a huge gap between today's best iGPUs and what a desktop 3050 would be, which doesn't exist yet; the 2060 12GB IMO would take that position. Intel, I think, have said they will continue to put lower EU counts in desktop CPUs, since people who care about performance would use a dGPU. The high-EU parts will be mobile only.

 

It is a bit hard to say exactly what is or isn't good enough for a gaming GPU, since there are different expectations of performance. IMO a 1050 has fallen below the acceptability level for modern AAA titles, which I define as a 1080p 60 fps average at any quality setting. You're likely going to have to go to low settings, and even then you may struggle to get 60 fps.

 

1 hour ago, Kisai said:

Even for CUDA-only use, the x50-tier parts have been nothing short of useless, and their only benefit over the iGPU has been that they can run the CUDA program at all. But why would you opt for an x50 part that takes 2 minutes to execute the program when you can get an x70/x80 part that takes 2 seconds?

Because it isn't a 60x difference between them. In the Pascal era, the 1070 is less than 4x the FLOPS of a 1050. In Turing, a 2070 is around 2.5x a 1650. Of course, I'm assuming it is purely FLOPS-limited and not constrained by something like VRAM. There will be variations depending on the specific workload.
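Those ratios line up with the published FP32 spec-sheet numbers. A quick check, using approximate boost-clock TFLOPS figures from public spec sheets (and remembering that raw FLOPS ignores memory bandwidth and VRAM capacity differences):

```python
# Approximate FP32 throughput (TFLOPS) from public spec sheets, backing
# the "it isn't a 60x difference" point. Real workloads will deviate.

SPECS_TFLOPS = {
    "GTX 1050": 1.86,
    "GTX 1070": 6.46,
    "GTX 1650": 2.98,
    "RTX 2070": 7.46,
}

def ratio(a: str, b: str) -> float:
    """How many times more FP32 FLOPS card `a` has over card `b`."""
    return SPECS_TFLOPS[a] / SPECS_TFLOPS[b]

print(f"1070 vs 1050: {ratio('GTX 1070', 'GTX 1050'):.1f}x")  # ~3.5x
print(f"2070 vs 1650: {ratio('RTX 2070', 'GTX 1650'):.1f}x")  # ~2.5x
```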

 

At least on Nvidia's side, I've seen big jumps in compute performance between generations since Maxwell. For the same gaming performance, compute is usually about a tier above per generation. I'm in a prime-number-finding challenge elsewhere at the moment with two GPUs on it, a 2070 and a 1080 Ti, both at about a 70% power limit, and they're doing work at the same rate. Generational changes could make a 3050 much faster than a 1070, even if the two would be comparable in gaming performance. I think support for new (smaller) data types could make newer-generation GPUs a lot faster for machine-learning use cases, but it isn't an area I focus on myself.

 

1 hour ago, Quackers101 said:

Until they don't. Everything is going RTX and 8K to 12K textures.

PC gaming has always scaled across a wide range of hardware. There will be limits on the lowest supported configuration, but for at least the life of the next generation (3 years from now: 1 more year for the current gen, 2 for the next) no one will be forced to use "8K textures" or have nothing. As much as I believe RTX is the future, the practical implementations seen to date still offer great visual quality without it. I'm still using a 1080 Ti in one system, and it's still fine for high-level 1440p gaming.

 

44 minutes ago, Kisai said:

So if a GTX 1080 and an RTX 3060 are roughly the same performance, then that means we're pretty close to the point where 4K monitors should also be standard.

We have one difference today that wasn't the case when the 1080 was current: upscaling. Putting aside arguments on the merits of actual implementations, I would hope we get better upscaling options included with future games, which will help take some load off "needing" high-end GPUs for 4K gaming. It's a trick consoles have used for a long time.


