
Desktop GPU sales hit 20-year low, down 42 percent from last year.

Vordreller

Summary

Desktop GPU sales have hit a 20-year low, with 42 percent fewer units shipped than last year.

 

Quotes

Quote

The industry shipped around 6.9 million standalone graphics boards for desktop PCs — including the best graphics cards for gaming — and a similar number of discrete GPUs for notebooks in the third quarter. In total, AMD, Intel, and Nvidia shipped around 14 million standalone graphics processors for desktops and laptops, down 42% year-over-year based on data from JPR. Meanwhile, shipments of integrated GPUs totaled around 61.5 million units in Q3 2022.
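A quick sanity check on the quoted figures (not from the article; just back-of-envelope arithmetic on the numbers above):

```python
# Rough arithmetic on the quoted JPR Q3 2022 figures (illustrative only).
current_discrete_m = 14.0   # desktop + notebook discrete GPUs shipped, millions
yoy_decline = 0.42          # 42% year-over-year drop

# Implied shipments one year earlier: current / (1 - decline)
prior_year_m = current_discrete_m / (1 - yoy_decline)
print(f"Implied Q3 2021 discrete shipments: ~{prior_year_m:.1f}M")  # ~24.1M

# Discrete cards as a share of all GPU shipments that quarter
integrated_m = 61.5
discrete_share = current_discrete_m / (current_discrete_m + integrated_m)
print(f"Discrete share of total GPU shipments: ~{discrete_share:.0%}")  # ~19%
```

So the same market moved roughly 24 million discrete units a year earlier, and discrete cards are now only about a fifth of total GPU shipments once integrated graphics are counted.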

 

My thoughts

Linking this to the increased prices... this feels really bad. It makes it seem as if the price increase was done not just because of cost, but because they noticed they'd get fewer sales and so upped the price to get more income, rather than just accepting the reality of the market situation.

 

We've been hearing for months now that prices are high because of crypto miners. So, supply and demand: demand goes up while supply stays the same -> prices go up.

 

But now it has become public that the number of units purchased has fallen. Which means less of the supply has been bought up. Which means demand wasn't actually as high as we were made to believe. And thus, the justification for the high prices was bullshit all along.

 

Companies that sell less make less profit. If demand drops, units sold drop, and the company's income drops. It sounds to me like they artificially claimed that demand was high so as to justify raising prices and thus keep their expected income somewhat stable in the face of falling demand.
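The revenue logic above can be illustrated with toy numbers (all figures hypothetical, chosen only to show the arithmetic, not real market data):

```python
# Hypothetical example: can a price hike offset a 42% drop in units sold?
# None of these prices or volumes are real.
units_before, price_before = 1_000_000, 500      # baseline year
units_after = int(units_before * (1 - 0.42))     # 42% fewer units -> 580,000
price_after = 700                                # 40% price increase

revenue_before = units_before * price_before     # 500,000,000
revenue_after = units_after * price_after        # 406,000,000

# Even a 40% price hike doesn't fully offset a 42% unit decline here;
# it does, however, soften the fall (revenue down ~19% instead of ~42%).
print(revenue_before, revenue_after)
```

In this toy scenario the price increase can't fully preserve revenue, but it cushions the drop considerably, which is consistent with the "keep income somewhat stable" reading.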

 

Is that incorrect?

 

 

 

Sources

https://www.tomshardware.com/news/sales-of-desktop-graphics-cards-hit-20-year-low


5 minutes ago, Vordreller said:

Companies that sell less make less profit. If demand drops, units sold drop, and the company's income drops. It sounds to me like they artificially claimed that demand was high so as to justify raising prices and thus keep their expected income somewhat stable in the face of falling demand.

My take is that during the pandemic, Nvidia learned that people will buy their GPUs at scalper prices. Because of this, Nvidia increased the price themselves because they figured they could make more money per card without necessarily shipping hardware that justified the higher prices.

 

I guess people aren't as willing to pay scalper prices as Nvidia thought they were.

Quote or tag me (@Crunchy Dragon) if you want me to see your reply

If a post solved your problem/answered your question, please consider marking it as "solved"

Community Standards // Join Floatplane!


Here's hoping this drives prices down instead of up... But knowing Nvidia's upward pricing trend of the last decade... I ain't holding my breath.

CPU: AMD Ryzen 3700x / GPU: Asus Radeon RX 6750XT OC 12GB / RAM: Corsair Vengeance LPX 2x8GB DDR4-3200
MOBO: MSI B450m Gaming Plus / NVME: Corsair MP510 240GB / Case: TT Core v21 / PSU: Seasonic 750W / OS: Win 10 Pro


Unfortunately, Arc GPUs aren't mature enough for most users' daily gaming.

I tried AMD once, had driver issues in 3/4 games I was trying to play (5700XT, about half a year after release)

Because of those two things, I'd still grab Nvidia off the shelf, but I can't convince myself to pay more than $500 for a GPU. In theory I should; I'd love to play more PC games on my OLED, but my PS5 does more than well enough for sightseeing games, so a 3060 (paired with a 240Hz 1080p BenQ) stays in my rig for CS:GO, older games, and the occasional game that isn't released on PS.


Just now, Colty said:

Unfortunately, Arc GPUs aren't mature enough for most users' daily gaming.

I tried AMD once, had driver issues in 3/4 games I was trying to play (5700XT, about half a year after release)

Because of those two things, I'd still grab Nvidia off the shelf, but I can't convince myself to pay more than $500 for a GPU. In theory I should; I'd love to play more PC games on my OLED, but my PS5 does more than well enough for sightseeing games, so a 3060 (paired with a 240Hz 1080p BenQ) stays in my rig for CS:GO, older games, and the occasional game that isn't released on PS.

I think as long as Intel sticks with it, they could make a pretty competitive GPU in the next 3 years or so.

 

I buy Nvidia purely because of their software and drivers at this point, but fortunately I'm not enough of a gamer to need the latest and greatest things. I just recently upgraded to a 2070 Super.

 

Artificial intelligence and machine learning might be what eventually pushes me over to buy the new Nvidia cards, but I'm not there yet.


23 minutes ago, Vordreller said:

But now it has become public that the number of units purchased has fallen. Which means less of the supply has been bought up. Which means demand wasn't actually as high as we were made to believe. And thus, the justification for the high prices was bullshit all along.

The discussion has been about price to the end user; while MSRPs for cards were high, getting them at those prices was very hard, and instead you'd have to pay significantly more due to scalpers and miners driving demand. What we're seeing now is an increase in MSRP; this is likely caused by increased production costs, since the entire industry has been suffering from supply shortages. Nvidia could either take a hit to their margins or wind down production to match the demand.

Don't ask to ask, just ask... please 🤨

sudo chmod -R 000 /*


35 minutes ago, Vordreller said:

Linking this to the increased prices... this feels really bad. It makes it seem as if the price increase was done not just because of cost, but because they noticed they'd get fewer sales and so upped the price to get more income, rather than just accepting the reality of the market situation.

Or... they're selling less because the price got increased?

 

I think they simply got used to customers paying whatever, so they created a GPU for that market. Turns out, by the time their design is done, that market (miners) is no longer there. And gamers (at least some of them), no longer faced with a shortage, apparently aren't willing to pay these prices.

Remember to either quote or @mention others, so they are notified of your reply


I wonder how many read the articles regarding this, versus just jumping to conclusions based on their preconceived notions about the situation.

 

People are spending less time at home, so fewer people are buying gaming hardware.

With ever-increasing inflation and many people facing uncertain financial futures, people are less likely to spend money on non-essentials like gaming.

This is happening across the industry, not just Nvidia and their gaming graphics cards.

A large number of people bought computers at the start of and during the pandemic. As a result, much of that demand was pulled forward, and those sales aren't recurring now.

 

 

There are several other factors at play as well, and this is not something that is exclusive to the GPU market.

It's very rare for such a big change to the market to be caused by a single thing. It's a very complex system with lots of different variables.


49 minutes ago, Vordreller said:

Desktop GPU sales have hit a 20-year low, with 42 percent fewer units shipped than last year.

People bought GPUs in droves during the pandemic. We saw this with the sales numbers these companies were quoting - Nvidia's sales figures throughout the pandemic were really good.

 

These GPUs don't need replacing yet. So people aren't.

CPU: i7 4790k, RAM: 16GB DDR3, GPU: GTX 1060 6GB


We did have massive sales in past years, leading to shortages and elevated pricing. I do wonder: if we specifically exclude the crypto boom times, how do sales now compare? We can't rule out the macroeconomic environment either. New GPUs may be a lower priority for discretionary spending for many, as everything else is getting more expensive too. Companies, Nvidia included, will be aware of this and will try to avoid overproducing.

Main system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, Corsair Vengeance Pro 3200 3x 16GB 2R, RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible


34 minutes ago, porina said:

I do wonder: if we specifically exclude the crypto boom times, how do sales now compare?

I imagine overall GPU sales are nowhere near as bad as the official numbers suggest, but that people are going for used cards rather than new ones, because there are plenty around and the value is so much better.

 

Right now you can get 3090s for ~£800 on eBay, which is great value compared to the £1200 4080. Alternatively, second-hand 3080s aren't too hard to find in the £550-600 range, which is about what 3070s are selling for new.


1 hour ago, Crunchy Dragon said:

Artificial intelligence and machine learning might be what eventually pushes me over to buy the new Nvidia cards, but I'm not there yet.

Your 2070 Super should be more than enough for most basic to intermediate stuff. By the time you feel the need for a more powerful GPU (and it'll likely be due to VRAM limits), you could go for a used 3090 for cheap or just use a cloud instance.

FX6300 @ 4.2GHz | Gigabyte GA-78LMT-USB3 R2 | Hyper 212x | 3x 8GB + 1x 4GB @ 1600MHz | Gigabyte 2060 Super | Corsair CX650M | LG 43UK6520PSA
ASUS X550LN | i5 4210u | 12GB
Lenovo N23 Yoga


3 minutes ago, tim0901 said:

I imagine overall GPU sales are nowhere near as bad as the official numbers suggest, but that people are going for used cards rather than new ones, because there are plenty around and the value is so much better.

For sure, used GPUs can offer substantial value for people prepared to go that route, but personally I'm not comfortable buying something high-cost used from sources I don't know. Especially when there's a fair chance it has spent the last two years mining heavily, regardless of what the seller puts on the listing. I did get a used 2080 Ti when it became clear that my chances of getting a 3080 around launch weren't looking great, but I moved it on again as soon as I got a 3070 new.

 

Right now I look at the GPU market and think: while I wouldn't mind more performance than a 3070, nothing offers a clear performance improvement at a price I'm willing to spend. I'm not spending current 40-series money. I'm not paying the current rate for a new 30 series. AMD remain as irrelevant as ever. The way things are going, I'm not hopeful that the lower 40 series will offer sufficient reason to upgrade. I might sit it out until the 50 series or Arc B comes out. Even if my 3070 dies, I still have a spare 2070 and 1080 Ti I could use as placeholders rather than pay for a new GPU now.

 

Hmm... looking at what I wrote, if I'm at all representative of part of the market, that might go some way to explaining the topic of this thread!


Uh ya, GPU mining is essentially dead, the used market is flooded, people weren't going to wait two years for an affordable new GPU, and now MSRPs are high.

 

Also, you don't need to buy a new GPU every year, and Nvidia already drove away anyone who was interested in getting into it; they're not likely to bother again.


It would be interesting to see these stats over, let's say, a ten-year period, just to see the demand trend and what sort of impact the pandemic and crypto boom actually had on sales/demand.

 

Just looking back over a year, or even 2-4 years, probably doesn't give enough info as to whether things have "returned to normal".


36 minutes ago, Heliian said:

you don't need to buy a new GPU every year

In the longer term there should be an ongoing upgrade cycle. Nvidia has been releasing generations on a roughly two-year cycle, with AMD being more erratic. There may be a bit of a rush around the time of each release.

 

6 minutes ago, SADS said:

It would be interesting to see these stats over a, lets say, ten year period. Just to see what the demand was like as a trend and to see what the sort of impact the pandemic and crypto boom actually had to sales/demand.

Look at the source link; there are charts going back many years. Most interesting is AMD falling significantly; my prediction that Intel would overtake them for 2nd place could come true a lot earlier than I thought. Then again, that data was from before the RDNA3 launch, so that should give AMD an uptick for now and hold off Intel for a bit longer.


20 minutes ago, porina said:

In the longer term there should be an ongoing upgrade cycle. Nvidia has been releasing generations on a roughly two-year cycle

Well, in the past GPUs usually got significantly more powerful with each generation, and monitor resolutions were increasing rapidly.

But for the past few generations, monitor resolutions have stopped increasing and GPUs have become good enough to play at high settings at 4K, so people have less need to upgrade. Even less so for people playing at 1080p or 1440p.


Also, GPUs from past generations are still extremely powerful. My 3070 already crushes 4K gaming, and at the distance I sit from my monitor I don't even need to play in 4K.

 


1 hour ago, Gaires said:

But for the past few generations, monitor resolutions have stopped increasing and GPUs have become good enough to play at high settings at 4K, so people have less need to upgrade. Even less so for people playing at 1080p or 1440p.

I'd kinda agree that at 1440p or lower, mid-range GPUs like a 3060 or equivalent are sufficient for a high-end experience. But for a great 4K experience I still think a lot of performance-per-price improvement is needed. For native rendering, only the 4090 makes it so you don't have to worry about performance, with lesser GPUs still needing upscaling technologies and/or reduced settings to help out.

 

59 minutes ago, Shreyas1 said:

Also, GPUs from past generations are still extremely powerful. My 3070 already crushes 4K gaming, and at the distance I sit from my monitor I don't even need to play in 4K.

Guess it depends on expectations, but I've used a 3070 with a 4K TV, and I consider it entry-level for that use case. For sure, it works with appropriate settings, but I think the 3080 is the >60fps high+ sweet spot. I'm temporarily not using that TV, so I'm reduced to a 1440p monitor, and in that scenario I don't find the 3070 lacking.


12 minutes ago, porina said:

 

 

Guess it depends on expectations, but I've used a 3070 with a 4K TV, and I consider it entry-level for that use case. For sure, it works with appropriate settings, but I think the 3080 is the >60fps high+ sweet spot. I'm temporarily not using that TV, so I'm reduced to a 1440p monitor, and in that scenario I don't find the 3070 lacking.

I mean, tbf, my expectations aren't the highest, but for Halo Infinite at least, I was playing 4K multiplayer yesterday and didn't notice anything off. That's a pretty recent game, and it doesn't feel awful. And again, I don't think 4K is actually needed for the majority of people (myself included) who sit a few feet away from their monitor and don't have something the size of a TV.
 

Games like Elden Ring, which is also quite recent, run well on my Steam Deck, let alone a 3070, so no problems there either. And I barely notice that it's running at 720p 45 fps on low settings, since it's not that close to my face. Really makes me wonder whether people actually notice any difference playing at 4K when sitting a normal distance from their screen.
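Whether 4K is even distinguishable at a given viewing distance can be estimated with a pixels-per-degree calculation. This is a standard back-of-envelope method, not from any post here: the screen widths and distances below are made-up examples, and ~60 PPD is a common rule of thumb for the limit of normal visual acuity.

```python
import math

def pixels_per_degree(horizontal_px: int, screen_width_in: float,
                      viewing_distance_in: float) -> float:
    """Pixels that fall within one degree of visual angle at the viewer's eye."""
    px_per_inch = horizontal_px / screen_width_in
    # Physical width subtended by one degree of visual angle at this distance.
    inches_per_degree = 2 * viewing_distance_in * math.tan(math.radians(0.5))
    return px_per_inch * inches_per_degree

# A 27" 16:9 monitor is ~23.5" wide; a 55" TV is ~48" wide (approximate).
print(pixels_per_degree(3840, 23.5, 30))  # 4K monitor at ~2.5 ft: well above 60
print(pixels_per_degree(2560, 23.5, 30))  # 1440p, same setup: near the limit
print(pixels_per_degree(3840, 48.0, 96))  # 4K 55" TV at ~8 ft: far above 60
```

By this rough estimate, 1440p at desk distance already sits close to the acuity threshold, which is consistent with the point that 4K gains are hard to notice from a normal seating position.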

 

I also haven't really seen any mind-blowing game graphics in a while that are too intensive for modern hardware, like Crysis, or even on the level of Battlefield 1. Instead, it seems like poorly performing games are just not well optimized nowadays, rather than attempting something extremely graphically intensive.

 


So.... did I buy too early?


24 minutes ago, Shreyas1 said:

I don't think 4K is actually needed for the majority of people (myself included) who sit a few feet away from their monitor and don't have something the size of a TV. 

When using my 4K TV for gaming, I sit far enough back to see the whole thing comfortably, relatively farther than from a monitor, as I struggle to use it as a desktop even with UI scaling. There is a "depends on the game" element, as some scale better than others. The more photo-realistic the game, the lower the resolution can go without being noticeable.

 

I do feel that 4K on TVs is probably an area of growth, since many TVs have been 4K for quite a while, and it's a selling point more so for current-gen consoles.

 

24 minutes ago, Shreyas1 said:

I also haven't really seen any mind-blowing game graphics in a while that are too intensive for modern hardware, like Crysis, or even on the level of Battlefield 1. Instead, it seems like poorly performing games are just not well optimized nowadays, rather than attempting something extremely graphically intensive.

I find it's slightly older games, from say 2+ years ago, that struggle most at 4K on lesser hardware. Modern games support upscaling, which makes them much easier to drive. Older games that didn't get patched for it still need to render natively.

 

I get a feeling that "badly optimised game" is a phrase that's thrown around whenever devs do try to raise the bar and not be held back by ancient hardware. Which is not to say there aren't badly optimised games; we have plenty of recent examples of that. I do feel we have to give up on supporting older hardware if we're to make better use of newer features and bring next-level improvements forward.


I'm surprised nobody is talking about this little but equally important detail. (Bolded for emphasis)

Quote

Despite slowing demand for discrete graphics cards for desktops (unit sales were down 31.9% year-over-year), Nvidia not only managed to maintain its lead, but it actually strengthened its position with an 86% market share, its highest ever, according to JPR. By contrast, AMD's share dropped to around 10%, its lowest market share in a couple of decades. As for Intel, it managed to capture 4% of the desktop discrete GPU market in just one quarter, which is not bad at all. Of course, the majority of AIB parts that Intel sold in Q3 2022 were entry-level models, but those were demanded by Intel's customers due to brand awareness and similar factors.

Market share graph from the source article:

(image: discrete desktop GPU market share chart, JPR data)

This is bad news for everyone. Nvidia is bordering on Microsoft levels of market share (i.e. monopoly levels), and the consequences of Radeon Technology Group being unable to get their S%#@ together for years on end have finally caught up with them (for too many reasons to get into here). 

 

On topic: I'm content with running my 2060 KO into the ground, as the games I play aren't all that demanding in the first place, and the most demanding game I do have is 5-7 years old, or along those lines. I suspect (but could be wrong) that I will still be fine as I upgrade to 4K in the near future. If not, integer scaling is my friend.

Mayonnaise is an instrument!  

Current Build - Ryzen 7 3800x (eco mode enabled), MSI B550M MAG Mortar, G.Skill Ripjaws V 32 GB (2x16) 3200 14-14-14-34, EVGA 2060 KO Ultra, EVGA G2 550W, Phanteks Enthoo Pro M, Creative Sound Blaster Audigy Rx

