Nvidia Expects GPU Shortages to Begin Receding in Mid-2022 Amid a Three-Year High in GPU Shipments

Lightwreather
6 hours ago, porina said:

There's still a huge gap between today's best iGPUs and what a desktop 3050 would be, a card which doesn't exist yet. The 2060 12GB, IMO, would take that position. I think Intel has said they will continue to put lower EU counts in desktop CPUs, on the logic that people who care about performance will use a dGPU. The high-EU parts will be mobile only.

 

Well, it's unlikely we'll see high EU counts on a desktop CPU iGPU because it lowers yields. That said, you can see where Intel is aiming with its PCIe cards: using the Intel iGPU and dGPU in tandem to get "ahead" of an Nvidia/AMD dGPU alone, since neither of those companies' cards can use the Intel iGPU's resources, and AMD hasn't bothered with iGPUs on its mid/high-end parts.

 

But I'm of the opinion that passable 1080p60 gaming performance can be achieved on a laptop iGPU, and Intel should put those iGPU configurations in everything but the i7/i9 parts. The i7/i9 parts would do better with just the video encode/decode engines and basic safe-mode video output, since most desktop users do not use the iGPU in tandem with the dGPU the way they would on a laptop.

 

6 hours ago, porina said:

It is a bit hard to say exactly what is or isn't good enough for a gaming GPU, since there are different expectations of performance. IMO the 1050 has fallen below the acceptable level for modern AAA titles, which I define as a 1080p 60 fps average at any quality setting. You're likely going to have to drop to low settings, and even then you may struggle to hit 60 fps.

Well, the lack of PS5 supply is holding back AAA games, so anything that isn't PS5-grade hardware will give a rather poor experience with a PC port. That makes RTX 2080 performance the baseline for PS5 ports on a PC, assuming things like GPU-SSD streaming even work.

 

6 hours ago, porina said:

Because it isn't a 60x difference between them. In the Pascal era, the 1070 has less than 4x the FLOPS of a 1050. In Turing, a 2070 is around 2.5x a 1650. Of course, I'm assuming it is purely FLOPS-limited and not something like VRAM; there will be variations depending on the specific workload.

No, you're misunderstanding where I got the numbers from. For the ML stuff I've been experimenting with, I have a GTX 1050 Ti laptop, a GTX 1080 desktop, and an RTX 2070 laptop, and the same scripts and inputs produce predictable timings: the 1050 Ti with 4GB of memory takes tens of seconds, the 1080 with 8GB takes about a second, and the RTX 2070 with 8GB seems instant. Once you know what the script is actually doing, it makes sense why the RTX is essentially instant: it's using half-precision (FP16) tensors, which actually make use of the tensor cores on the RTX card. If I apply the style transfer, the 1050 Ti can't even run it, because it blows past that card's VRAM. For all intents and purposes, the 1080 and the RTX 2070 both have 8GB of memory and similar enough performance on the ML stuff, despite the work going through different logic on the GPU.
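
To illustrate why precision matters here, a minimal sketch (assuming PyTorch and a CUDA-capable card; the matrix size and iteration count are arbitrary) that times the same matrix multiply in FP32 and FP16. On an RTX card the FP16 path can run on the tensor cores, so it should come out several times faster; on a Pascal card like the 1080, which has no tensor cores, the gap is much smaller:

```python
import time
import torch

def time_matmul(dtype, n=4096, iters=20):
    """Average the wall time of an n x n matrix multiply at the given precision."""
    a = torch.randn(n, n, device="cuda", dtype=dtype)
    b = torch.randn(n, n, device="cuda", dtype=dtype)
    _ = a @ b                      # warm-up so kernel selection isn't timed
    torch.cuda.synchronize()
    start = time.perf_counter()
    for _ in range(iters):
        _ = a @ b
    torch.cuda.synchronize()       # wait for the GPU before stopping the clock
    return (time.perf_counter() - start) / iters

fp32 = time_matmul(torch.float32)
fp16 = time_matmul(torch.float16)  # eligible for tensor cores on RTX parts
print(f"fp32: {fp32*1e3:.1f} ms  fp16: {fp16*1e3:.1f} ms  ratio: {fp32/fp16:.1f}x")
```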

 

 

6 hours ago, porina said:

At least on the Nvidia side, I've seen big jumps in performance between generations since Maxwell. For the same gaming performance, compute is usually about a tier higher each generation. I'm in a prime-number-finding challenge elsewhere at the moment with two GPUs on it, a 2070 and a 1080 Ti, both at about a 70% power limit, and they're doing work at the same rate. Generational changes could make a 3050 much faster than a 1070, even if the two are comparable in gaming performance. I think support for new (smaller) data types could make newer-generation GPUs a lot faster for machine-learning use cases, but it isn't an area I focus on myself.

 

I think the 3050 will be hobbled by memory bandwidth, because if you've noticed, Nvidia's parts scale rather linearly with memory bandwidth. So the 1050 and the 3050 are designed for a 1080p60 performance spec at most, and that's also why they keep being 4GB parts.
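
To put rough numbers on that bandwidth argument, a back-of-envelope sketch; the bytes-touched-per-pixel and overdraw figures are illustrative assumptions for a modern renderer, not measurements, and the doubled-bandwidth card is hypothetical:

```python
# Back-of-envelope: the frame-rate ceiling that memory bandwidth alone imposes.
# bytes_per_pixel and overdraw are assumed figures (G-buffer passes, shadow
# maps, post-processing), not measurements of any real game.
def bandwidth_fps_ceiling(bandwidth_gbs, width=1920, height=1080,
                          bytes_per_pixel=256, overdraw=3.0):
    bytes_per_frame = width * height * bytes_per_pixel * overdraw
    return bandwidth_gbs * 1e9 / bytes_per_frame

for name, bw in [("GTX 1050, 112 GB/s", 112),
                 ("hypothetical 3050, 224 GB/s", 224)]:
    print(f"{name}: ~{bandwidth_fps_ceiling(bw):.0f} fps ceiling at 1080p")
# Doubling the bandwidth doubles the ceiling, i.e. the linear scaling
# described above; 112 GB/s pencils out to roughly a 70 fps ceiling here.
```

Under these assumed figures, a 1050-class memory bus lands squarely in 1080p60 territory, consistent with bandwidth rather than shader count setting the tier.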

 

 

 

6 hours ago, porina said:

PC gaming has always had a wide range of scaling. There will be limits on the lowest supported hardware, but for at least the life of the next generation (three years from now: one more year for the current gen, two for the next) no one will be forced to use "8k textures" or have nothing. As much as I believe RTX is the future, the practical implementations seen to date still offer great visual quality without it. I'm still using a 1080 Ti in one system, and it's still fine for high-end 1440p gaming.

I hate to say it, but the "PC" version of a game tends to compromise, and that has been the case since 1987. When a new game came out, the main thing you traded was load time, because graphics adapters were rather consistent until 1993, when we started seeing "accelerator" cards. With the improvements in CPUs and GPUs over time, you will not find a game that scales to all systems currently in use. You sure as hell are not going to run GTA V on a 386 from 1988, let alone a Pentium II from 1998 or a Core 2 Duo from 2008. You likely can't run it on anything built before 2014 without making it look like a PS2 game.

 

We saw this scaling problem first-hand with Cyberpunk 2077 and Final Fantasy XV. You can have the highest-end hardware available the day a game ships, but their 4K texture packs make even high-end configurations fall over at the intended settings, and if you scale them back to the performance of a PS4, it ends up looking like a PS2 game.

 

6 hours ago, porina said:

We have one difference today that wasn't the case when the 1080 was current: upscaling. Putting aside arguments on the merits of current implementations, I would hope better upscaling options come included with future games, which will help take some load off "needing" high-end GPUs for 4K gaming. It is a trick consoles have used going back a long way.

 

Yeah, and that might "save" the next generation of games from needing a bleeding-edge GPU to play a PS5 port, but you know it's going to be misused, not to lower requirements but to cut corners in development time, to the point that a native 4K presentation ends up worse than the 4K upscale because the developer just doesn't make any 4K assets to save time.

 


11 hours ago, Kisai said:

Current 12th-gen G7 80EU Xe iGPU parts perform about the same as a 1050 Ti. So look at where the ball is going: by the time a 4050 comes out, Intel's 13th- or 14th-gen parts will be out, and likewise their dedicated GPU cards.

That's true, but at the moment Intel doesn't have any desktop APUs that compare with what AMD has, and AMD's are pretty old at this point anyway. It'd be great to see Intel boost those Xe core counts and pack them into an i3, and I'm hoping Intel will actually end up releasing their GPUs soon.


GPUs could fall from the skies and that wouldn't be enough to quell the army of scalpers and their bots that sweep them up and onto eBay. We've gone from a supply shortage to a supply disruption. As long as store shelves are empty, people will perceive that there is no option but to pay scalper prices on eBay. Online sellers only play into that by inflating their own prices, even under the guise of questionable bundles.


On 12/7/2021 at 11:12 AM, Zodiark1593 said:

Would be something if the 4000 series ended up being clock-bumped rebrands at MSRPs reflecting current market rates.  Not as though Nvidia hadn’t done this before. The 700 series was mostly rebrands. 

I mean, at this point I would take it. In fact, I'd still rather have a 700-series-style generation; for the last couple of generations we've only gotten new cards every two years, when it used to be every year. The 780 and 780 Ti did use a bigger chip than the 600 series, but the rest were mostly previous-gen chips moved down one tier.

 

 

