
Nvidia is ‘no longer a graphics company’

Fasterthannothing

Summary

As AI gains popularity, so does the need for compute performance, and Nvidia wants to transition to that marketplace.

 

Quotes

Quote

It’s no secret that Nvidia has quickly morphed into an AI company. Although it creates some of the best graphics cards for PC gamers, the company’s supercomputing efforts have catapulted it into being a trillion-dollar company, and that transformation was spurred on by the monumental rise of ChatGPT. That shift, from a graphics company to an AI company, was an intentional choice by Nvidia’s CEO Jensen Huang.

In a moment of saying the quiet part out loud, Greg Estes, the vice president of corporate marketing at Nvidia, said: “[Jensen] sent out an email on Friday evening saying everything is going to deep learning, and that we were no longer a graphics company. By Monday morning, we were an AI company. Literally, it was that fast.”

 

My thoughts

I saw this coming from a mile away; ever since the 10 series, they have stopped being competitive price-wise for normal users. To be honest, why should they? The money is still there without trying. Businesses will pay far more than even the consumers buying a 4090, and AI is eating GPUs faster than the cryptocurrency hype train did, and this time it's businesses with deep pockets forking over the cash.

 

Sources

 https://www.digitaltrends.com/computing/nvidia-said-no-longer-graphics-company/


Apple is no longer a "computer company", but they still make computers, even though the iPod, iPad, and then the iPhone blew up Apple's revenue.

 

It's hard to pass up a side business even if it's only a few billion a year. Think of it as training for new engineers before they get to work with the AI teams, if they're good enough.


4 hours ago, Fasterthannothing said:

My thoughts

I saw this coming from a mile away; ever since the 10 series, they have stopped being competitive price-wise for normal users. To be honest, why should they? The money is still there without trying. Businesses will pay far more than even the consumers buying a 4090, and AI is eating GPUs faster than the cryptocurrency hype train did, and this time it's businesses with deep pockets forking over the cash.

 Emphasis mine- 

Nvidia has the head start on AI HW, but it won't remain that way for long. It's going to come down to who can secure those fab contracts. MS, Amazon, and AMD could replace Nvidia once market saturation is met.

So what happens in that aftermath? Nvidia comes groveling back to the consumer market they once abandoned?

Smart money is they split the company so one doesn't cannibalize the other, but we shall see.


Jensen has been attempting to make Nvidia into something closer to IBM (the older IBM more than the current one) for the better part of two decades. He's finally gotten there. Maybe.


It's not going to last.

 

Right now Nvidia has a near monopoly on ML training accelerators, because CUDA is awesome and AI researchers love CUDA. Eventually, the market will move from training to inference, and the bulk of silicon sold will be specialized inference accelerators tailored to the workload they have to run at scale.

 

Nvidia is good at making general accelerator architectures that can do both GPU and tensor acceleration, with good drivers. Nvidia won't make a specialized accelerator that only runs Copilot on Azure, a specialized accelerator for Amazon's LLM, one for Google's LLM, a Facebook accelerator, a Twitter Grok chatbot accelerator, etc.

And Nvidia will return to selling GPUs. It's just that Nvidia was really lucky: it found ETH miners to sell GPUs to at 5X the price, and then, when that demand died, it found AI startups that pay 10X the price for an H100. It's a bit much to bet on a third customer willing to pay such multiples.


Eh, while I hope Nvidia doesn't stop making cards, it wouldn't surprise me if they slowed down on them. I'd love to see AMD pick up the crown, but they seem unwilling to invest in their driver teams.

 

Also, if Nvidia's cards were priced too high for the average gamer, they wouldn't be dominating the PC space like they are. Lots of people who game have the money to invest in the hobby they love. PC gaming, as a hobby, is actually quite cheap.

CPU: Ryzen 9 5900 Cooler: EVGA CLC280 Motherboard: Gigabyte B550i Pro AX RAM: Kingston Hyper X 32GB 3200mhz

Storage: WD 750 SE 500GB, WD 730 SE 1TB GPU: EVGA RTX 3070 Ti PSU: Corsair SF750 Case: Streacom DA2

Monitor: LG 27GL83B Mouse: Razer Basilisk V2 Keyboard: G.Skill KM780 Cherry MX Red Speakers: Mackie CR5BT

 

MiniPC - Sold for $100 Profit

Spoiler

CPU: Intel i3 4160 Cooler: Integrated Motherboard: Integrated

RAM: G.Skill RipJaws 16GB DDR3 Storage: Transcend MSA370 128GB GPU: Intel 4400 Graphics

PSU: Integrated Case: Shuttle XPC Slim

Monitor: LG 29WK500 Mouse: G.Skill MX780 Keyboard: G.Skill KM780 Cherry MX Red

 

Budget Rig 1 - Sold For $750 Profit

Spoiler

CPU: Intel i5 7600k Cooler: CryOrig H7 Motherboard: MSI Z270 M5

RAM: Crucial LPX 16GB DDR4 Storage: Intel S3510 800GB GPU: Nvidia GTX 980

PSU: Corsair CX650M Case: EVGA DG73

Monitor: LG 29WK500 Mouse: G.Skill MX780 Keyboard: G.Skill KM780 Cherry MX Red

 

OG Gaming Rig - Gone

Spoiler

 

CPU: Intel i5 4690k Cooler: Corsair H100i V2 Motherboard: MSI Z97i AC ITX

RAM: Crucial Ballistix 16GB DDR3 Storage: Kingston Fury 240GB GPU: Asus Strix GTX 970

PSU: Thermaltake TR2 Case: Phanteks Enthoo Evolv ITX

Monitor: Dell P2214H x2 Mouse: Logitech MX Master Keyboard: G.Skill KM780 Cherry MX Red

 

 


7 hours ago, Fasterthannothing said:

Summary

As AI gains popularity, so does the need for compute performance, and Nvidia wants to transition to that marketplace.

 

Quotes

 

My thoughts

I saw this coming from a mile away; ever since the 10 series, they have stopped being competitive price-wise for normal users. To be honest, why should they? The money is still there without trying. Businesses will pay far more than even the consumers buying a 4090, and AI is eating GPUs faster than the cryptocurrency hype train did, and this time it's businesses with deep pockets forking over the cash.

 

Sources

 https://www.digitaltrends.com/computing/nvidia-said-no-longer-graphics-company/

I think it will be short-lived. As we saw with the Bitcoin and Ethereum mining booms and busts, "GPU" cards burn far too much energy to be worth using when a specific ASIC could do the job better, faster, and cheaper for the same power envelope.

 

What I predict is that dedicated AI-training hardware will be produced at some point, much like Antminers and similar devices, and that it will be used to train a specific kind of model once we stop seeing improvements in certain uses.

 

We've hit the ceiling on a lot of "general purpose" AI stuff, if only because there aren't many corners left to cut, and we need to swing back to quality instead of quantity. That would just be the same algorithms again, but this time using float32 and float64 instead of float8 and float16 in otherwise the same thing.
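
To put a rough number on that precision point, here's a quick NumPy sketch (my own illustration, not from the article; NumPy has no float8 type, so it compares float16, float32, and float64):

```python
# Summing 10,000 copies of 0.1 in different precisions. The exact answer is
# 1000.0; lower-precision formats drift further from it, which is the
# quality-vs-quantity trade-off described above.
import numpy as np

values = np.full(10_000, 0.1)

for dtype in (np.float16, np.float32, np.float64):
    total = values.astype(dtype).sum(dtype=dtype)
    print(f"{np.dtype(dtype).name:>8}: {total}")
```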

 


Without looking up the numbers again, when I looked recently (within the last two weeks) Nvidia's gaming revenue was roughly half of AMD's total company revenue. Even if it is a smaller proportion of the company than ever, it is not small money!

 

Will AMD "save" us? Don't get your hopes up. From yesterday's AMD event:

AMD showed off the MI300X and MI300A, generally claiming advantages over Nvidia's H100/GH (but what about the H200?).

 

It does feel like we're in a boom phase for AI. High-end hardware for it is hard to come by, so any supply can help. As for going to purpose-specific silicon, it may still be a bit early for that. Software for AI is still developing quickly; what's great today isn't what was great a year or two ago. If you make hardware too optimised for a particular path, you may lose out to near-future shifts. Some degree of generalisation would be safer until the area matures some more.

Gaming system: R7 7800X3D, Asus ROG Strix B650E-F Gaming Wifi, Thermalright Phantom Spirit 120 SE ARGB, Corsair Vengeance 2x 32GB 6000C30, RTX 4070, MSI MPG A850G, Fractal Design North, Samsung 990 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Productivity system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, 64GB ram (mixed), RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, random 1080p + 720p displays.
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible


6 hours ago, StDragon said:

So what happens in that aftermath? Nvidia comes groveling back to the consumer market they once abandoned?

Sorry, I must have missed something, but when did Nvidia abandon the consumer market?

Them being an "AI company" does not mean they are abandoning the consumer market, just like AMD saying AI is their number 1 priority doesn't mean they are abandoning the consumer market.

 

3 hours ago, Kisai said:

What I predict is that dedicated AI-training hardware will be produced at some point, much like Antminers and similar devices, and that it will be used to train a specific kind of model once we stop seeing improvements in certain uses.

No need for predictions. Several companies including Microsoft, Amazon and Google have already announced and launched "AI trainer hardware". 

Google has been using their Tensor Processing Unit (TPU) (not the same as "Google Tensor" that's in phones) in their data centers since 2015. In 2018 they made them available for third-party use.

AWS introduced their "Trainium" chips in late 2020, and last month introduced their second generation.

Microsoft announced their Maia 100 AI accelerator about 3 weeks ago.


1 hour ago, LAwLz said:

Sorry, I must have missed something, but when did Nvidia abandon the consumer market?

Them being an "AI company" does not mean they are abandoning the consumer market, just like AMD saying AI is their number 1 priority doesn't mean they are abandoning the consumer market.

Not yet, anyway. But the GPU market is very competitive, and it makes me wonder how much further Nvidia is willing to develop the driver stack to maintain the hardware.

 

Or to put it another way observationally: AI is a bolt-on to their GPU architecture. But at what point will Nvidia view GPU technology as a bolt-on to AI hardware? And if the latter, why do that when AI can be developed in HW from the ground up and dedicate all resources to it?

 

If Nvidia had to start all over again knowing what they know now, I doubt they would even bother with GPU technology. And why should they when data center AI hardware is far more profitable? For now, they're developing and maintaining GPU technology with existing market momentum. That lead could wane once Intel and AMD catch up.


6 hours ago, 05032-Mendicant-Bias said:

Right now Nvidia has a near monopoly on ML training accelerators, because CUDA is awesome and AI researchers love CUDA. Eventually, the market will move from training to inference, and the bulk of silicon sold will be specialized inference accelerators tailored to the workload they have to run at scale.

But training won't stop. Yeah, eventually the inference market will be way bigger, and Nvidia GPUs are still being used for that since building your model for specific hardware is kinda annoying, but there will still be new models, fine-tunes, and whatnot that will require training, along with updating existing models with new data.
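
For context on why that's annoying, here's a minimal sketch (assuming PyTorch with ONNX export available; the toy model and file name are made up) of the extra export step that targeting a specialized inference runtime typically adds:

```python
# A toy, untrained model standing in for a trained one: the point is the
# export step. ONNX is the kind of interchange format that dedicated
# inference accelerators and their compilers typically consume, and each
# target runtime adds its own conversion/compile step like this.
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(512, 128), nn.ReLU(), nn.Linear(128, 10))
model.eval()  # inference mode

example_input = torch.randn(1, 512)
torch.onnx.export(
    model,
    example_input,
    "model.onnx",               # hypothetical output path
    input_names=["features"],   # illustrative names
    output_names=["logits"],
)
```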

 

5 hours ago, Kisai said:

I think it will be short-lived. As we saw with the Bitcoin and Ethereum mining booms and busts, "GPU" cards burn far too much energy to be worth using when a specific ASIC could do the job better, faster, and cheaper for the same power envelope.

 

What I predict is that dedicated AI-training hardware will be produced at some point, much like Antminers and similar devices, and that it will be used to train a specific kind of model once we stop seeing improvements in certain uses.

 

We've hit the ceiling on a lot of "general purpose" AI stuff, if only because there aren't many corners left to cut, and we need to swing back to quality instead of quantity. That would just be the same algorithms again, but this time using float32 and float64 instead of float8 and float16 in otherwise the same thing.

Those already exist, but they are not as easy or as flexible as a regular CUDA-enabled GPU. You're also stuck with a specific cloud vendor for each device, and you can't get one at your house to practice on.

17 minutes ago, StDragon said:

Or to put it another way observationally: AI is a bolt-on to their GPU architecture. But at what point will Nvidia view GPU technology as a bolt-on to AI hardware? And if the latter, why do that when AI can be developed in HW from the ground up and dedicate all resources to it?

Nvidia has been working on CUDA for GPGPU since 2006, and in 2012 it was used for the first time to accelerate a CNN model. From then on, Nvidia just kept improving on that. The boom may be happening now, but Nvidia has had a grip on that market for a long time.

You're also not clear when you say "when AI can be developed in HW from the ground up and dedicate all resources to it?". Are you talking about training or inference?

 

21 minutes ago, StDragon said:

If Nvidia had to start all over again knowing what they know now, I doubt they would even bother with GPU technology. And why should they when data center AI hardware is far more profitable? For now, they're developing and maintaining GPU technology with existing market momentum. That lead could wane once Intel and AMD catch up.

As I said above, they got to dominate the market by making it so regular consumers could have access to hardware to do actual work on, and that carries over easily when you move to a data center: it's the exact same software stack, but with more horsepower.

Their DC offerings are still GPUs, and both Intel's and AMD's offerings on the AI part are also just GPUs.
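
As a rough illustration of the "same software stack" point (a hedged sketch, assuming only that PyTorch with CUDA support is installed), the code below runs unchanged on a consumer GeForce card at home or a data center GPU; only the device it lands on and the speed differ:

```python
# Device-agnostic PyTorch: the identical code path runs on whatever CUDA
# device is present (a GeForce at home, an H100 in a data center) and falls
# back to CPU otherwise.
import torch
import torch.nn as nn

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

model = nn.Sequential(nn.Linear(512, 512), nn.ReLU(), nn.Linear(512, 10)).to(device)
x = torch.randn(64, 512, device=device)

with torch.no_grad():
    logits = model(x)

print(f"ran on {device}, output shape {tuple(logits.shape)}")
```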

FX6300 @ 4.2GHz | Gigabyte GA-78LMT-USB3 R2 | Hyper 212x | 3x 8GB + 1x 4GB @ 1600MHz | Gigabyte 2060 Super | Corsair CX650M | LG 43UK6520PSA
ASUS X550LN | i5 4210u | 12GB
Lenovo N23 Yoga


8 minutes ago, igormp said:

But training won't stop. Yeah, eventually the inference market will be way bigger, and Nvidia GPUs are still being used for that since building your model for specific hardware is kinda annoying, but there will still be new models, fine-tunes, and whatnot that will require training, along with updating existing models with new data.

 

Those already exist, but they are not as easy or as flexible as a regular CUDA-enabled GPU. You're also stuck with a specific cloud vendor for each device, and you can't get one at your house to practice on.

Nvidia has been working on CUDA for GPGPU since 2006, and in 2012 it was used for the first time to accelerate a CNN model. From then on, Nvidia just kept improving on that. The boom may be happening now, but Nvidia has had a grip on that market for a long time.

You're also not clear when you say "when AI can be developed in HW from the ground up and dedicate all resources to it?". Are you talking about training or inference?

 

As I said above, they got to dominate the market by making it so regular consumers could have access to hardware to do actual work on, and that carries over easily when you move to a data center: it's the exact same software stack, but with more horsepower.

Their DC offerings are still GPUs, and both Intel's and AMD's offerings on the AI part are also just GPUs.

And even if we assume that Nvidia had to abandon gaming in order to develop AI (which they don't because as you said, it's CUDA and they like that people can have these cards for development at home instead of being tied to a specific cloud vendor), gaming is still a massive income source with billions upon billions of dollars in revenue.

Companies typically don't shut down large profit-making divisions. 

 

I think the comparison with Apple earlier in the thread is pretty good. iOS devices are by far the biggest money maker for Apple, yet they continue to improve (quite a lot as well) on their Mac offerings.

 

 

I will remain optimistic and say that hopefully, this (AI) is a rising tide that will lift all boats.

Just because AI improves does not mean gaming will suffer. It is entirely possible that resources spent on developing GPUs for AI will also result in better GPUs for gaming.


4 minutes ago, LAwLz said:

gaming is still a massive income source with billions upon billions of dollars in revenue.

Companies typically don't shut down large profit-making divisions. 

It's their second largest market:

[Chart: Nvidia revenue by market segment]

 

Yeah, there's no reason they'd just kill it, but they are making it way more niche as time goes by; they've already noticed that they can drive margins up by ignoring the entry-level market. It also helps that any chip that doesn't end up in DC hands can easily be sold as a gaming chip: 100% fab occupation.

 

6 minutes ago, LAwLz said:

Just because AI improves does not mean gaming will suffer. It is entirely possible that resources spent on developing GPUs for AI will also result in better GPUs for gaming.

People may complain, but stuff like DLSS is really cool. Maybe a Switch 2 with an Orin-based Tegra and DLSS can make a portable device play games with really nice graphics. People may not like that idea, but such things only improve over time, so if you can get prettier images with less hardware, I see that as a win.



2 hours ago, StDragon said:

Not yet, anyway. But the GPU market is very competitive, and it makes me wonder how much further Nvidia is willing to develop the driver stack to maintain the hardware.

 

Or to put it another way observationally: AI is a bolt-on to their GPU architecture. But at what point will Nvidia view GPU technology as a bolt-on to AI hardware? And if the latter, why do that when AI can be developed in HW from the ground up and dedicate all resources to it?

 

If Nvidia had to start all over again knowing what they know now, I doubt they would even bother with GPU technology. And why should they when data center AI hardware is far more profitable? For now, they're developing and maintaining GPU technology with existing market momentum. That lead could wane once Intel and AMD catch up.

None of what you said here makes business sense. 

Nvidia has zero incentive at the moment or in the foreseeable future to back out of consumer hardware. 
The fact that AI is now a major part of their portfolio does not discredit the rest of their portfolio, which makes them money and cross-licenses almost all the tech between segments.

 

  

2 hours ago, igormp said:

Yeah, there's no reason they'd just kill it, but they are making it way more niche as time goes by; they've already noticed that they can drive margins up by ignoring the entry-level market. It also helps that any chip that doesn't end up in DC hands can easily be sold as a gaming chip: 100% fab occupation.

Yeah, the math changed on how profitable the low end is: the cost per mm^2 is higher than ever, as are all the supporting structures around the die needed to make a viable chip, and on top of that they're min-maxing revenue per wafer by making DC chips. It's not even about profit margins but revenue flow, and it's not an inflation argument I'm making. The costs to improve just increased faster than economies of scale could balance them out. The whole value curve shifted up when wafers went from $5k to over $20k and RAM failed to follow Moore's law while also using more power than ever. It's just near impossible to design a card that costs under 50 USD to build and have it be viable in the market outside of playing Quake III.


Not saying Nvidia isn't taking advantage and getting higher profit margins than ever, but even if they were taking cuts similar to the Nvidia of old, the entire product stack and pricing would still be dead.

Arc doesn't make money. They're just happy to be alive.


13 hours ago, LAwLz said:

And even if we assume that Nvidia had to abandon gaming in order to develop AI (which they don't because as you said, it's CUDA and they like that people can have these cards for development at home instead of being tied to a specific cloud vendor), gaming is still a massive income source with billions upon billions of dollars in revenue.

Companies typically don't shut down large profit-making divisions. 

 

I think the comparison with Apple earlier in the thread is pretty good. iOS devices are by far the biggest money maker for Apple, yet they continue to improve (quite a lot as well) on their Mac offerings.

 

 

I will remain optimistic and say that hopefully, this (AI) is a rising tide that will lift all boats.

Just because AI improves does not mean gaming will suffer. It is entirely possible that resources spent on developing GPUs for AI will also result in better GPUs for gaming.

I can think of two reasons why you wouldn't support a big market. One would be that by making GPUs for gamers they somehow cannibalize their AI market, with cheaper gamer-oriented cards being able to do AI-related stuff for significantly less. The other would be that they don't have the capacity to meet the demand of both markets, so they have to prioritize one, and if that one has so much demand, it could no longer be worth it to serve the other market.


4 hours ago, Brooksie359 said:

I can think of two reasons why you wouldn't support a big market. One would be that by making GPUs for gamers they somehow cannibalize their AI market, with cheaper gamer-oriented cards being able to do AI-related stuff for significantly less.

Consumer-facing AI solutions are needed regardless. We are transitioning, with CPUs coming with NPUs. Using dGPUs remains a higher tier for those that need a bit more. If you're really serious, then pro solutions are there above that.

 

Take the RTX 6000 Ada, for example. It has 48GB of ECC VRAM and more cores than a 4090. Then again, it costs about 4x as much as a 4090. Lower down, there's the RTX 4500 Ada, which has roughly the same core count as a 4070 Ti, with 24GB of ECC VRAM for only 3x the cost. It will be an interesting balance for them, with AMD gaining AI software support traction and offering more VRAM on consumer-tier GPUs. With the ongoing increase in interest in AI, they know others will want to capture a bigger piece of the pie too, and they're going to have to defend their position in that context regardless of their own gaming GPUs.

 

As unlikely as it is, if Nvidia were to one day say they're not making new gaming GPUs, chances are they would try to spin off and sell that business. It is no small money. It might be a fun exercise to think about who would want it and could afford it. It probably couldn't be a clean separation, with some required IP retained by Nvidia for their AI side and having to be licensed.



11 hours ago, porina said:

Consumer-facing AI solutions are needed regardless. We are transitioning, with CPUs coming with NPUs. Using dGPUs remains a higher tier for those that need a bit more. If you're really serious, then pro solutions are there above that.

Do, or do not. There is no try.

CPU-integrated NPUs aren't going to be running LLMs or anything meaningful. At best they can provide real-time analysis of streamed data such as video and audio to clean up web meetings, or, working with the iGPU, enhanced video upscaling. But to run something like Stable Diffusion, you'll need your own dGPU, or ideally run it from the cloud where it can leverage a larger dataset. For mobile devices, NPUs are already in SoCs, where they can tease out the stream of data from accelerometers to work out what physical activity is going on and categorize it. NPUs could in theory be used to enhance WiFi connectivity via integration with the DSP to analyze RF signals.

For anything else more complex, you'll be accessing it from the cloud due to the compute required.


6 minutes ago, StDragon said:

CPU-integrated NPUs aren't going to be running LLMs or anything meaningful.

This is something I'm struggling to scope in size and capability. We have a whole bunch of different execution resources available to us already. How does each one scale in each segment of potential AI usage? CPU without specific AI acceleration, with VNNI or similar, with NPU, with iGPU? Then we get different size dGPUs, and for those really throwing money at it, dedicated silicon.

 

I may be out of date, but previously CPUs could still be used for datasets that can't be chopped down to fit on GPUs. Does that work with AI stuff too?

 

Users should scale hardware according to their use case. In the context of this thread, my point remains that even if Nvidia switches aggressively to AI first and deprioritizes gaming, GPUs may not go away.



On 12/7/2023 at 5:46 AM, StDragon said:

Nvidia has the head start on AI HW, but it won't remain that way for long. It's going to come down to who can secure those fab contracts. MS, Amazon, and AMD could replace Nvidia once market saturation is met.

I hope AMD will launch some GPU that has some unique gimmick that accelerates AI faster than Nvidia's.

 

How would they name it? Ai-zen? Rise-again? AI-RXCELLERATED GPU?

Note: Users receive notifications after Mentions & Quotes. 

Feel free to ask any questions regarding my comments/build lists. I know a lot about PCs but not everything.

PC:

Ryzen 5 5600 |16GB DDR4 3200Mhz | B450 | GTX 1080 ti

PCs I used before:

Pentium G4500 | 4GB/8GB DDR4 2133Mhz | H110 | GTX 1050

Ryzen 3 1200 3,5Ghz / OC:4Ghz | 8GB DDR4 2133Mhz / 16GB 3200Mhz | B450 | GTX 1050

Ryzen 3 1200 3,5Ghz | 16GB 3200Mhz | B450 | GTX 1080 ti


35 minutes ago, porina said:

...my point remains that even if Nvidia switches aggressively to AI first and deprioritizes gaming, GPUs may not go away.

That depends on the shareholders (and they view the world through a short-term, quarterly lens). There's too much momentum for Nvidia to just kill their GPU brand, so they will continue to innovate and provide driver support so long as that doesn't cut into the larger profit margins that AI hardware has to offer.
 

17 hours ago, Brooksie359 said:

The other would be that they don't have the capacity to meet the demand of both markets, so they have to prioritize one, and if that one has so much demand, it could no longer be worth it to serve the other market.

This.


39 minutes ago, porina said:

This is something I'm struggling to scope in size and capability. We have a whole bunch of different execution resources available to us already. How does each one scale in each segment of potential AI usage? CPU without specific AI acceleration, with VNNI or similar, with NPU, with iGPU? Then we get different size dGPUs, and for those really throwing money at it, dedicated silicon.

 

I may be out of date, but previously CPUs could still be used for datasets that can't be chopped down to fit on GPUs. Does that work with AI stuff too?

 

Users should scale hardware according to their use case. In the context of this thread, my point remains that even if Nvidia switches aggressively to AI first and deprioritizes gaming, GPUs may not go away.

If they stop making GPUs, AMD can instantly start copying CUDA and other Nvidia stuff.



1 minute ago, podkall said:

If they stop making GPUs, AMD can instantly start copying CUDA and other Nvidia stuff.

Not CUDA, as that will still be used in their datacenter HW. But I could see a future where Nvidia cross-licenses GPU tech with AMD, much like how Intel and AMD cross-license CPU tech.

