NVIDIA Plans to Launch Export-compliant GeForce RTX 4090 "D" Cards for China

9 minutes ago, leadeater said:

Developing new hardware and software obviously costs money and there is a return on investment factor, but even so, if your GPU die is 50% the size of what it used to be to achieve the same thing, with DLSS on, then it shouldn't cost the same, let alone more. In general people don't like to pay more for less, and that has for now been the sentiment around DLSS: Nvidia profiting off charging more for less.

Users buy products, not die area. Silicon cost is a factor. As part of my previous exercise in estimating costs per die, I still have some numbers for possible wafer costs. These will be wrong, but could be indicative. TSMC 10/7/5 are roughly $6k/$9k/$17k per wafer respectively. I saw a very low confidence value of $5k for Samsung 8nm, but given it is an enhanced 10nm-class node it isn't far off the TSMC cost. I'm assuming Nvidia's custom 4N costs about the same as N5, which it is related to. So the move from Ampere to Ada could be 2x or even 3x the cost per unit area.
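As a rough sketch of what those wafer prices could mean per die, here's the kind of arithmetic I used. The die areas, the flat yield fraction, and the wafer prices are all assumptions or rough guesses, not measured numbers:

```python
import math

def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300) -> int:
    """Rough gross dies per wafer using the standard edge-loss approximation."""
    d = wafer_diameter_mm
    return int(math.pi * (d / 2) ** 2 / die_area_mm2
               - math.pi * d / math.sqrt(2 * die_area_mm2))

def cost_per_die(wafer_cost_usd: float, die_area_mm2: float, yield_fraction: float = 0.8) -> float:
    """Wafer cost spread over good dies, assuming a flat yield fraction."""
    return wafer_cost_usd / (dies_per_wafer(die_area_mm2) * yield_fraction)

# Illustrative comparison: a ~628 mm^2 Ampere-class die on ~$5k Samsung 8nm wafers
# vs a ~609 mm^2 Ada-class die on ~$17k N5-class wafers (all figures rough guesses).
print(f"GA102-ish on 8nm: ${cost_per_die(5_000, 628):.0f} per die")
print(f"AD102-ish on 4N:  ${cost_per_die(17_000, 609):.0f} per die")
```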

 

9 minutes ago, leadeater said:

Lots of analysis can be done, but the end consumer is ultimately annoyed because the cost has increased disproportionately to inflation and the long-term market trend, which has a real impact. It's really difficult to explain to anyone why they could afford an xx70 before but no longer can; there isn't a reason in the world that will dissuade that annoyance.

I'll remind again that inflation is a measure of the increase, not the cause. I have actually owned every 70-class GPU since Maxwell. How does the pricing look?

£219 Jul 2016 EVGA 970 SC GAMING ACX 2.0 (22 months after release, end of life sales)

£399 Jul 2016 1070 FE (1 month after release)

£460 Oct 2018 Gigabyte 2070 Windforce (at launch)

£720 Feb 2021 MSI 3070 GAMING X TRIO (5 months after launch, crypto era shortages and pricing)

£589 Apr 2023 4070 FE (at launch)

 

I bought the 970 at the same time as its successor, the 1070, because it had been cut down quite a bit in price by that point. From memory the typical price earlier in its life was more around the £300 ballpark. The other GPUs I got basically as soon as I could after launch, and there is an increasing trend to the pricing. Official UK inflation from 2016 to April 2023 is about 30%. That would put the 1070 FE at around £520 in money of the time of the 4070 FE's release. Similarly for the 2070, that would be £566 with 23% inflation to April last year. Inflation adjusted, prices are increasing but is it that bad? The specific 70 models I bought were +9% from Pascal to Turing, and +4% from Turing to Ada. I'm leaving out Ampere since that was badly affected in the crypto era. If you could afford a 1070 in 2016, you can afford a 4070 now.
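For anyone who wants to check the arithmetic behind those figures, here's a minimal sketch. The inflation factors are the rounded ones quoted above (30% from 2016, 23% from 2018), not values pulled from an official index:

```python
# Adjust the launch-era prices quoted above to April 2023 money using the
# approximate inflation factors from the post.
purchases = {
    "1070 FE (Jul 2016)": (399, 1.30),
    "2070 Windforce (Oct 2018)": (460, 1.23),
    "4070 FE (Apr 2023)": (589, 1.00),
}

adjusted = {name: price * factor for name, (price, factor) in purchases.items()}
for name, value in adjusted.items():
    print(f"{name}: ~£{value:.0f} in Apr 2023 money")

# Generation-on-generation change in inflation-adjusted terms.
print(f"Pascal -> Turing: {adjusted['2070 Windforce (Oct 2018)'] / adjusted['1070 FE (Jul 2016)'] - 1:+.0%}")
print(f"Turing -> Ada:    {adjusted['4070 FE (Apr 2023)'] / adjusted['2070 Windforce (Oct 2018)'] - 1:+.0%}")
```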

Gaming system: R7 7800X3D, Asus ROG Strix B650E-F Gaming Wifi, Thermalright Phantom Spirit 120 SE ARGB, Corsair Vengeance 2x 32GB 6000C30, RTX 4070, MSI MPG A850G, Fractal Design North, Samsung 990 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Productivity system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, 64GB ram (mixed), RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, random 1080p + 720p displays.
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible

16 minutes ago, porina said:

Users buy products, not die area. Silicon cost is a factor.

Yes, but that point was to normalize everything and make die size a factor. Users may not buy "die area" directly, but effectively they do; it's part of the product cost. A GTX 1080 and a hypothetical "GTX 1080 DLSS edition" at 50% of the die size shouldn't cost the same, at least not in manufacturing cost.

 

The people most annoyed are the ones in the know.

 

16 minutes ago, porina said:

Inflation adjusted, prices are increasing but is it that bad?

Yes, until something like the last 6-12 months. For a good bit of time there MSRP was worthless; I think we are all glad that is over.

 

16 minutes ago, porina said:

I'm leaving out Ampere since that was badly affected in the crypto era.

Ampere wasn't the only generation affected by that, and it wasn't really crypto; it was total demand from multiple sources plus manufacturing and logistics issues. That's why RTX 30 was more affected than the last two times "crypto was to blame".

4 minutes ago, leadeater said:

Yes, but that point was to normalize everything and make die size a factor. Users may not buy "die area" directly, but effectively they do; it's part of the product cost.

Yes, it factors into the end product, but the vast majority of people don't try to separate it out as an isolated factor.

 

4 minutes ago, leadeater said:

The people most annoyed are the ones in the know.

I have to laugh at that. I find the convoluted reasons used to show Nvidia is bad quite amusing.

 

4 minutes ago, leadeater said:

Yes, until something like the last 6-12 months. For a good bit of time there MSRP was worthless; I think we are all glad that is over.

It's only in the crypto era that MSRP became more aspirational than indicative, so we're mostly back to normal now.

1 hour ago, porina said:

TSMC 10/7/5 are roughly $6k/$9k/$17k per wafer respectively. I saw a very low confidence value of $5k for Samsung 8nm, but given it is an enhanced 10nm-class node it isn't far off the TSMC cost. I'm assuming Nvidia's custom 4N costs about the same as N5, which it is related to. So the move from Ampere to Ada could be 2x or even 3x the cost per unit area.

Just as a side note, since it's interesting to look at: transistor density tells quite a story.

 

GTX 600 series (GK104, 28nm): 12,040,816/mm2

GTX 700 series (GK110, 28nm): 12,620,320/mm2

GTX 900 series (GM204, 28nm): 13,065,326/mm2

GTX 10 series (GP102, 16nm): 25,477,707/mm2

RTX 20 series (TU104, 12nm): 24,668,435/mm2

RTX 30 series (GA102, 8nm): 45,035,009/mm2

A100 40GB/80GB (GA100, 7nm): 65,617,433/mm2

RTX 40 series (AD102, 4nm): 125,390,304/mm2

Apple M2 TSMC 5nm: 141,143,260/mm2

Apple M3 TSMC 3nm: 171,232,876/mm2

 

That TU104 slight decrease is interesting. GA102 is 1.8x TU104, GA100 is 2.66x TU104, AD102 is 2.78x GA102, AD102 is 1.9x GA100.

RTX 4090: 52.8% increase over 3090 (1440p)

RTX 3090: 21% increase over 2080 Ti (1440p); 3090 Ti wasn't used due to poor Titan RTX data

RTX 2080 Ti: 22% increase over GTX 1080 Ti (1440p); Titan Xp also not used due to poor data

 

The increase in transistors needed, and the density required to deliver those performance increases, is quite wild. When you need ~2x or more transistors for comparatively smaller performance gains, that to me signals quite a big issue; we are not going to be able to keep making these kinds of density gains for long. It's partly why I'm surprised Nvidia did the RTX 4090 the way they did. I would have made it smaller, stayed on 4nm, and just made the RTX 50 series "larger". I honestly don't think AMD will be jumping past that.
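To put the transistor-versus-performance point in numbers, a quick sketch. The transistor counts are the commonly quoted full-die totals (the actual cards are cut-down versions of these dies), and the performance deltas are the 1440p ones listed above, so this is only a rough "performance per transistor" illustration:

```python
# Full-die transistor counts in billions (commonly quoted totals) vs the 1440p
# performance deltas listed above.
dies = {
    "TU102 (2080 Ti)": 18.6,
    "GA102 (3090)": 28.3,
    "AD102 (4090)": 76.3,
}
perf_gain = {
    "GA102 (3090)": 0.21,    # over 2080 Ti
    "AD102 (4090)": 0.528,   # over 3090
}

names = list(dies)
for prev, curr in zip(names, names[1:]):
    transistor_ratio = dies[curr] / dies[prev]
    perf_ratio = 1 + perf_gain[curr]
    print(f"{prev} -> {curr}: {transistor_ratio:.2f}x transistors, "
          f"{perf_ratio:.2f}x performance, "
          f"{perf_ratio / transistor_ratio:.2f}x perf per transistor")
```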

34 minutes ago, porina said:

I have to laugh at that. I find the convoluted reasons used to show Nvidia is bad quite amusing.

haha true, no shortage of those

 

34 minutes ago, porina said:

It's only in the crypto era that MSRP became more aspirational than indicative, so we're mostly back to normal now.

290Xs didn't go to the same kind of insanity though; what got blamed and what was actually the problem weren't, I think, the same. At least what was common between both those times was the near impossibility of actually buying a card.
