
RTX 4060 debate, potentially not terrible? *Nope, it is. Edit*

Officially, the specs have been released...


 

Speculative hot take: RTX 4060 isn't that bad, but we'll see when benchmarks come out.

3072 cores versus the previous speculation of 3840 is a large difference...

 

 

Gauging by the specs, the RTX 4060 should perform about 10% below the RTX 4060ti, which puts it between the 3060 12GB and the 3060ti.

 

 


GeForce RTX 4060 Launching June 29th, Starting At $299 | GeForce News | NVIDIA

 

In Nvidia's own promotional material, their marketing strategy is somewhat decent here. This is a genuine argument, especially in a place like Europe where electricity is far more expensive than in the US (as demonstrated above).

 

Considering the RTX 3060 12GB released at $330 back in early 2021, inflation between then and now is about 11.5%. Factoring that in, and gauging off the closest proportional silicon bin intergenerationally (the RTX 3050 8GB), that would put the cost at ~$287. Considering it's (hopefully) a $300 MSRP for the next generation, that's not terrible, especially in comparison to the rest of the RTX 4000 series, which I've discussed in detail in another thread.

 

 

Since the card will likely have a PCIe 4.0 x8 connection like the RTX 4060ti, the question comes up of using it as a drop-in upgrade on an older PCIe 3.0 system. I've seen varied reports on this, where the impact can be anywhere from substantial to insignificant, depending on the system and games tested.

 

Considering the variables, $300 in 2023 for potentially 10% less than the 4060ti puts the card in a whole other tier. A 25% reduced cost for only 10% reduced performance is the best CUDA cores/$ in the RTX 4000 series so far, by a lot. Maybe it'll be worth it for 1080p/1440p gamers?

 

VRAM is a point of concern, but only for some titles, and it's something DLSS can alleviate. Yes, a 16GB model would be better, but that might come at the cost of performance due to memory bandwidth limitations on a 128-bit bus, and a disproportionately higher price tag.

 

 

If you look at the metric of $/core, this is the breakdown based on rounded-up MSRP (because $0.99 nonsense is stupid). Lower = better.

4090 24GB $1600  - 0.0977
4080 16GB $1200  - 0.123
4070ti 12GB $800 - 0.104
4070 12GB $600   - 0.102
4060ti 8GB $400  - 0.0920
4060 8GB $300    - 0.0781 (at the rumored 3840 cores)

Edit: at the confirmed 3072 cores, the 4060 8GB at $300 works out to 0.0977, AKA, worse than the RTX 4060ti.

 

This value should go down as you go lower in tiers, but note the sharp differences between the RTX 4090 and RTX 4060 8GB versus the rest.
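If anyone wants to reproduce the math, here's a quick sketch (MSRPs and core counts as listed above, with the 4060's confirmed 3072-core count swapped in):

```python
# Dollars per CUDA core at rounded MSRP; lower = better.
cards = {
    "4090 24GB":   (1600, 16384),
    "4080 16GB":   (1200, 9728),
    "4070ti 12GB": (800, 7680),
    "4070 12GB":   (600, 5888),
    "4060ti 8GB":  (400, 4352),
    "4060 8GB":    (300, 3072),  # confirmed spec, not the rumored 3840
}

for name, (msrp, cores) in cards.items():
    print(f"{name}: ${msrp / cores:.4f}/core")
```

Run it and the 4060 lands at the same 0.0977 as the 4090, above the 4060ti's 0.0920.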

 


Ryzen 7950x3D Direct Die NH-D15

RTX 4090 @133%/+230/+500

Builder/Enthusiast/Overclocker since 2012  //  Professional since 2017


If that's the case then that makes the 4060ti even more pointless than it already is. But I'll reserve my judgement until reviews.


16 minutes ago, Agall said:

Speculative hot take: RTX 4060 isn't that bad, but we'll see when benchmarks come out.

...

 


18 minutes ago, Agall said:

Speculative hot take: RTX 4060 isn't that bad, but we'll see when benchmarks come out.

...

The 4060 looks like an OK buy. It essentially looks like it'll perform like a 7600 XT/7700 variant, when/if AMD releases one.

 

I really hope the 50 series will be better across the board though, since even the 4070 looks like better value than the 4070 Ti.

Message me on discord (bread8669) for more help 

 

Current parts list

CPU: R5 5600 CPU Cooler: Stock

Mobo: Asrock B550M-ITX/ac

RAM: Vengeance LPX 2x8GB 3200mhz Cl16

SSD: P5 Plus 500GB Secondary SSD: Kingston A400 960GB

GPU: MSI RTX 3060 Gaming X

Fans: 1x Noctua NF-P12 Redux, 1x Arctic P12, 1x Corsair LL120

PSU: NZXT SP-650M SFX-L PSU from H1

Monitor: Samsung WQHD 34 inch and 43 inch TV

Mouse: Logitech G203

Keyboard: Rii membrane keyboard

 

 

 

 

 

 

 

 

 


 

 

 

 

 

 

Damn this space can fit a 4090 (just kidding)


If the 4060 were on par with, or better than, the 3060TI, then it would be worth it.

 

As it is, I would not recommend the 4060, especially if you can get a 3060TI for less money.

"Don't fall down the hole!" ~James, 2022

 

"If you have a monitor, look at that monitor with your eyeballs." ~ Jake, 2022


I'll wait and see. The 4060 and RX 7600 are probably not bad cards; they're just very underwhelming as long as there are clearance sales on last gen.

I have a 3060 12GB and had decided to treat myself to a higher tier card with my Christmas bonus, but I'm waiting to see if the 7700 XT can impress. The uplift from the 3060 seems decent, but not enough to warrant an upgrade for me.

mITX is awesome! I regret nothing (apart from when picking parts or having to do maintenance *cough*cough*)


Now I know why some RX 7600s got a slight price cut to $250.

| Intel i7-3770@4.2Ghz | Asus Z77-V | Zotac 980 Ti Amp! Omega | DDR3 1800mhz 4GB x4 | 300GB Intel DC S3500 SSD | 512GB Plextor M5 Pro | 2x 1TB WD Blue HDD |
 | Enermax NAXN82+ 650W 80Plus Bronze | Fiio E07K | Grado SR80i | Cooler Master XB HAF EVO | Logitech G27 | Logitech G600 | CM Storm Quickfire TK | DualShock 4 |


3 minutes ago, xAcid9 said:

Now I know why some RX 7600s got a slight price cut to $250.

The problem in general with the RX 7600 is that it's practically just a 6nm version of the RX 6600 XT, without almost two years of driver maturity.



19 minutes ago, DeerDK said:

I'll wait and see. The 4060 and RX 7600 are probably not bad cards; they're just very underwhelming as long as there are clearance sales on last gen.

I have a 3060 12GB and had decided to treat myself to a higher tier card with my Christmas bonus, but I'm waiting to see if the 7700 XT can impress. The uplift from the 3060 seems decent, but not enough to warrant an upgrade for me.

Personally, I find it hard to recommend a GPU upgrade without a solid +50% performance increase. The easiest way to get that is a GPU tier upgrade, from an xx60 to an xx80 for example. Otherwise, waiting a few generations is the other option. Higher tiers will almost always come with a higher wattage draw, however.

 

 

We still run into the scenario where something like a GTX 1080ti is competitive against new $400 cards, even though the 1080ti came out in 2017. We've made a lot of progress in performance in the last 6 years, but a high wattage GPU from that long ago is still capable in 2023 for that very reason. Things haven't changed much in efficiency though, considering the GTX 1080ti was only a 250W TDP card compared to its new-ish equivalent at 200W. The RTX 4060ti being on par as well at only 160W is quite a nice jump down in wattage, but that's what happens when they produce on denser lithography: 16nm vs 8nm vs 5nm.

 

 

28 minutes ago, Sarra said:

If the 4060 were on par with, or better than, the 3060TI, then it would be worth it.

 

As it is, I would not recommend the 4060, especially if you can get a 3060TI for less money.

 

It'll likely be within margin in standard rasterization, but we'll see a huge push toward the DLSS2/3 performance gap that it'll inevitably open. Nvidia has a valid argument when they discuss how RTX 4000 series cards have higher performance/watt while also boasting higher framerates with DLSS2/3 enabled. Yes, most games don't have that feature, but are those games ones that require more rasterization performance?

 

I honestly dislike Nvidia's argument, but it's a practical one with merit. The games that require higher performance have DLSS technology, so where it matters most is where that technology is available.



58 minutes ago, Hinjima said:

 

Lol the timing, I swear I didn't see this video until you posted it. Surprisingly similar argument.

 

One thing I forgot to mention in the OP is that it's also the cheapest NVENC AV1 encoder right now. The only thing cheaper would be Intel Arc; I'm about to throw my A380 into my main rig to test it in Diablo 4 and Warframe today. I have the 7mm NH-D15 offset bracket showing up today, so I thought it would be a good time to take the A380 for a ride.



2 hours ago, Agall said:

Speculative hot take: RTX 4060 isn't that bad, but we'll see when benchmarks come out.

 

Gauging by the specs, the RTX 4060 should perform about 10% below the RTX 4060ti, which puts it between the 3060 12GB and the 3060ti.

Maybe we're working off different reference points but I simultaneously think you're looking too high (-10% to 4060 Ti) and too low (between 3060 and 3060 Ti).

 

I'm going to reference the 3060. Nvidia's own post claims +20% without framegen. Leaked 3DMark results show +20-30% depending on the test, and +27% from the video above.

 

In the 20-30%-over-3060 area, limiting to previous and current gen, we have the 7600, 3060 Ti, and 6700 XT. It's quite a tight grouping, so position could be higher or lower depending on game mix and settings. Based on historical trends, I think coming in around the 3060 Ti level on average would be a nice place to end up. This will finally be the volume GPU for the current gen. The 7600 will be cheaper for those who strive for ultimate raster perf/$, but many will pay the premium for a 4060, especially given it'll likely have better RT and other feature support.

 

Edit: I looked up some more test data and my first look was a little off. The 3060 Ti would be on the higher end of potential performance from what I've seen so far. I think it will still be slightly ahead of the 7600 in raster.

Gaming system: R7 7800X3D, Asus ROG Strix B650E-F Gaming Wifi, Thermalright Phantom Spirit 120 SE ARGB, Corsair Vengeance 2x 32GB 6000C30, RTX 4070, MSI MPG A850G, Fractal Design North, Samsung 990 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Productivity system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, 64GB ram (mixed), RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, random 1080p + 720p displays.
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible


47 minutes ago, Agall said:

... 

We still run into the scenario where something like a GTX 1080ti is competitive against new $400 cards, where the 1080ti came out in 2017. We've gotten a lot of progress in performance in the last 6 years, but a high wattage GPU from that long ago is still capable in 2023 for that very reason. Things haven't changed much in efficiency though, considering the GTX 1080ti was only a 250W TDP card compared to its new-ish equivalent at 200W.... 

Agreed. I'm very enthralled by the 4070, even if it costs more than double what I was willing to pay for a GPU pre-crisis.

But that efficiency... Damn... 

And it has enough muscle for me to keep it for a long time and be able to turn on all the eye candy.

But... I'll stay strong and see if AMD can challenge it. Maybe it will drop a bit in price while I wait. 



44 minutes ago, DeerDK said:

Agreed. I'm very enthralled by the 4070, even if it costs more than double what I was willing to pay for a GPU pre-crisis.

...

It's efficient because it's 5nm, but it's also a lesser tier GPU. I have another thread where I discuss this in detail, but here's the chart that shows the RTX 4000 series binning scheme relative to the RTX 3000 series. The RTX 3000 series was quite good compared to the previous generation with regards to binning, basically +1, but the RTX 4000 series is proportionally bad relative to baseline, basically -1. That makes most RTX 4000 SKUs -2 bins below their RTX 3000 counterparts.

 

RTX 3000            CUDA cores  %cores | %cores  CUDA cores  RTX 4000
GA102 (full die)    10752       100%   | 100%    18432       AD102 (full die)
3090ti 24GB $2000   10752       100%   |
3090 24GB $1500     10496       97.6%  |
3080ti 12GB $1200   10240       95.2%  |
                                       | 88.9%   16384       4090 24GB $1600
3080 12GB $800      8960        83.3%  |
3080 10GB $700      8704        81.0%  |
3070ti 8GB $600     6144        57.1%  |
3070 8GB $500       5888        54.8%  | 52.8%   9728        4080 16GB $1200
3060ti 8GB $400     4864        45.2%  | 41.7%   7680        4070ti 12GB $800
3060 12GB $330      3584        33.3%  | 31.9%   5888        4070 12GB $600
3050 8GB $250       2560        23.8%  | 23.6%   4352        4060ti 8GB $400
                                       | 20.8%   3840        4060 8GB $300

 

For context, this shows the RTX 4070 12GB being practically on par with the RTX 3060 12GB in the binning scheme, based on the actual GPU Nvidia selected for each card.
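The %cores columns are just each SKU's core count divided by its generation's full die (GA102 or AD102). A minimal sketch of that calculation:

```python
# Percentage of the full die's CUDA cores each SKU is binned with.
# Full dies: GA102 = 10752 cores (RTX 3000), AD102 = 18432 cores (RTX 4000).
def bin_percent(cores: int, full_die_cores: int) -> float:
    return 100 * cores / full_die_cores

# The 3060 12GB and 4070 12GB land at nearly the same fraction of their dies:
print(f"3060 12GB: {bin_percent(3584, 10752):.1f}%")  # 33.3%
print(f"4070 12GB: {bin_percent(5888, 18432):.1f}%")  # 31.9%
```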



@Agall

 

It may not be 100% terrible, but that doesn't change the fact that ALL variations of the RTX 4060 are a literal SCAM.

 

The fact of the matter is that according to Die size and BUS width, these cards are all actually RTX 4050's, not 4060's.

 

It's a SCAM because Nvidia replaced the 5 with a 6 in order to charge $300-$500 for cards that are worth nowhere near that.

 

RTX 4060 NON-Ti should be $200. RTX 4060-Ti should be $280, and RTX 4060-Ti 16GB should be $320.

 

And until those cards get at least CLOSE to those prices, NOBODY in the entire species of human-kind should even be considering one. Do NOT allow Nvidia to continue screwing their customers over by enabling them with sales; it's downright irresponsible to buy one of these cards at launch MSRP.

Top-Tier Air-Cooled Gaming PC

Current Build Thread:

 


34 minutes ago, Agall said:

It's efficient because it's 5nm but also a lesser tier GPU. ...

I'm not really sure what to extract from that chart in relation to me drooling over a 4070 🙂 I'm not saying that Nvidia aren't overpricing their products.

That said, the numbers are somewhat arbitrary to me as I don't know whether it's reasonable to demand some number of components for a specific tier. 

Haven't the components improved from generation to generation? 

 



13 minutes ago, WallacEngineering said:

@Agall

 

It may not be 100% terrible, but that doesn't change the fact that ALL variations of the RTX 4060 are a literal SCAM.

...

I know this, I even have a whole thread discussing it. Keep in mind that TSMC is supposedly more expensive than Samsung, though I don't know by how much; just that Nvidia has said so, something like 10% (of manufacturing cost, not total cost). Inflation at 11.5% between 2021 and 2023 is factored in as well.

 

Not everyone wants to spend an absurd amount of money on a graphics card. Back in the day, you could snag a GTX 660/760 for $200-250, and if you wanted more, snag another to match or beat the flagship with SLI. SLI is dead for a reason, but its death eliminates an entire upgrade path for a lot of people.

 

I imagine there's plenty of people who wish they could've just snagged another GTX 1080ti to run SLI in 2022/2023, but most games just don't support it anymore. People now are seemingly forced to spend $500+ for a graphics card to get any measure of long-term value.

 

The GTX 1060 6GB was one of the most popular cards and one of the best long-term value cards for 1080p, and it's the card this one and the RTX 3060 12GB were aimed at succeeding. It launched at $300 in mid 2016, which would be $360 today with inflation. Comparably, the GTX 1060's die is similar in size to the RTX 4060's, only about 10 mm² apart, which tracks somewhat with the improvements between 16nm and 5nm.
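As a back-of-the-envelope check on those inflation numbers (the ~20% 2016-to-2023 figure is implied by the $300-to-$360 claim above, and the 11.5% figure is the one used earlier in this thread; neither is an official CPI number):

```python
# Adjust a launch price by a cumulative inflation figure.
def adjust_for_inflation(price: float, cumulative_inflation: float) -> float:
    return price * (1 + cumulative_inflation)

print(adjust_for_inflation(300, 0.20))   # GTX 1060 6GB, $300 in 2016 -> ~$360
print(adjust_for_inflation(330, 0.115))  # RTX 3060 12GB, $330 in 2021 -> ~$368
```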



@Agall

 

See, everything makes sense when you compare it to the old GTX 1000 series, sure, but you must compare them to the previous gen RTX 3060s, because that's far more accurate and relevant.

 

And when you do that, they are just GARBAGE


 


5 minutes ago, WallacEngineering said:

See everything makes sense when you compare it to the old GTX 1000 series sure, but you must compare them to previous gen RTX 3060s because its far more accurate

What does that even mean? Comparisons to 30 series are used as it is the most recent, but it is only one data point. In the longer term trend, it is 30 series that is unusual as it offered above-average silicon at most tiers. 30 series owners are least likely to be looking at 40 series anyway. Too soon.



5 minutes ago, porina said:

What does that even mean? Comparisons to 30 series are used as it is the most recent, but it is only one data point. In the longer term trend, it is 30 series that is unusual as it offered above-average silicon at most tiers. 30 series owners are least likely to be looking at 40 series anyway. Too soon.

 

Because comparisons to cards almost a decade old are irrelevant

 

Steve from GN even said in the review: "bad enough to warrant the word: 'scam'"


 


28 minutes ago, WallacEngineering said:

Because comparisons to cards almost a decade old is irrelevant

In absolute terms, I'd agree, but it is the trend that I find more interesting. I posted the table in other similar threads but don't have time to find it right now. If you look at Pascal, Turing, Ampere, now Ada, it is Ampere that stands out, not Ada. Ada is more consistent with history than Ampere. Basically you're punishing nvidia for making Ampere too good.

 

28 minutes ago, WallacEngineering said:

Steve from GN even said in the review: "bad enough to warrant the word: 'scam'"

I can't take him seriously for his in-house content. He only seems to make any sense when he's talking to someone external. When Intel visited him for early Arc promotion, he was visibly squirming from knowing he'd said rubbish in the past.

 

 

1 hour ago, Agall said:

Keep in mind that TSMC is supposedly more expensive than Samsung, though I don't know by how much more.

I tried to look up cost per wafer. I do not trust all the sources I found, nor was I able to cross check them. This isn't exactly information the fabs give out. Customers will have special pricing.

 

Samsung
8nm: $5k
3nm: $20k

TSMC
16/12nm: $4k
10nm: $6k
7nm: $6k (2018), $9.3k (2020) - speculated due to increased demand; Zen 2 came out 2019.
5nm: $16k-17k
3nm: $20k

 

If these numbers are even ballpark correct, then the silicon area cost might be 3x between Ampere and Ada. Had Nvidia gone TSMC for Ampere instead of Samsung, it would have been significantly more expensive.

 

I'm off to bed now. It could be an interesting exercise to take the value above, take the known last gen GPU sizes and work out the cost for each. Bonus points if you use a yield calculator. I might do it tomorrow.
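In that spirit, here's a rough sketch of the exercise. The die-per-wafer and Poisson yield formulas are the standard textbook approximations; the die areas (GA102 ~628 mm², AD102 ~608 mm²) are the published figures, the wafer prices are the ballpark numbers above, and the 0.1 defects/cm² density is purely a guess:

```python
import math

def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300) -> int:
    """Usable dies per wafer: wafer area / die area, minus an edge-loss term."""
    radius = wafer_diameter_mm / 2
    return int(math.pi * radius**2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

def poisson_yield(die_area_mm2: float, defects_per_cm2: float) -> float:
    """Fraction of dies with zero defects under a simple Poisson defect model."""
    return math.exp(-die_area_mm2 / 100 * defects_per_cm2)

def cost_per_good_die(wafer_cost: float, die_area_mm2: float,
                      defects_per_cm2: float = 0.1) -> float:
    good_dies = dies_per_wafer(die_area_mm2) * poisson_yield(die_area_mm2,
                                                             defects_per_cm2)
    return wafer_cost / good_dies

# GA102 on a ~$5k Samsung 8nm wafer vs AD102 on a ~$17k TSMC 5nm wafer
print(f"GA102: ${cost_per_good_die(5000, 628):.0f} per good die")
print(f"AD102: ${cost_per_good_die(17000, 608):.0f} per good die")
```

With these (very rough) inputs, the AD102 die comes out at around 3x the cost of GA102, which lines up with the 3x silicon-area-cost guess above.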



18 minutes ago, porina said:

Basically you're punishing nvidia for making Ampere too good.

 

I can't take him seriously for his in-house content. He only seems to make any sense when he's talking to someone external. When Intel visited him for early Arc promotion, he was visibly squirming from knowing he'd said rubbish in the past.

 

@Agall

 

Not really. It's more the fact that nearly a decade ago was basically an entirely different world as far as PC tech goes. Ray tracing didn't exist, pricing and tiers of GPUs were substantially different, there was no 600-watt-capable monster GPU, and we weren't even quite on DDR4 yet, let alone DDR5.

 

You can't really make comparisons that far back, because the economy and the science have changed so significantly that the comparisons can't be fair or 1-to-1.

 

I mean if you want to compare to GTX 1000 series then you might as well include the 900 and 700 series as well.

 

And the fact of the matter is the rest of the RTX 4000 cards show SIGNIFICANT performance increases over their RTX 3000 predecessors - even the RTX 4070 NON-Ti (although it's not quite as significant).

 

What's odd is this taper of improvement over last gen. The 4090 is miles beyond anything the 3090 could ever have imagined being, the 4080 is hugely faster than the 3080, and the 4070 is significantly faster than the 3070, although that gap isn't absolutely massive.

 

As you work your way down the product stack, the increases over last gen get smaller and smaller. And if you include AMD's upcoming 7800 XT rumored performance figures, you'll see the same exact taper happening over at AMD as well, even if it's not as extreme.

 

The taper is very strange, although I wouldn't mind it as long as improvements were made. The issue with the 4060s is there IS NO IMPROVEMENT whatsoever at 1440p, and the improvement at 1080p is minuscule, because they are 50-class cards in disguise.

 

When in the history of GPUs has a next generation literally gone BACKWARDS in CUDA core count? The answer is NEVER, until now, because in reality Nvidia is just choosing to charge people 60-class pricing for 50-class cards.

 

That's all this ever was and will ever be, end of story. It's already been proven that these are 50-class cards rebranded as 60-class cards, because Nvidia has become so complacent in their market domination that they TRULY feel they can get away with literal SCAMS. That's all it is.

 

As for Steve, sure, he isn't perfect; nobody is. But Steve is by FAR the most intelligent, deep-diving, and accurate YouTube reviewer, and there's just no arguing with that. His scientific principles and in-depth testing have far exceeded anything LMG or JaysTwoCents has done for years now, although of course Linus is hoping to change this with the Labs development.


 


2 minutes ago, WallacEngineering said:

Not really. It's more the fact that nearly a decade ago was basically an entirely different world as far as PC tech goes. Ray tracing didn't exist, pricing and tiers of GPUs were substantially different, there was no 600-watt-capable monster GPU, and we weren't even quite on DDR4 yet, let alone DDR5.

Pick your date of comparison. "nearly a decade" is just over 7 years. 10 series released in 2016. We were in DDR4 era with Skylake release the previous year, and AMD would finally catch up in another year (2017) with Ryzen launch.

 

2 minutes ago, WallacEngineering said:

You can't really make comparisons that far back, because the economy and the science have changed so significantly that the comparisons can't be fair or 1-to-1.

You're the one arguing details, not me. Again, I'm just looking at trends, particularly at silicon level, not necessarily so much at product level.

 

2 minutes ago, WallacEngineering said:

I mean if you want to compare to GTX 1000 series then you might as well include the 900 and 700 series as well.

I chose not to include Maxwell as nvidia used a different die naming structure then, which would take more effort to try and line up.

 

2 minutes ago, WallacEngineering said:

When in the history of GPUs has a next-generation literally gone BACKWARDS in cuda-core count? The answer is NEVER until now, because in reality, Nvidia is just choosing to charge people 60-class pricing for 50-class cards.

You're putting too much weight into the names. Core count is only one factor contributing to performance, and clock differences probably will more than offset that. Also that's assuming the cores are unchanged.
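That clock-versus-core-count tradeoff can be sanity-checked with a quick back-of-envelope calculation using Nvidia's published specs (3072 cores at a 2.46 GHz boost for the 4060 versus 3584 cores at 1.78 GHz for the 3060 12GB). This is only a raw FP32 throughput sketch, not a predictor of actual gaming performance:

```python
# Rough FP32 throughput estimate: CUDA cores x boost clock x 2 ops/clock (FMA).
# Spec figures are Nvidia's published numbers; real game performance also
# depends on memory bandwidth, cache, and architecture changes.

specs = {
    "RTX 3060 12GB": {"cuda_cores": 3584, "boost_ghz": 1.78},
    "RTX 4060":      {"cuda_cores": 3072, "boost_ghz": 2.46},
}

def fp32_tflops(cores: int, boost_ghz: float) -> float:
    # 2 FP32 operations per core per clock (fused multiply-add)
    return cores * boost_ghz * 2 / 1000

for name, s in specs.items():
    print(f"{name}: {fp32_tflops(s['cuda_cores'], s['boost_ghz']):.1f} TFLOPS")

ratio = fp32_tflops(3072, 2.46) / fp32_tflops(3584, 1.78)
print(f"4060 vs 3060 raw FP32 ratio: {ratio:.2f}x")
```

On paper the higher clock more than offsets the lower core count (roughly 15.1 vs 12.7 TFLOPS), which is the point being made here; whether that materializes in games, given the 4060's narrower 128-bit bus, is a separate question.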

Gaming system: R7 7800X3D, Asus ROG Strix B650E-F Gaming Wifi, Thermalright Phantom Spirit 120 SE ARGB, Corsair Vengeance 2x 32GB 6000C30, RTX 4070, MSI MPG A850G, Fractal Design North, Samsung 990 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Productivity system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, 64GB ram (mixed), RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, random 1080p + 720p displays.
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible


35 minutes ago, porina said:

Basically you're punishing nvidia for making Ampere too good.

It isn't that Ampere is "too good"; it's that Nvidia has cut down the dies on all the lower-end Ada cards to the point that they aren't the upgrade over Ampere that they should be, and at times the Ampere cards are faster. Consumers should be punishing Nvidia by not buying these cards, and it's an insult that the 4060 only has 8GB of VRAM when you could buy a 3060 12GB instead. Nvidia knows the x60 class is terrible, so they're going to try to market the cards based on power consumption.

 

19 minutes ago, WallacEngineering said:

That's all this ever was and will ever be, end of story. It's already been proven that these are 50-class cards rebranded as 60-class cards because Nvidia has become so complacent in their market domination that they TRULY feel they can get away with literal SCAMS. That's all it is.

Well, that is what Nvidia has done with the 40-series cards except the 4090: move the cards down a tier and sell them as a tier higher, with a higher-tier price. I think it's clear Nvidia doesn't care about the consumer market anymore, given how much bigger their AI market is compared to consumer gaming cards.


3 minutes ago, porina said:

You're putting too much weight into the names. Core count is only one factor contributing to performance, and clock differences probably will more than offset that. Also that's assuming the cores are unchanged.

 

No, actually, I'm doing the opposite.

 

I fully acknowledge that the name "RTX 4060" and whatever variation of the card is completely pointless and meaningless, because in reality they are 50-class cards.

 

It doesn't matter what Nvidia calls them, what matters is the performance improvement gained for the money you pay versus previous gen.

 

While I agree that core count isn't everything, and that it's fairly obvious it isn't everything, it clearly has relevance and importance because, once again, RTX 4060s show basically NO improvement over RTX 3060s, especially at 1440p.

 

Changing your wording around doesn't change the argument or the validity of said argument.

 

Again, it's very simple: Nvidia is charging 60-class pricing for 50-class cards. What is there to figure out?


 


3 minutes ago, Blademaster91 said:

Well that is what Nvidia has done with the 40 series cards, except the 4090, move the cards down a tier and sell them as a tier higher with a higher tier price. I think its clear Nvidia doesn't care about the consumer market anymore,  given how much more of a market they have in AI compared to consumer gaming cards.

 

Yeah, this is pretty true. Nvidia certainly is focusing on AI more than the consumer market, but unfortunately for them this doesn't excuse their behavior, and I think we can all agree on that.

 

Moving the tiers around can be okay - again as long as improvements are made for the money you are spending.

 

So the RTX 4090, 4080, and 4070 cards can be moved around a bit without too much issue as they still provide substantial improvements over previous gen.

 

But moving 4050s up to be 4060s and then expecting us to pay $300-$500 for no improvement whatsoever is just asinine.

 

It's like slapping a Porsche badge on a Honda Civic and then trying to sell it used for $50,000. That's quite literally what Nvidia is doing here, although obviously the money difference is nowhere near as extreme.


 

