
Wonder why RTX 4000 series feels like poor value?

RTX 3000              CUDA Cores   % of full    % of full   CUDA Cores   RTX 4000
GA102 (full die)      10752        100%         100%        18432        AD102 (full die)
3090ti 24GB $2000     10752        100%
3090 24GB $1500       10496        97.6%
3080ti 12GB $1200     10240        95.2%
                                                88.9%       16384        4090 24GB $1600
3080 12GB $800        8960         83.3%
3080 10GB $700        8704         81.0%        79.2%       14592        4090D 24GB $1600
3070ti 8GB $600       6144         57.1%        55.6%       10240        4080 S 16GB $1000
3070 8GB $500         5888         54.8%        52.8%       9728         4080 16GB $1200
3060ti 8GB $400       4864         45.2%        45.8%       8448         4070ti S 16GB $800
                                                41.7%       7680         4070ti 12GB $800
                                                38.9%       7168         4070 S 12GB $600
3060 12GB $330        3584         33.3%        31.9%       5888         4070 12GB $600
3050 8GB $250         2560         23.8%        23.6%       4352         4060ti 8GB $400
3050 6GB $250         2048         19.0%        16.7%       3072         4060 8GB $300
                                                13.9%?      2560?        4050 6GB? $250?

The important takeaways:

 

Nvidia has shifted its binning scheme, changing which cut of silicon gets called which tier of card relative to the RTX 3000 series. What I demonstrate here is that each card is upsold by at least one tier, if not two, like the RTX 3050 8GB versus the RTX 4060ti 8GB: both are, practically speaking, ~24% of the CUDA cores of their fully unlocked die, yet they sit two tiers apart in name.
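To make that concrete, here's a minimal sketch (assuming nothing beyond the core counts in the table above) that computes each card's share of its generation's full die:

```python
# Minimal sketch: a card's CUDA core count as a share of its generation's full die,
# using the counts from the table above.
FULL_DIE = {"GA102": 10752, "AD102": 18432}

cards = [
    ("RTX 3050 8GB", "GA102", 2560),
    ("RTX 4060ti 8GB", "AD102", 4352),
]

for name, die, cores in cards:
    share = cores / FULL_DIE[die]
    print(f"{name}: {cores} cores = {share:.1%} of {die}")

# RTX 3050 8GB: 2560 cores = 23.8% of GA102
# RTX 4060ti 8GB: 4352 cores = 23.6% of AD102
```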

 

Each RTX 4000 series card, excluding the RTX 4090, is about twice the price of its equivalent tier of RTX 3000 series card. Even the RTX 4090's silicon bin falls between the $800 and $1200 Ampere cards, and half of its $1600 price isn't too far off that range.

 

Yes, the die sizes aren't 1:1, but they're close. We're talking 628 mm² (GA102) versus 608 mm² (AD102), about 96.8% the same, and that tracks proportionally down to the lesser GPUs with comparable CUDA core counts (the naming isn't 1:1, so I compare die sizes).

 

@LinusTech talks about how we shouldn't compare GPU dies like this and should look purely at price/performance, but I think the RTX 4000 series shows where that approach is limited. We should clearly be getting better value and performance out of this generation, and we're not. The only way I can see to explain why is with the matrix above.

 

I've done some data gathering and analysis comparing benchmarks against this adjusted matrix, and I think it's still worth doing. When I last did it, only the RTX 4090, 4080, and 4070ti had released, so I'm unsure if it still tracks. The theory is that we should be able to measure the actual Ampere to Ada Lovelace generational improvement by using this adjusted matrix.
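As a rough sketch of what that measurement would look like (the fps values below are placeholders, not benchmark data), the adjustment is just dividing each result by the card's share of its full die before comparing generations:

```python
# Hedged sketch of the "adjusted matrix" comparison: scale each benchmark result by
# the card's share of its full die, then take the ratio across generations.
# The fps values below are placeholders, not real measurements.
def adjusted_uplift(fps_ada: float, share_ada: float,
                    fps_ampere: float, share_ampere: float) -> float:
    """Estimated Ampere -> Ada uplift once both results are scaled to a full die."""
    return (fps_ada / share_ada) / (fps_ampere / share_ampere)

# e.g. hypothetical results for two cards sitting at ~24% of their full dies
print(adjusted_uplift(fps_ada=60, share_ada=0.236, fps_ampere=40, share_ampere=0.238))
```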

 

I don't think this is a hot take, but I've gotten plenty of pushback on this subject since the RTX 4080/4070ti were announced with specs. Feel free to question the applicability of this, but ideally refute it with benchmarks showing gaps that don't line up with the adjusted intergenerational comparison (if you can).

 

If you look at the metric of $/core, this is the breakdown based on rounded-up MSRP (because $0.99 nonsense is stupid). Lower = better.

 

4090 24GB $1600-  0.0977
4080 16GB $1200-  0.123
4070ti 12GB $800- 0.104
4070 12GB $600-   0.102
4060ti 8GB $400-  0.0920

4060 8GB $300-    0.0977
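Here's the same breakdown as a quick sketch, so the numbers can be re-run if MSRPs or core counts change (values taken from the table above):

```python
# $/CUDA core at rounded-up MSRP for the RTX 4000 cards above. Lower = better.
cards = [
    ("4090 24GB",   1600, 16384),
    ("4080 16GB",   1200,  9728),
    ("4070ti 12GB",  800,  7680),
    ("4070 12GB",    600,  5888),
    ("4060ti 8GB",   400,  4352),
    ("4060 8GB",     300,  3072),
]
for name, msrp, cores in cards:
    print(f"{name:12s} ${msrp:4d}  {msrp / cores:.4f} $/core")
```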


You only looked at Ampere. Also look at Turing, Pascal, maybe Maxwell is stretching it a bit as I think their die naming was different then. Basically Ampere is the exception and offered far more than historic trends. Ada is more in line with Turing and Pascal, at least based on die class. I didn't look at it from core count.


3 minutes ago, porina said:

You only looked at Ampere. Also look at Turing, Pascal, maybe Maxwell is stretching it a bit as I think their die naming was different then. Basically Ampere is the exception and offered far more than historic trends. Ada is more in line with Turing and Pascal, at least based on die class. I didn't look at it from core count.

You're right, the GTX 1080 and 1080ti have a substantial gap, something like 35% in performance, because of a similar die size and CUDA core count difference. The RTX 3000 series binning scheme overall was quite good, but it's what we've learned to expect since it's the last generation.

 

RTX 2000 was an absolute nightmare of a generation, so I tend to ignore it in all honesty.

 

This is now, though, with substantially better competition from Radeon, especially in raw rasterization performance. Nvidia thinks they can justify this through their RT performance and DLSS.


7 minutes ago, Agall said:

This is now, though, with substantially better competition from Radeon, especially in raw rasterization performance. Nvidia thinks they can justify this through their RT performance and DLSS.

What competition? I don't see the situation changed compared to RDNA2 vs Ampere. AMD only appears better value if you look at raster perf only, and are lacking pretty much everywhere else. I see them falling to 3rd place behind Intel in a couple gens if they don't step up their game.


4 minutes ago, porina said:

What competition? I don't see the situation changed compared to RDNA2 vs Ampere. AMD only appears better value if you look at raster perf only, and are lacking pretty much everywhere else. I see them falling to 3rd place behind Intel in a couple gens if they don't step up their game.

I agree entirely, but rasterization performance is still the dominant aspect in gaming. Ray tracing is still not that widespread, and FSR demonstrates that you don't need tensor architecture to do good upscaling.

 

I will say that Intel Arc is getting better and likely will be competitive with Radeon in its next generation (if they keep progressing) at the low end. I personally own an Intel Arc A380 since it was only $120 with a dedicated AV1 encoder and 4 display outputs. Surprisingly capable card overall, but I don't see myself recommending the A750/A770 with its Radeon competition being generally better at everything for the same price.

 

Radeon's going to need another generation of MCM GPU architecture to really refine it, but they've been <1 generation behind on ray tracing.


9 hours ago, Agall said:

I agree entirely, but rasterization performance is still the dominant aspect in gaming. Ray tracing is still not that widespread, and FSR demonstrates that you don't need tensor architecture to do good upscaling.

RT is getting added where it matters: in new big budget releases. There's not a lot we can do about the back catalog although the occasional update happens, like Witcher 3, and Portal RTX. I consider raster "good enough" already on the mid to upper range. RT perf makes more of a difference.

 

9 hours ago, Agall said:

I personally own an Intel Arc A380 since it was only $120 with a dedicated AV1 encoder and 4 display outputs. Surprisingly capable card overall, but I don't see myself recommending the A750/A770 with its Radeon competition being generally better at everything for the same price.

I got an A380 too. Once I popped it in a supported system and exorcised traces of nvidia driver left behind, it worked fine. If nvidia disappeared overnight, I can see myself picking Arc over Radeon on the lower end.

 

9 hours ago, Agall said:

Radeon's going to need another generation of MCM GPU architecture to really refine it, but they've been <1 generation behind on ray tracing.

I see manufacturing methods like this as increasing value, but not increasing performance. At least, not in the current implementations. If AMD could do something with their GPU more like what Apple, or even Intel Sapphire Rapids did, that would be more interesting. I don't see AMD's RT perf in a generational sense, but more a perf tier sense. They're still about a tier below nvidia on RDNA2 and RDNA3. Because they increased raster so much, proportionately it made RT acceptable on the higher end but I think the upper mid range will suffer for it when they get around to releasing it.

 

Oh, I dug out my table of dies, and updated it for recent releases.

 

             900     1000    2000    3000    4000
90                                   GA102   AD102
80 Ti        GM200   GP102   TU102   GA102
80           GM204   GP104   TU104   GA102   AD103
70 Ti/Super          GP104   TU104   GA104   AD104
70           GM204   GP104   TU106   GA104   AD104
60 Ti/Super                  TU106   GA104   AD106
60           GM206   GP106   TU106   GA106

 

 

GA102 on the 3080 set up unrealistic expectations for some.


Nvidia is banking on PCIe 4.0 and future versions for these features. They're using up silicon for AI stuff, which demands support from the rest of the system, mainly the CPU. That's why they can say "you can manage with only 8GB": if the GPU can share a direct link to system RAM and rely on CPU features for some tasks, it technically works. But you are adding so many points of failure into the chain, and it has become apparent that salespeople have abused those natural points of failure to upsell around the "bottleneck".

 

That said, don't forget Nvidia is the majority market share holder. It kinda has to ration resources. If you scale the availability of VRAM and other components over millions of units, offering 8GB cards in 2023 with AI gimmicks starts to make sense. Still, gouging the price to all hell is one really crappy practice. THIS is what makes Nvidia so bad for value: they've oversaturated the market and will underperform in the near future.

You will be upgrading with Nvidia MUCH MUCH sooner than with any current AMD offering.


6 hours ago, porina said:

RT is getting added where it matters: in new big budget releases. There's not a lot we can do about the back catalog although the occasional update happens, like Witcher 3, and Portal RTX. I consider raster "good enough" already on the mid to upper range. RT perf makes more of a difference.

 

I got an A380 too. Once I popped it in a supported system and exorcised traces of nvidia driver left behind, it worked fine. If nvidia disappeared overnight, I can see myself picking Arc over Radeon on the lower end.

 

I see manufacturing methods like this as increasing value, but not increasing performance. At least, not in the current implementations. If AMD could do something with their GPU more like what Apple, or even Intel Sapphire Rapids did, that would be more interesting. I don't see AMD's RT perf in a generational sense, but more a perf tier sense. They're still about a tier below nvidia on RDNA2 and RDNA3. Because they increased raster so much, proportionately it made RT acceptable on the higher end but I think the upper mid range will suffer for it when they get around to releasing it.

 

Oh, I dug out my table of dies, and updated it for recent releases.

 

             900     1000    2000    3000    4000
90                                   GA102   AD102
80 Ti        GM200   GP102   TU102   GA102
80           GM204   GP104   TU104   GA102   AD103
70 Ti/Super          GP104   TU104   GA104   AD104
70           GM204   GP104   TU106   GA104   AD104
60 Ti/Super                  TU106   GA104   AD106
60           GM206   GP106   TU106   GA106

 

 

GA102 on the 3080 set up unrealistic expectations for some.

It's not just one card that's upsold. If it were just the RTX 4080, then sure, don't buy it, especially at $1200, but every card is upsold 2 tiers this generation minus the RTX 4090 (which is only upsold by about 1.5).


5 hours ago, PriitM said:

Nvidia is banking on PCIe 4.0 and future versions for these features. They're using up silicon for AI stuff, which demands support from the rest of the system, mainly the CPU. That's why they can say "you can manage with only 8GB": if the GPU can share a direct link to system RAM and rely on CPU features for some tasks, it technically works. But you are adding so many points of failure into the chain, and it has become apparent that salespeople have abused those natural points of failure to upsell around the "bottleneck".

 

That said, don't forget Nvidia is the majority market share holder. It kinda has to ration resources. If you scale the availability of VRAM and other components over millions of units, offering 8GB cards in 2023 with AI gimmicks starts to make sense. Still, gouging the price to all hell is one really crappy practice. THIS is what makes Nvidia so bad for value: they've oversaturated the market and will underperform in the near future.

You will be upgrading with Nvidia MUCH MUCH sooner than with any current AMD offering.

Publicly traded companies will act in the interest of shareholders, so why would they make an RTX 3080 20GB or RTX 3070ti 16GB that could stand the test of time in a marketplace where their competition is already doubling up the memory bus?

 

That just prevents people from buying their product next year.

 

I also suspect that the shareholders were expecting crypto-mining-level profits, so Nvidia is taking the opportunity to maintain those profits by increasing margins. Just use a lower bin of silicon and double up the price: now there's a 20-30% performance gap to last generation in what should be more like 2x, along with a bigger memory bus.


I bought my Nvidia card used - I'm also considering buying another one used - 3090.

 

The longer I wait - perhaps, some used 4080s will enter the used marketplace.   F-Nvidia - I will help out some guy selling their card.   

 

But when I see posts like this - it just seems like 'anti-Nvidia' rhetoric, because the context or vibe seems to be 'AMD has value.' I disagree. I don't think AMD cards are providing much value. Yes, if all you do is game and you want decent vram - I guess it's okay. But their new cards are crappy value. They're way overpriced - and only a little better than the 6900 and 6950 XT cards? A used one would provide value. Their new 7900 series? Nah. They need further price drops for that to happen.

 

What are the complaints with AMD cards:

7900 series - VR bugs, problems; poor productivity for the $$ you spend; driver issues, stuttering/crashes

6800 / 6900 series - poor productivity (if the card is new - for the $$ you spend); driver issues, stuttering/crashes

 

This is just what I read. I assert that the poor productivity is absolutely true - there's practically no evidence that they're any good at productivity, except for a few benchmarks for some video editing software - which piqued my interest in the 7900 in the first place. But overall, the 7900 series is poor value for the $$ - it's inferior to the 40 series in power consumption/efficiency and productivity.

 

The only arguments in its favour are FSR (one can debate FSR vs DLSS - I don't really care about this), a price lower than the higher tier 4080/4090 - um, okay, so? - and the Adrenalin software being better than Nvidia's - score one win.

 

I don't perceive that many advantages - except the price is somewhat lower and you get more VRAM / per card comparison - but, this is mostly the older gen - and only with the higher tier cards of the previous gen - and it's often compared to 40 series lower tier/mid tier cards.

Try comparing 6700, 6800, 6900 vs 3090 (where the comparisons should be) - the 3090 used vs 6900 or 6950 XT.

 

7900 XT - 20GB; 7900 XTX - 24GB - this is good but it's the only advantage - these cards also had overheating issues - and problems connecting to multiple monitors plus idle power consumption issues - I think these issues are still ongoing?

 

Nvidia has some power connector issues - melting - but, I suspect that should be solved soon - the AMD card problems sound more serious - and it's unknown whether it's hardware or software (driver) related? 

 

For only gaming - sure, AMD cards are better value - but, the fact AMD cards offer little else should be cause for complaint and criticism, not celebration.   It just means Nvidia has no competition.   

 

Hopefully, Intel changes things - but, I won't be holding my breath on that one.   


36 minutes ago, Paul17 said:

I bought my Nvidia card used - I'm also considering buying another one used - 3090.

 

The longer I wait - perhaps, some used 4080s will enter the used marketplace.   F-Nvidia - I will help out some guy selling their card.   

 

But when I see posts like this - it just seems like 'anti-Nvidia' rhetoric, because the context or vibe seems to be 'AMD has value.' I disagree. I don't think AMD cards are providing much value. Yes, if all you do is game and you want decent vram - I guess it's okay. But their new cards are crappy value. They're way overpriced - and only a little better than the 6900 and 6950 XT cards? A used one would provide value. Their new 7900 series? Nah. They need further price drops for that to happen.

 

What are the complaints with AMD cards:

7900 series - VR bugs, problems; poor productivity for the $$ you spend; driver issues, stuttering/crashes

6800 / 6900 series - poor productivity (if the card is new - for the $$ you spend); driver issues, stuttering/crashes

 

This is just what I read. I assert that the poor productivity is absolutely true - there's practically no evidence that they're any good at productivity, except for a few benchmarks for some video editing software - which piqued my interest in the 7900 in the first place. But overall, the 7900 series is poor value for the $$ - it's inferior to the 40 series in power consumption/efficiency and productivity.

 

The only arguments in its favour are FSR (one can debate FSR vs DLSS - I don't really care about this), a price lower than the higher tier 4080/4090 - um, okay, so? - and the Adrenalin software being better than Nvidia's - score one win.

 

I don't perceive that many advantages - except the price is somewhat lower and you get more VRAM / per card comparison - but, this is mostly the older gen - and only with the higher tier cards of the previous gen - and it's often compared to 40 series lower tier/mid tier cards.

Try comparing 6700, 6800, 6900 vs 3090 (where the comparisons should be) - the 3090 used vs 6900 or 6950 XT.

 

For only gaming - sure, AMD cards are better value - but, the fact AMD cards offer little else should be cause for complaint and criticism, not celebration.   It just means Nvidia has no competition.   

 

Hopefully, Intel changes things - but, I won't be holding my breath on that one.   

Personally, I've bought several cards from each of last generation, RX 6000 and RTX 3000, and I've purchased Intel Arc and RTX 4000 series cards. Every purchase has a specific goal in mind, and the objective is to find the best card for that role. Not every purchase is value oriented and it doesn't have to be, but the problem the RTX 4000 series has is that it makes its value argument disingenuously.

 

Lesser quality silicon for a higher price is the TLDR. I'm simply pointing out what becomes obvious once you've pulled back the curtain on a product tier hierarchy like their xx50/60/70/80/90 system.

 

If you're comfortable with second hand parts, then go for it. Just realize that not everyone is, just like not everyone wants to spend $1600 on a graphics card regardless of whether they can afford it. Every product has its compromises, whether it's specific features, price, efficiency, etc. The same goes for Radeon or Intel GPUs; not everyone should buy them because of the nuances. Personally, my brother didn't get an upgrade from an RTX 3070 to an RX 6900 XT for this very reason.

 

Outside of clearly bad cards like the RTX 4060ti, things are more nuanced and subjective. I can point out how terrible a deal the RTX 4080 16GB is all I want, but if someone isn't willing to spend RTX 4090 tier money, is willing to spend $1200, and wants the best ray tracing performance they're willing to pay for, it's hard to argue against.

 

People who say "there's no such thing as a bad product, only a bad price" forget that the name is part of the product. The RTX 4060ti, for example, is a bad product simply because it's not an xx60ti tier card. The silicon itself is clearly capable at a low wattage, but it's really an RTX 4050/4050ti part. If they called it the RTX 4050ti 8GB and sold it at the same price, then it wouldn't be a bad product, just a bad price.

 

The same goes for the previously named RTX 4080 12GB, which would've been an equivalently bad product, but isn't nearly as bad now that it's named the RTX 4070ti.

 

I believe a product can be bad simply because of how it's named and marketed, especially when that product is designed around a tiering system. Imagine if any other company manipulated its product hierarchy in order to increase its profits.

 

It would be like a steel manufacturer upselling the grades it brands because it figured out a manufacturing technique that increases hardness by 10% for the same cost, so it simply shifts its product hierarchy to increase profits. Yet if you look at the specs, you see a lesser grade of steel with roughly the same hardness as before, now at a higher price.


53 minutes ago, Agall said:

Personally, I've bought several cards from each of last generation, RX 6000 and RTX 3000, and I've purchased Intel Arc and RTX 4000 series cards. Every purchase has a specific goal in mind, and the objective is to find the best card for that role. Not every purchase is value oriented and it doesn't have to be, but the problem the RTX 4000 series has is that it makes its value argument disingenuously.

 

Lesser quality silicon for a higher price is the TLDR. I'm simply pointing out what becomes obvious once you've pulled back the curtain on a product tier hierarchy like their xx50/60/70/80/90 system.

 

If you're comfortable with second hand parts, then go for it. Just realize that not everyone is, just like not everyone wants to spend $1600 on a graphics card regardless of whether they can afford it. Every product has its compromises, whether it's specific features, price, efficiency, etc. The same goes for Radeon or Intel GPUs; not everyone should buy them because of the nuances. Personally, my brother didn't get an upgrade from an RTX 3070 to an RX 6900 XT for this very reason.

 

Outside of clearly bad cards like the RTX 4060ti, things are more nuanced and subjective. I can point out how terrible a deal the RTX 4080 16GB is all I want, but if someone isn't willing to spend RTX 4090 tier money, is willing to spend $1200, and wants the best ray tracing performance they're willing to pay for, it's hard to argue against.

 

People who say "there's no such thing as a bad product, only a bad price" forget that the name is part of the product. The RTX 4060ti, for example, is a bad product simply because it's not an xx60ti tier card. The silicon itself is clearly capable at a low wattage, but it's really an RTX 4050/4050ti part. If they called it the RTX 4050ti 8GB and sold it at the same price, then it wouldn't be a bad product, just a bad price.

 

The same goes for the previously named RTX 4080 12GB, which would've been an equivalently bad product, but isn't nearly as bad now that it's named the RTX 4070ti.

 

I believe a product can be bad simply because of how it's named and marketed, especially when that product is designed around a tiering system. Imagine if any other company manipulated its product hierarchy in order to increase its profits.

 

It would be like a steel manufacturer upselling the grades it brands because it figured out a manufacturing technique that increases hardness by 10% for the same cost, so it simply shifts its product hierarchy to increase profits. Yet if you look at the specs, you see a lesser grade of steel with roughly the same hardness as before, now at a higher price.

I totally agree with you. I wasn't really criticizing your post per se. Not at all. I agreed with all of it. But I think AMD needs to be slotted in there as well.

 

I'm not comfortable at all with second hand cards. Actually, I am very hesitant, and it was nerve-wracking and unsettling shopping for one - I hated dealing with sellers and I am sure many were annoyed with me too - it goes both ways - everyone is out to get something very exact and knows what they want. But I looked at the advantages - avoiding giving Nvidia my money and avoiding paying tax. Those are pretty good?

 

But then there is a lot of risk too, which was the real negative part. As for AMD cards - I was looking at those, and the more I researched the 6000 series - I was considering the 6950 XT, and I found some 2nd hand ones - the more it disappointed me. Not only did the AMD AIB partner cards come with no transferable warranty, but other than gaming they provided really poor performance in productivity - and thus were poor value. Ppl wanted as much as for a 3080 Ti or 3090 card!

New AMD cards - are a slightly different story - but not much better - a lot of the same complaints can be applied to them.   I tried to summarize them in my previous post.  

I understand - many gamers won't care about these concerns and complaints - but, they should - even if it doesn't apply to them.  


2 minutes ago, Paul17 said:

I totally agree with you. I wasn't really criticizing your post per se. Not at all. I agreed with all of it. But I think AMD needs to be slotted in there as well.

 

I'm not comfortable at all with second hand cards. Actually, I am very hesitant, and it was nerve-wracking and unsettling shopping for one - I hated dealing with sellers and I am sure many were annoyed with me too - it goes both ways - everyone is out to get something very exact and knows what they want. But I looked at the advantages - avoiding giving Nvidia my money and avoiding paying tax. Those are pretty good?

 

But then there is a lot of risk too, which was the real negative part. As for AMD cards - I was looking at those, and the more I researched the 6000 series - I was considering the 6950 XT, and I found some 2nd hand ones - the more it disappointed me. Not only did the AMD AIB partner cards come with no transferable warranty, but other than gaming they provided really poor performance in productivity - and thus were poor value. Ppl wanted as much as for a 3080 Ti or 3090 card!

New AMD cards - were a slightly different story - but not much better - a lot of the same complaints can be applied to them.   I tried to summarize them in my previous post.  

Where I can't find a scenario where Radeon is comparable to my OP is just the way their RDNA product stack has gone. The RX 5000 series only had a few cards, the highest being the 5700 XT. The RX 6000 series was a whole generation, but with RX 7000 not only do we so far have just the two high end cards (probably the highest, potentially not) and one very low end card, it's also a first generation MCM architecture.

 

With those variables in mind, I don't think there's enough data to make a comparable matrix for Radeon right now. Radeon overall progressed immensely with the RX 6000 series, and the RX 7000 series so far is setting a solid roadmap for sustainable production with MCM, but it's still up in the air.

 

Intel Arc has been good so far, in my opinion better than I expected. I'm happy to have invested in it with an Intel Arc A380, and I'd buy the 16GB A770 variant if I could justify a place for it. The A380, in my 3950x system, was playing Diablo 3 (high/med) and HotS (high preset) at 4K at a relatively smooth 60-100 fps. I've even thought about throwing it into my 7950x3D build to test some games out like Warframe, but I just haven't gotten the time with Diablo 4 and other projects I'm working on.

 

I've thought about building a 13100F rig and throwing this A380 in it to have an SFX budget Intel rig just to play with and benchmark. It's always interesting to me to see how much performance you can get in most games from a system that costs less than most individual parts in my rig.


Forgot to mention it earlier: one of the problems I have with this type of thread is how do you decide what a product "should" be? Based on die naming, nvidia were not wrong to come up with the 4080 12GB initially. The 4080 16GB would historically better fit as a Ti. Is core count more important? Should it be relative to the top end? That implies the top end shouldn't move outside historic norms gen on gen.

 

It's not an exact science and there's some variability, but generally an nvidia card of the newer generation is roughly equivalent to a card about a tier to a tier-and-a-half higher in the previous gen*. The 1070 was comparable to the 980 Ti. The 2070 slotted between the 1080 and the Ti. The 3070 was almost a 2080 Ti. The 4070 is comparable to the 3080. Just realised I have owned every nvidia 70 GPU from Maxwell to Ada, but only 80 Ti GPUs from Maxwell to Turing. I'm not so familiar with the 60 tier GPUs, so it would be interesting to see how that progressed generationally. I don't have the time to go over reviews of the time right now. Note I'm basing this off testing done at the release of the newer gen of each pairing. Relative positions can change with much newer games, since older cards seem to fall off a little more there.

 

*Again Ampere is a bit of an exception, since the 80/Ti/90 were all very close to each other. The 80 was just too good.


17 minutes ago, porina said:

Forgot to mention it earlier: one of the problems I have with this type of thread is how do you decide what a product "should" be? Based on die naming, nvidia were not wrong to come up with the 4080 12GB initially. The 4080 16GB would historically better fit as a Ti. Is core count more important? Should it be relative to the top end? That implies the top end shouldn't move outside historic norms gen on gen.

 

It's not an exact science and there's some variability, but generally an nvidia card of the newer generation is roughly equivalent to a card about a tier to a tier-and-a-half higher in the previous gen*. The 1070 was comparable to the 980 Ti. The 2070 slotted between the 1080 and the Ti. The 3070 was almost a 2080 Ti. The 4070 is comparable to the 3080. Just realised I have owned every nvidia 70 GPU from Maxwell to Ada, but only 80 Ti GPUs from Maxwell to Turing. I'm not so familiar with the 60 tier GPUs, so it would be interesting to see how that progressed generationally. I don't have the time to go over reviews of the time right now. Note I'm basing this off testing done at the release of the newer gen of each pairing. Relative positions can change with much newer games, since older cards seem to fall off a little more there.

 

*Again Ampere is a bit of an exception, since the 80/Ti/90 were all very close to each other. The 80 was just too good.

Historical precedent, like a previous generation, is how I decide what 'should' be done. Deviation from that is precisely what I'm pointing out, since those cards have set a certain reputation for themselves based on it.

 

The RTX 3060ti/3070 were great mid range cards at a decent price; I can't say the same for the 4060ti/4070.


On 6/19/2023 at 3:36 PM, Agall said:

@LinusTech talks about how we shouldn't compare GPU dies like this and should look purely at the price/performance, but I think RTX 4000 series shows where this is limited. We should clearly be getting better value and performance out of this generation and we're not. The only way I see to explain why is with this matrix above.

Good chart. One should note that architecturally, Ampere and Lovelace are extremely similar. The biggest differences are the added cache in Lovelace and, obviously, the move to TSMC 4nm (really a modified 5nm) from Samsung 8nm (which was really a modified 10nm), with some other differences around smaller parts of the overall die, such as the optical flow accelerator. But the core of both architectures is otherwise largely the same, and I think that warrants comparison.

 

Also, regardless of architecture, die sizes have been a fairly reliable indication of where a given product is in the stack and in turn the respective pricing. A 379 mm² die product selling for $1200 is just absurd and there is no manufacturing cost rationale for it, and the same can be said for the rest of the AD104 and AD106 products.
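As a back-of-the-envelope illustration (a sketch only: the die areas are the approximate figures mentioned in this thread, and it ignores yields, VRAM, board and cooler costs), here is MSRP per mm² of die:

```python
# Rough $/mm^2 of GPU die at MSRP, using approximate die areas cited in this thread.
# Ignores yields, VRAM, PCB, cooler and margin differences - illustration only.
cards = [
    ("RTX 3080 10GB", 700,  "GA102", 628),
    ("RTX 4090 24GB", 1600, "AD102", 608),
    ("RTX 4080 16GB", 1200, "AD103", 379),
]
for name, msrp, die, area_mm2 in cards:
    print(f"{name}: ${msrp / area_mm2:.2f} per mm^2 of {die}")
```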


@Agall

 

I've said this a few times before:

 

I don't think the pricing of these new gen cards is absolutely horrid, but it isn't great either. Mediocre could be a good word for it I guess.

 

Last gen was absolutely horrid pricing. There is a reason why the RTX 3090-Ti started out as a $2000 MSRP card that is now worth ~$600 when considering its used status versus what is available from current gen with a warranty.

 

AMD is a big reason though; they offer more performance and more VRAM at substantially less cost at every level up to the RTX 4080, although it's the same thing they have always done because of AMD's lack of performance in professional workloads.

 

There are 3 major things most people overlook that make the 2022-2023 generation cards seem like a worse value than they actually are:

 

1.) Inflation. If the GTX 1080-Ti were released today I guarantee you it would be a $1200+ card. We make more money and spend more money than we did back in the days of the GTX 1000 series. The value of the dollar keeps on dropping and will until we hit another depression and the economy is forced to rebuild.

 

2.) Massive generational uplift. GPUs used to see an uplift in performance of about 30% from generation to generation. Now it's 50-75%. The RTX 4090 compares to the RTX 3090 and the RX 7900-XTX compares to the RX 6900-XT.

 

3.) We are approaching the limits of what traditional silicon transistors can do. As we approach a brick wall limitation of a given technology, developing that technology further gets exponentially more expensive. Even bearing Inflation in mind, the only possible trend from here on out is upwards in price for how much uplift you receive in return versus previous gen. In other words, price to performance gains from generation to generation will decline from now until the day carbon nanotubes are commonplace and cheap to manufacture.

 

Years ago, when TSMC first began developing 5nm transistors, articles came out stating that TSMC engineers had serious concerns that 5nm transistors were so dense that a CPU core might be impossible to cool properly, with the core generating heat more quickly than the IHS and cooling solution could dissipate it - meaning thermal runaway no matter your cooling solution.

 

In the end, they found a way to make it work, but look how hot the current generation of CPUs is...

 

Their concerns may not have been too far from reality and we may soon be forced to switch to carbon nanotubes which will be extremely expensive for the first few generations that have them.

 

But for now, the solution is simple: If you only play games or maybe just do a bit of video editing on the side for fun, get an AMD card. If you actually use your GPU for genuine computational/professional workloads that make you money, then accept Nvidia's ridiculous overpricing and remind yourself that your shiny new RTX 4080 or whatever will pay you back over time.

 

AMD drivers are pretty good these days. Still not perfect but plenty good enough for anyone who takes the time to set up their card and drivers properly. Doesn't take that long anyways.

 

I've been on AMD since the RX 5700-XT first came to market. I can attest that there is nothing to be afraid of; it's just a different process is all.

 

Unfortunately Linus is correct. You should only measure by price-to-performance ratio because we will never know how much inflation has eaten into the profits of these manufacturers, and how much the increase in cost to further technology has to be passed onto us consumers.

 

In other words, comparing dies from generation to generation will unfortunately never lead to an accurate prediction because of these changes in manufacturing cost, plus the changes in architecture or features that also add or subtract from cost. There's just no real way to compare die costs because we consumers are not privy to that information, as it's obviously a trade secret.


This sounds like a team green loser problem.

I’ll care about nvidias pricing and terrible market abuse when they can provide something at least on par with the $329, 16gb Arc A770


So while nvidia butt blasts their customers and won’t even give them a reach around, AMD sits in the corner eating paint chips, and intel half bakes another gpu project, the only viable answer is to reject modernity and buy a 750ti.

 


I get the charts, I get the dialogue... but experience, on my terms, is also important. I bought a 3080 back when it launched and I LOVED that card; it was great value. I paid 6200 DKK for an Asus ROG card, which market-wise was way cheaper than the 2080 it replaced.

 

For me it replaced my 1080 Ti, which was still a great card. I felt a move in performance, but not an insane one.

 

I got to play with a 4090 here a while back. I get that it is like 2.5x the price of my 3080, but that card has INSANE performance. It is just such a big leap, but maybe it is just this tick-tock shit, because the 2080 was also quite expensive, and there you did NOT get the value compared to the 1080 Ti, which was a way cheaper card.

 

I also bought a 4070 Ti for my brother, which is performance-wise close to a 3090. A 3090 in DK was 11k+ DKK and the 4070 Ti is 6100 DKK, so I don't know...

 

I don't think it is that insane, although of course I miss the old days. But I do still remember paying 4k for an Orchid Righteous Voodoo card when they launched, and also 4.5k for the first 450 MHz Pentium III...

 

And money back then was worth way more.


8 hours ago, Agall said:

The RTX 3060ti/3070 were great mid range cards at a decent price; I can't say the same for the 4060ti/4070.

Specifically on the 4070, I think it is the sweet spot of current offerings on team green. The absolute pricing is affordable to most building a mid-high end system. Performance is basically 3080 level. Plenty for 1440p, and decent at 4k with sensible settings. It is priced lower than a 3080 at launch MSRP (£589 vs £649), and comparable to cheapest new 3080 today (one is offered at £580 from a large supplier I often use).

 

3070 did launch lower in UK at £469 but outside of the few people that grabbed FE models, I'm not sure that price was achieved. Thanks to shortages I wasn't able to buy one for 4 months and had to pay £720 for the privilege.

 

1 hour ago, 8tg said:

I’ll care about nvidias pricing and terrible market abuse when they can provide something at least on par with the $329, 16gb Arc A770

I suspect neither red nor green will directly compete with current gen Arc because Intel is in a growth position, and the established duo are more in a sustain/defend position. I'd even question whether Intel are making a profit on Arc dGPUs. Building up market share and getting established are more important to them for now, even if they make a short term loss.


17 hours ago, Agall said:

People who say "there's no such thing as a bad product, only a bad price" forget that the name is part of the product. The RTX 4060ti, for example, is a bad product simply because it's not an xx60ti tier card. The silicon itself is clearly capable at a low wattage, but it's really an RTX 4050/4050ti part. If they called it the RTX 4050ti 8GB and sold it at the same price, then it wouldn't be a bad product, just a bad price.

I've been following this with interest, although I don't fully understand the nuances. What some people say - "there's no such thing as a bad product, only a bad price" - is rubbish. If that were true, Youtubers like Krazy Ken would have squat to do. There are tons of bad products out there - anyone remember the Yugo? Or how about the MSI and Gigabyte mobos causing the CPU melt-downs? Someone might argue that was a settings problem, but if that were true, no one would've needed a BIOS update. Then there were the Samsung SSDs that had firmware that was corrupting them. Bad.

 

Anyways, that said the cards are not bad - Nvidia uses slimy marketing tactics and manipulates the market. It's no wonder that they continue to operate based on greed - people are willing to pay for it.

 

Is the A380 or another Intel card a worthy contender against any recent AMD or Nvidia products? How does it compare to an RX 6800 XT? What is that comparable to from Nvidia?


1 hour ago, RevGAM said:

I've been following this with interest, although I don't fully understand the nuances. What some people say - "there's no such thing as a bad product, only a bad price" - is rubbish. If that were true, Youtubers like Krazy Ken would have squat to do. There are tons of bad products out there - anyone remember the Yugo? Or how about the MSI and Gigabyte mobos causing the CPU melt-downs? Someone might argue that was a settings problem, but if that were true, no one would've needed a BIOS update. Then there were the Samsung SSDs that had firmware that was corrupting them. Bad.

 

Anyways, that said the cards are not bad - Nvidia uses slimy marketing tactics and manipulates the market. It's no wonder that they continue to operate based on greed - people are willing to pay for it.

 

Is the A380 or another Intel card a worthy contender against any recent AMD or Nvidia products? How does it compare to an RX 6800 XT? What is that comparable to from Nvidia?

It can be argued that AMD video cards are 'bad' products. I would only consider a 'cheap' or used 6000 series card. Look at what it offers: high power consumption. It's only good for gaming. Performance in other areas - you name it, compute/GPGPU/video editing - is mediocre at best. If you are paying full price for any of the cards - I think it's bad. You may argue (will argue) from the gamer's perspective that 'nothing else matters.' AMD has professional cards - but does anyone use them? Why would they pick one over an Nvidia (professional or consumer) card? Why are the consumer cards not usable in areas other than gaming? Nvidia cards might be expensive or gimped - but I see value in a used one (at least, providing it's fully functional) - since it provides the widest options of use.

 

Over $1000+ for a card that is mainly a 'gaming' card?   Ppl see value in that?   Well, I cannot see the logic.

 

The 7900 series seems to have made strides in other areas somewhat - and is an improvement over the 6800/6900 series (how much, and whether it's enough for the price, is up for debate) - and that's good - but even AMD's lower tier of new gen cards have lower vram. Many youtubers point this out and question the value. Same as Nvidia. But with AMD cards - they have a specific user in mind and narrow down their market.

 

The charts are interesting but one of the reasons Nvidia is able to gimp cards and then market them as 'next gen improvements/enhancements' is because AMD isn't competing in other spheres.   Nvidia's 'monopoly' on 'jack of all trades' graphics cards allows them to execute these practices - so when you don't care 'because you just game' - it enables Nvidia - I guess choosing an AMD card helps - but, as long as Nvidia can be a 'do all' card - some ppl will be forced to bite their lip and pick them.   I'll try to choose used - when I can - because I don't want to enable them.   


7 hours ago, 8tg said:

This sounds like a team green loser problem.

I’ll care about nvidias pricing and terrible market abuse when they can provide something at least on par with the $329, 16gb Arc A770


So while nvidia butt blasts their customers and won’t even give them a reach around, AMD sits in the corner eating paint chips, and intel half bakes another gpu project, the only viable answer is to reject modernity and buy a 750ti.

 

The GTX 750ti was a chad of a GPU. I had one back in the day that I tested Battlefield with. It ended up being my PhysX GPU in an SLI 780ti system, because why not (hilariously overkill given the dead feature and how terrible SLI is).

 

5 hours ago, porina said:

Specifically on the 4070, I think it is the sweet spot of current offerings on team green. The absolute pricing is affordable to most building a mid-high end system. Performance is basically 3080 level. Plenty for 1440p, and decent at 4k with sensible settings. It is priced lower than a 3080 at launch MSRP (£589 vs £649), and comparable to cheapest new 3080 today (one is offered at £580 from a large supplier I often use).

 

3070 did launch lower in UK at £469 but outside of the few people that grabbed FE models, I'm not sure that price was achieved. Thanks to shortages I wasn't able to buy one for 4 months and had to pay £720 for the privilege.

 

I suspect neither red nor green will directly compete with current gen Arc because Intel is in a growth position, and the established duo are more in a sustain/defend position. I'd even question whether Intel are making a profit on Arc dGPUs. Building up market share and getting established are more important to them for now, even if they make a short term loss.

Regional differences are always a solid counterpoint to the relative heaven we seem to get in the US when it comes to stock and MSRP pricing.

 

I'll have a better time recommending Arc if I decide to buy more than just an A380, and even then the A380 goes as low as $120 USD, like when I bought mine. It's at least 4x better than an i5 13500's iGPU from when I tested Diablo 3.

 

How's availability of the A770 in the UK compared to RTX 4000/RX 7000?


1 hour ago, RevGAM said:

I've been following this with interest, although I don't fully understand the nuances. What some people say - "there's no such thing as a bad product, only a bad price" - is rubbish. If that were true, Youtubers like Krazy Ken would have squat to do. There are tons of bad products out there - anyone remember the Yugo? Or how about the MSI and Gigabyte mobos causing the CPU melt-downs? Someone might argue that was a settings problem, but if that were true, no one would've needed a BIOS update. Then there were the Samsung SSDs that had firmware that was corrupting them. Bad.

 

Anyways, that said the cards are not bad - Nvidia uses slimy marketing tactics and manipulates the market. It's no wonder that they continue to operate based on greed - people are willing to pay for it.

 

Is the A380 or another Intel card a worthy contender against any recent AMD or Nvidia products? How does it compare to an RX 6800 XT? What is that comparable to from Nvidia?

I would say the A380 is the best bare minimum GPU you can buy right now (my opinion). I really want to do more testing on it, but I've got a few more projects across my various hobbies occupying my free time right now. I want to test Diablo 4 and Warframe with it specifically, and in my 7950x3D system. So far the only testing I've done is with Diablo 3 and HotS on the system that currently holds it, both of which were surprisingly playable at 4K high-medium settings.

 

My A380 is in my 3950x build, which is what I use for my living room TV. I've thought strongly about picking up an LGA1700 DDR4 or DDR5 motherboard and an i3 13100F, completing the build with spare parts, just to test the GPU in its own environment. I can't test it in my 4790k build because the card practically requires Resizable BAR and PCIe 4.0, which is where the 13100F would come in as the bare minimum modern purchase for a sub-$500 rig in some scenarios.


9 hours ago, 8tg said:

AMD sits in the corner eating paint chips, and intel half bakes another gpu project.

 

Lol I get the Butt Blasting and Half Baking but why is AMD eating paint chips? I guess I just don't get the reference lol 🤣

 

2 hours ago, Paul17 said:

The 7900 series seems to have made strides in other areas somewhat - and is an improvement over the 6800/6900 series (how much, and whether it's enough for the price, is up for debate)

 

The RX 7900-XTX should be compared to the RX 6900-XT and the RX 7900-XT should be compared to the RX 6800-XT.

 

Keep in mind that "XTX" isn't real, its just AMDs stupid new naming scheme to try and promote hype. The upcoming RX 7800-XT is ACTUALLY the missing 7700-XT that everyone has been asking about.

 

Anyways, AMD has made strides about as big as Nvidia this generation. The XTX is a full ~50% faster than the RX 6900-XT in 4K Ultra Gaming - with the old flagship providing just over 60 FPS in an average of 10 games while the new flagship provided just over 90 FPS at the same settings.

 


 

7 hours ago, porina said:

Specifically on the 4070, I think it is the sweet spot of current offerings on team green.

 

The RTX 4070 Non-Ti is indeed the sweet spot in Nvidia's lineup, the 70-class always has been.

 

However there is a MASSIVE problem with it. 12GB of VRAM is barely enough for 1440p Ultra in some games even RIGHT NOW. Sure, it should last you a while I guess, but overall it's a risk, and taking a risk on a $600 or $800 card feels very shitty. The 4070/Ti NEEDED to be 16GB cards, especially the Ti, as it can ACTUALLY somewhat handle 4K.

 

Basically, if you want to be safe with Nvidia, you HAVE to buy the RTX 4080, a $1200+ card, more expensive than AMD's most expensive XTX flagship. It's very rough.

 

2 hours ago, Paul17 said:

It can be argued that AMD video cards are 'bad' products.

 

Over $1000+ for a card that is mainly a 'gaming' card?   Ppl see value in that?   Well, I cannot see the logic.

 

No, they definitely are not bad products, spoken like a true Nvidia supporter.

 

I've only ever used a PC for gaming and some light video editing, mainly for fun - it doesn't make me money. I've NEVER used a PC to make money in any way.

 

Most of us have normal jobs of some sort, and therefore we don't need Nvidia cards and AMD becomes a legitimate, competitive option. That's why AMD is pulling back market share, although it's very slow and very small progress.

 

I love my XTX, it's a fantastic card and I'm loving every single moment interacting with my PC ever since I installed it. So go figure - AMD has more than satisfied me as a customer.

 

9 hours ago, RasmusDC said:

I got to play with a 4090 here a while back. I get that it is like 2.5x the price of my 3080, but that card has INSANE performance. It is just such a big leap

 

It's very true. The RTX 4090 is the world's first "HyperCard", as opposed to the RTX 4080 and RX 7900-XTX, which are both "SuperCards", otherwise known as "Flagship Class". The RTX 4090 is so outlandishly overpowered that you can't even call it a flagship; it's BEYOND a flagship.

 

And if you don't understand the reference of "HyperCard" and "SuperCard" then watch this video. You will understand just how much the world's first HyperCar changed the industry forever:

 

 

