Wonder why RTX 4000 series feels like poor value?

5 minutes ago, RevGAM said:

I think the theory of bias towards Nvidia is correct. I say this because many, many times I've seen people lean towards Nvidia because of its name, reputation, and long history leading the graphics card industry. When I see people recommending AMD cards to creators, I wince.

That's the thing - it's both true and untrue - in a way.

 

As @Paul17 said, everyone seems to have a negative disposition towards Nvidia, and yet most still buy their cards.

 

It's complete hypocrisy and has been for years.

 

It's like I always tell people who talk a bunch of smack: if you want things to change, quit bitching and vote with your wallet. Buy something that isn't an Nvidia card if you want Nvidia to change.

 

That's it - done deal, there's no way to argue with it. You either fund Nvidia's B.S. or you don't.

Top-Tier Air-Cooled Gaming PC

Current Build Thread:

 


If all you play is Cyberpunk, then sure, brag about your superior RT - I agree. Sadly, the game sucks, so?

5800X3D Stock. 32GB RAM. RX 7900 XT. Arch Linux.


The GTX 480 was a 100% leap over the 280, and the 580 was 115%. We won't see a leap like that ever again.

5800X3D Stock. 32GB RAM. RX 7900 XT. Arch Linux.


6 minutes ago, Strobel said:

People are stupid - they cry like bitches, then lick their feet by buying anyway. Nothing is dumber than a hypocrite... heck, one thing the Bible got right: it's the lowest form.

Not everyone makes value-oriented purchasing decisions - I obviously don't. That doesn't mean I'm incapable of expressing another perspective, like demonstrating why the RTX 4000 series is poor value relative to the RTX 3000 series, regardless of the performance advantage.

 

8 minutes ago, Strobel said:

If all you play is Cyberpunk, then sure, brag about your superior RT - I agree. Sadly, the game sucks, so?

I still don't think ray tracing is a worthy feature to pay the Nvidia tax for; that's why the majority of my recommendations, including several systems I built myself and helped others build, use Radeon RX 6000 or 7000 series GPUs.

 

3 minutes ago, Strobel said:

The GTX 480 was a 100% leap over the 280, and the 580 was 115%. We won't see a leap like that ever again.

Based on my calculations, the gap between Ampere and Ada Lovelace is close to that - almost triple digits, but closer to 90% when doing a direct comparison between comparable dies.
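For anyone who wants to sanity-check numbers like that themselves, here's a minimal sketch of the uplift math; the FPS figures below are placeholders for illustration, not benchmarks from this thread:

```python
# Minimal sketch of the generational-uplift math. The FPS numbers are
# placeholders, not measurements from this thread.
def uplift_pct(old_fps: float, new_fps: float) -> float:
    """Percent performance increase going from old to new."""
    return (new_fps / old_fps - 1) * 100

# Hypothetical 4K averages for comparable dies (e.g. GA102 vs AD102):
print(uplift_pct(60.0, 114.0))  # 90.0, i.e. a ~90% generational leap
```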

Ryzen 7950x3D Direct Die NH-D15, CCD1 disabled

RTX 4090 @133%/+230/+500

Builder/Enthusiast/Overclocker since 2012  //  Professional IT since 2017


I am in the same boat - I buy what my budget can realistically buy, for my own use case in particular. I don't actually care what AMD or Nvidia do to customers, though; you don't have to buy from them? lol

5800X3D Stock. 32GB RAM. RX 7900 XT. Arch Linux.


41 minutes ago, WallacEngineering said:

That's the thing - it's both true and untrue - in a way.

 

As @Paul17 said, everyone seems to have a negative disposition towards Nvidia, and yet most still buy their cards.

 

It's complete hypocrisy and has been for years.

 

It's like I always tell people who talk a bunch of smack: if you want things to change, quit bitching and vote with your wallet. Buy something that isn't an Nvidia card if you want Nvidia to change.

 

That's it - done deal, there's no way to argue with it. You either fund Nvidia's B.S. or you don't.

I didn't. Lots of people do, though. 

I've been using computers since around 1978, started learning programming in 1980 on Apple IIs, started learning about hardware in 1990, ran a BBS from 1990-95, built my first Windows PC around 2000, taught myself malware removal starting in 2005 (also learned on Bleeping Computer), learned web dev starting in 2017, and I think I can fill a thimble with all that knowledge. 😉 I'm not an expert, which is why I keep investigating the answers that others give to try and improve my knowledge, so feel free to double-check the advice I give.

My phone's auto-correct is named Otto Rong.🤪😂


57 minutes ago, WallacEngineering said:

That's the thing - it's both true and untrue - in a way.

 

As @Paul17 said, everyone seems to have a negative disposition towards Nvidia, and yet most still buy their cards.

 

It's complete hypocrisy and has been for years.

 

It's like I always tell people who talk a bunch of smack: if you want things to change, quit bitching and vote with your wallet. Buy something that isn't an Nvidia card if you want Nvidia to change.

 

That's it - done deal, there's no way to argue with it. You either fund Nvidia's B.S. or you don't.

Halo-product customers who buy the Titan and xx90-tier products will likely buy them regardless of their value proposition, since they're objectively the highest-performing products available. Personally, I've owned Titans, obviously a 4090, and almost bought a Radeon 295X2 back in the day (specifically the Asus ROG Ares III).

 

Nvidia knows those types of consumers don't care about a value proposition. I personally take no issue with that, despite being a part of that consumer class. Where I do take issue is that Nvidia knows those halo products sell the lesser products, since someone will see the RTX 3090 vs. RTX 4090 comparison and assume they'd get the same performance increase upgrading from an RTX 3060 Ti to an RTX 4060 Ti.

 

Luckily, our hardware reviewer community isn't a bunch of shills and aggressively pointed out how wrong an assumption that is. What I aim to do in this thread is demonstrate why that's the case by breaking down the difference in Nvidia's binning scheme that has led to the RTX 3060 Ti and 4060 Ti having practically the same performance.
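To make the die-tier point concrete, here's a small illustrative sketch. The die names are public spec-sheet facts; the comparison helper is purely hypothetical, just to show why "same branding" does not mean "same silicon class":

```python
# Illustrative sketch of the die-class argument. Die names are from
# public spec sheets; the helper function is hypothetical.
DIE = {
    "RTX 3090":    "GA102",
    "RTX 4090":    "AD102",   # stayed on the big 102-class die
    "RTX 3060 Ti": "GA104",
    "RTX 4060 Ti": "AD106",   # slid down from a 104- to a 106-class die
}

def same_die_tier(card_a: str, card_b: str) -> bool:
    """True if both cards use the same 102/104/106 die tier."""
    return DIE[card_a][-3:] == DIE[card_b][-3:]

print(same_die_tier("RTX 3090", "RTX 4090"))        # True
print(same_die_tier("RTX 3060 Ti", "RTX 4060 Ti"))  # False
```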

Ryzen 7950x3D Direct Die NH-D15, CCD1 disabled

RTX 4090 @133%/+230/+500

Builder/Enthusiast/Overclocker since 2012  //  Professional IT since 2017


48 minutes ago, Strobel said:

The GTX 480 was a 100% leap over the 280, and the 580 was 115%. We won't see a leap like that ever again.

Yes we will. The 580 to the 780 Ti was 100%. The 980 Ti to the 1080 Ti was maybe 85-90%. The 4090 over Ampere is close to that. We will definitely have big uplifts when comparing comparable dies gen over gen. Comparing a GTX 680 (GK104) to a GTX 580 (GF110) is a recipe for bad data, since Nvidia shifted die tiering but not pricing or branding - and functionally did that again with Lovelace in the lower stack.

Zen 3 Daily Rig (2022 - Present): AMD Ryzen 9 5900X + Optimus Foundations AM4 | Nvidia RTX 3080 Ti FE + Alphacool Eisblock 3080 FE | G.Skill Trident Z Neo 32GB DDR4-3600 (@3733 c14) | ASUS Crosshair VIII Dark Hero | 2x Samsung 970 Evo Plus 2TB | Crucial MX500 1TB | Corsair RM1000x | Lian Li O11 Dynamic | LG 48" C1 | EK Quantum Kinetic TBE 200 w/ D5 | HWLabs GTX360 and GTS360 | Bitspower True Brass 14mm | Corsair 14mm White PMMA | ModMyMods Mod Water Clear | 9x BeQuiet Silent Wings 3 120mm PWM High Speed | Aquacomputer Highflow NEXT | Aquacomputer Octo

 

Test Bench: 

CPUs: Intel Core 2 Duo E8400, Core i5-2400, Core i7-4790K, Core i9-10900K, Core i3-13100, Core i9-13900KS

Motherboards: ASUS Z97-Deluxe, EVGA Z490 Dark, EVGA Z790 Dark Kingpin

GPUs: GTX 275 (RIP), 2x GTX 560, GTX 570, 2x GTX 650 Ti Boost, GTX 980, Titan X (Maxwell), x2 HD 6850

Bench: Cooler Master Masterframe 700 (bench mode)

Cooling: Heatkiller IV Pro Pure Copper | Koolance GPU-210 | HWLabs L-Series 360 | XSPC EX360 | Aquacomputer D5 | Bitspower Water Tank Z-Multi 250 | Monsoon Free Center Compressions | Mayhems UltraClear | 9x Arctic P12 120mm PWM PST


56 minutes ago, Strobel said:

I am in the same boat - I buy what my budget can realistically buy, for my own use case in particular. I don't actually care what AMD or Nvidia do to customers, though; you don't have to buy from them? lol

I wanted the best-performing product, and the RTX 4090 on release was so absurdly far ahead of the last generation - especially at 4K, which I was already running - that it was worth it. I was confident that the RX 7900 XTX wouldn't close that gap, and the gamble paid off.

 

Overclocking the RTX 4090 is also a lot of fun, including the thermodynamics of it, and I doubt any RTX 4090 Ti is going to be substantially better than an overclocked RTX 4090. It was an upgrade from my RX 6900 XT that practically doubled the performance and wouldn't run into VRAM issues, since some games actually want over 16GB to max out texture settings.

Ryzen 7950x3D Direct Die NH-D15, CCD1 disabled

RTX 4090 @133%/+230/+500

Builder/Enthusiast/Overclocker since 2012  //  Professional IT since 2017


1 hour ago, Strobel said:

If all you play is Cyberpunk, then sure, brag about your superior RT - I agree. Sadly, the game sucks, so?

 

I actually really liked Cyberpunk - not for the story or plot or anything - but the gameplay was fun, definitely a unique take on the whole GTA-style thing.

 

I do also agree that RT is still pretty pointless, though. I might use it in like 10 years, but for now - just no.

Top-Tier Air-Cooled Gaming PC

Current Build Thread:

 


31 minutes ago, Sir Beregond said:

Yes we will. The 580 to the 780 Ti was 100%. The 980 Ti to the 1080 Ti was maybe 85-90%. The 4090 over Ampere is close to that. We will definitely have big uplifts when comparing comparable dies gen over gen. Comparing a GTX 680 (GK104) to a GTX 580 (GF110) is a recipe for bad data, since Nvidia shifted die tiering but not pricing or branding - and functionally did that again with Lovelace in the lower stack.

At the very least, you won't see an increase like that anytime soon.

The 3nm logic density increase is smaller than the 5nm transition's (7nm -> 5nm was 84%; 5nm -> 3nm is 70%), and SRAM (cache) and analog scaling are dead.

GA100 is already at the reticle limit of what a single chip can be.

You say die tiering shifted between Fermi and Kepler, and yes, you are correct, but that's because the xx0 chips got bigger for professional workloads. We don't even get xx0 chips as consumers anymore like we did back on Fermi.

With Hopper, Ampere, and Ada, Nvidia literally maxed out the field width of the lens on the lithography machines for their xx0 chips; there is literally no way to ship that at consumer price points.

Perhaps we'll see a doubling gen over gen for the xx2 chips with TSMC introducing GAAFET on N2, but even Samsung isn't getting a doubling out of it on their 3nm node.
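To put rough numbers on why dead SRAM/analog scaling matters so much, here's a back-of-the-envelope sketch. The 84% and 70% logic gains are from the post above; the 60/40 logic/SRAM area split is purely an assumed illustration, not a real die breakdown:

```python
# Back-of-the-envelope sketch of the scaling argument above.
logic_gain_7_to_5 = 1.84   # 7nm -> 5nm logic density gain (from the post)
logic_gain_5_to_3 = 1.70   # 5nm -> 3nm logic density gain (from the post)

# Pure logic compounds nicely across both shrinks:
print(logic_gain_7_to_5 * logic_gain_5_to_3)   # ~3.13x

# But if SRAM/analog no longer scale at all, the whole-die gain is far
# smaller. Assume (purely for illustration) a 60/40 logic/SRAM area split:
logic_frac, sram_frac = 0.6, 0.4
whole_die_gain = 1 / (logic_frac / logic_gain_5_to_3 + sram_frac / 1.0)
print(round(whole_die_gain, 2))                # ~1.33x, not 1.70x
```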


@Agall

 

Have you also noticed that the generational uplift this time around has a tapered shape?

 

Let me explain what I mean...

 

The uplift for the 4090 and the XTX is pretty huge, but the uplift for the 4070 and the upcoming 7800 XT is not so huge, and down at the lower-tier cards the uplift is quite small.

 

And that's not just rebranding - even when comparing things correctly and ignoring the stupid branding, the taper still happens, if not quite as extreme.

 

It really sucks, though, because those on a budget - the people who need the biggest increase in price-to-performance - are getting the smallest gains. It's so weird...

Top-Tier Air-Cooled Gaming PC

Current Build Thread:

 


2 hours ago, WallacEngineering said:

@Paul17

 

Ya, true - it's probably impossible to conceive of every little detail - but trust me, brand loyalty to Nvidia is certainly a thing. They have been so dominant for so long that it's impossible for it to be any other way. People feel there is no real alternative and that Nvidia is the only option. They are unwilling to change.

 

There is also some AMD loyalty; it's just not as severe. Most AMD fanboys are just ready to defend the "little guy", the "underdog", and spend their money trying to make AMD bigger.

 

I fall slightly into the category of supporting "the little guy". For example, I refuse to shop at Walmart and haven't set foot in one for what... it's probably been 11-ish years now...?

 

But I have owned a few Nvidia cards, and I would buy one again if AMD didn't offer me far better value for gaming. I am glad AMD has given me a reason to stick with them (for now), and hopefully the trend continues, but unfortunately none of us can predict the future.

 

You can't win every battle, so I pick my fights carefully.

That's part of the problem - I don't consider AMD to be "the little guy." Imho, there isn't one. AMD bought ATI - ATI was the little guy, but they got absorbed by a much larger company. AMD builds processors, they have AIB partners for their motherboards and GPUs, and they also manufacture reference cards, etc. I don't consider them a little guy. Compared to Nvidia? I suppose - but I think they simply make certain decisions: they offer more VRAM on more recent cards and maybe don't gimp their cards as much, but there are still lower-tier cards with low VRAM. And they choose to invest only sparingly in productivity tech - they're way behind on AI, too. They make various decisions hoping that gamers (who obviously don't care one iota about this stuff) will blindly buy their products.

 

I don't think I have brand loyalty - I read reviews, and yes, companies that try to deceive the public bother me - but I think they all do that; they're all bad. I try to pick the least of the worst... or something like that. I read that Gigabyte tries to avoid RMAs and won't stand by their product, ASUS got caught in their own scandal, ASRock has done shady things, MSI, and on and on... I believe a product - if it's going to compete - has to at least match "the other guy", and to make me/one pick them, they need to try and offer "something more." I don't think AMD cards match Nvidia - except for gaming - so gamers think they do - and they're even better on price - and that's fine, if that is your measuring stick. But for me, if they totally neglect other uses of the card, I count that against them. The price isn't reduced enough either, imho - but I do want to choose them over Nvidia - I did list my reasons. I did buy a (used) AMD card before.


2 hours ago, RevGAM said:

I think the theory of bias towards Nvidia is correct. I say this because many, many times I've seen people lean towards Nvidia because of its name, reputation, and long history leading the graphics card industry. When I see people recommending AMD cards to creators, I wince.

Isn't that anti-Nvidia bias, then?  Why would someone recommend AMD cards to creators? 🙂  

I wanted an AMD card - and productivity was part of what I'd do with the card, plus gaming - it would be an "all-purpose" card, with video editing and Blender as the main uses, and ML etc. would be a bonus. Before I knew anything about any of this, people steered me away from AMD cards and explained why Nvidia would be better suited. I also use Linux, and AMD provides open-source drivers - so I wanted an AMD card, and ideally AMD would be sufficient for creators - but they're really not. I'll admit it honestly: that made me bitter. But I still try to be objective and neutral - I have no horse in the race; I think they're both bad companies - but I like an "all-purpose" card that can do everything, not just excel in one area. I guess that is a minority view in a sea of gamers?

 


13 minutes ago, Paul17 said:

I don't think I have brand loyalty - I read reviews, and yes, companies that try to deceive the public bother me - but I think they all do that; they're all bad. I try to pick the least of the worst... or something like that.

 

Oh, AMD is certainly the underdog in the GPU industry - it's like 85% to 15% or so.

 

Surprisingly enough, they are also still the underdog of the CPU world in comparison to Intel, but at least they are far stronger on that front.

 

I would agree that all companies are bad these days and that I also choose the "least evil" or "least bad" option - I suppose that's a good way of putting it.

 

AMD's deception of the public is usually minor. This time it's introducing "XTX" branding and moving the entire product stack up a level for whatever reason, even though it makes them look worse: now the RTX 4070 will be compared to the RX 7800 XT instead of the RX 7700 XT, making AMD look even further behind than they actually are.

 

Definitely a weird call by AMD marketing there, but then compare that to the literal SCAM that is the RTX 4060 Ti and you can start to see why Nvidia's near-monopoly has empowered them to do whatever the hell they feel like these days.

 

So ya, all companies are dishonest. I'm fully aware that AMD is not our friend despite being the underdog, but you do have to admit that Nvidia is taking things a few steps TOO far these days. They have become complacent, without a doubt.

Top-Tier Air-Cooled Gaming PC

Current Build Thread:

 


16 minutes ago, Paul17 said:

Isn't that anti-Nvidia bias, then?  Why would someone recommend AMD cards to creators?

 

He said recommending AMD cards to creators makes him *wince* - a.k.a. cringe - a.k.a. it's a bad thing. He agrees with you.

Top-Tier Air-Cooled Gaming PC

Current Build Thread:

 


17 minutes ago, WallacEngineering said:

 

He said recommending AMD cards to creators makes him *wince* - a.k.a. cringe - a.k.a. it's a bad thing. He agrees with you.

I know - I'm just wondering in what context they would do that.


3 hours ago, starsmine said:

At the very least, you won't see an increase like that anytime soon.

The 3nm logic density increase is smaller than the 5nm transition's (7nm -> 5nm was 84%; 5nm -> 3nm is 70%), and SRAM (cache) and analog scaling are dead.

GA100 is already at the reticle limit of what a single chip can be.

You say die tiering shifted between Fermi and Kepler, and yes, you are correct, but that's because the xx0 chips got bigger for professional workloads. We don't even get xx0 chips as consumers anymore like we did back on Fermi.

With Hopper, Ampere, and Ada, Nvidia literally maxed out the field width of the lens on the lithography machines for their xx0 chips; there is literally no way to ship that at consumer price points.

Perhaps we'll see a doubling gen over gen for the xx2 chips with TSMC introducing GAAFET on N2, but even Samsung isn't getting a doubling out of it on their 3nm node.

I agree, but that also assumes the only generational improvement going from 5nm -> 3nm is the process improvement. Blackwell could very well be an architecture redesign too, which would further improve performance. In that sense, though, time will tell. Turing was a redesign of how Pascal's CUDA cores worked (in addition to the other additions around RT/Tensor), Ampere changed how the CUDA core worked again from Turing, and Lovelace is largely the same as Ampere. So if they fundamentally redesign the architecture for Blackwell on 3nm, we could still see massive gains.

Zen 3 Daily Rig (2022 - Present): AMD Ryzen 9 5900X + Optimus Foundations AM4 | Nvidia RTX 3080 Ti FE + Alphacool Eisblock 3080 FE | G.Skill Trident Z Neo 32GB DDR4-3600 (@3733 c14) | ASUS Crosshair VIII Dark Hero | 2x Samsung 970 Evo Plus 2TB | Crucial MX500 1TB | Corsair RM1000x | Lian Li O11 Dynamic | LG 48" C1 | EK Quantum Kinetic TBE 200 w/ D5 | HWLabs GTX360 and GTS360 | Bitspower True Brass 14mm | Corsair 14mm White PMMA | ModMyMods Mod Water Clear | 9x BeQuiet Silent Wings 3 120mm PWM High Speed | Aquacomputer Highflow NEXT | Aquacomputer Octo

 

Test Bench: 

CPUs: Intel Core 2 Duo E8400, Core i5-2400, Core i7-4790K, Core i9-10900K, Core i3-13100, Core i9-13900KS

Motherboards: ASUS Z97-Deluxe, EVGA Z490 Dark, EVGA Z790 Dark Kingpin

GPUs: GTX 275 (RIP), 2x GTX 560, GTX 570, 2x GTX 650 Ti Boost, GTX 980, Titan X (Maxwell), x2 HD 6850

Bench: Cooler Master Masterframe 700 (bench mode)

Cooling: Heatkiller IV Pro Pure Copper | Koolance GPU-210 | HWLabs L-Series 360 | XSPC EX360 | Aquacomputer D5 | Bitspower Water Tank Z-Multi 250 | Monsoon Free Center Compressions | Mayhems UltraClear | 9x Arctic P12 120mm PWM PST


13 hours ago, Paul17 said:

Isn't that anti-Nvidia bias, then?  Why would someone recommend AMD cards to creators? 🙂  

I wanted an AMD card - and productivity was part of what I'd do with the card, plus gaming - it would be an "all-purpose" card, with video editing and Blender as the main uses, and ML etc. would be a bonus. Before I knew anything about any of this, people steered me away from AMD cards and explained why Nvidia would be better suited. I also use Linux, and AMD provides open-source drivers - so I wanted an AMD card, and ideally AMD would be sufficient for creators - but they're really not. I'll admit it honestly: that made me bitter. But I still try to be objective and neutral - I have no horse in the race; I think they're both bad companies - but I like an "all-purpose" card that can do everything, not just excel in one area. I guess that is a minority view in a sea of gamers?

What is your point regarding bias? Yes, there is also anti-Nvidia bias, but that's mostly a combo of underdog syndrome and their prices & shady tactics. 

 

I've been reading everything that's been written, including yours, so I'm well aware of your stance. I would prefer an all-in-one card, too, but Nvidia isn't worth the price to me, and I don't even require it at this time.

 

Stop playing the marginalized customer. The majority of graphics card users either don't really need one and/or are gamers, AFAIK. That means creators of all types are less prominent, and capitalists often focus on the majority. So do politicians - or rather, they focus on the richest and the loudest. 😉 Your opinion is just as valid as anyone else's, so "poor me" is nonsense. Especially because you and I aren't the only people who prefer all-in-one cards.

I've been using computers since around 1978, started learning programming in 1980 on Apple IIs, started learning about hardware in 1990, ran a BBS from 1990-95, built my first Windows PC around 2000, taught myself malware removal starting in 2005 (also learned on Bleeping Computer), learned web dev starting in 2017, and I think I can fill a thimble with all that knowledge. 😉 I'm not an expert, which is why I keep investigating the answers that others give to try and improve my knowledge, so feel free to double-check the advice I give.

My phone's auto-correct is named Otto Rong.🤪😂


2 hours ago, RevGAM said:

What is your point regarding bias? Yes, there is also anti-Nvidia bias, but that's mostly a combo of underdog syndrome and their prices & shady tactics. 

 

I've been reading everything that's been written, including yours, so I'm well aware of your stance. I would prefer an all-in-one card, too, but Nvidia isn't worth the price to me, and I don't even require it at this time.

 

Stop playing the marginalized customer. The majority of graphics card users either don't really need one and/or are gamers, AFAIK. That means creators of all types are less prominent, and capitalists often focus on the majority. So do politicians - or rather, they focus on the richest and the loudest. 😉 Your opinion is just as valid as anyone else's, so "poor me" is nonsense. Especially because you and I aren't the only people who prefer all-in-one cards.

What's my point? I think it was fairly obvious. I guess there are a lot of AMD fanboys on here who can't discuss objectively.

The "majority" are kind of dumb - which you and the other guy, Wallace, both made the point for me - they go with whatever is popular, or the rep, or the "underdog" AMD; this has nothing to do with a good card or price. Yes, price is brought up, but a lot of it is sentiment and emotion.

 

I'm arguing about practicality and getting the most from your dollar. If I have a card that is good enough for gaming but can also be highly functional and provide good performance in productivity fields, then it will probably hold better value than a card that is strictly good for gaming - hence why Nvidia cards might hold more value (on the second-hand market) than AMD cards. There might be some bias or belief in "better drivers" or whatever, but what really suggests "more value" is the diversity of the card and what it can offer other customers, be they gamers or content creators, etc. So yes, this is a "marginalized customer" - your words. I think if AMD wants to ignore such people, they do so at their own peril. Obviously, Nvidia cards are still being purchased more often than AMD cards - so if you want to champion the system of capitalism, be careful. I think it would be better if both (hey, why not Intel GPUs, too?) gave both gamers and content creators good performance - and then you would have more competition, too, maybe?

 

AMD's anti-consumerism is being mentioned currently, ironically - both these companies are not admirable, and like I said, there's no underdog here.


Honestly, the performance is in line with what should be expected. 

The issue is simply the price. Pascal was $600 for a 1080 at launch (AIB cards were $550 almost instantly), while today the MSRP for a 4080 is $1,199. That's literally double. I understand things creep up a bit over time, but a 100% increase over 7 years is quite steep. Realistically, $650-700 should be the price, the 4090 should have been $999, and the Titan should have been $1,500. Maybe we'll see prices come down to earth next gen, but I'm not buying until I feel like I'm getting decent value for a top-tier GPU. At the prices I listed I would have bought a 4090; however, I'm sticking with a Pascal card until this sorts itself out.
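To put that creep in annual terms, here's a quick sketch using the post's own numbers; the per-year figure is just compound growth, nothing official:

```python
# Quick check of the price creep described above, using the post's own
# numbers ($600 GTX 1080 in 2016, $1,199 RTX 4080 MSRP, ~7 years apart).
launch_1080, launch_4080, years = 600, 1199, 7

total_increase = launch_4080 / launch_1080 - 1
annual_rate = (launch_4080 / launch_1080) ** (1 / years) - 1

print(f"total: {total_increase:.0%}")    # ~100% over 7 years
print(f"per year: {annual_rate:.1%}")    # ~10.4% per year, compounded
```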

The Vinyl Decal guy.

Celestial-Uprising  A Work In-Progress


41 minutes ago, MeDownYou said:

Honestly, the performance is in line with what should be expected. 

The issue is simply the price. Pascal was $600 for a 1080 at launch (AIB cards were $550 almost instantly), while today the MSRP for a 4080 is $1,199. That's literally double. I understand things creep up a bit over time, but a 100% increase over 7 years is quite steep. Realistically, $650-700 should be the price, the 4090 should have been $999, and the Titan should have been $1,500. Maybe we'll see prices come down to earth next gen, but I'm not buying until I feel like I'm getting decent value for a top-tier GPU. At the prices I listed I would have bought a 4090; however, I'm sticking with a Pascal card until this sorts itself out.

The 1080 Ti is hard to beat in 2023 without spending $600+. I have plenty of friends who'd benefit substantially from upgrading their era-equivalent (or older) CPUs, especially in the games they play, but we're talking largely pre-built-type gamers. Anyone else capable of building a new platform and doing a swap or a GPU transplant to a new rig is in for quite the boost in performance.

 

I'll generally suggest that anyone thinking of buying a new PC without wildly high expectations or budget should transplant the GPU from their old system if it's a 1070 Ti or higher. If they're still unhappy with the performance, they can simply buy a new card after.

Ryzen 7950x3D Direct Die NH-D15, CCD1 disabled

RTX 4090 @133%/+230/+500

Builder/Enthusiast/Overclocker since 2012  //  Professional IT since 2017


2 minutes ago, Agall said:

The 1080 Ti is hard to beat in 2023 without spending $600+. I have plenty of friends who'd benefit substantially from upgrading their era-equivalent (or older) CPUs, especially in the games they play, but we're talking largely pre-built-type gamers. Anyone else capable of building a new platform and doing a swap or a GPU transplant to a new rig is in for quite the boost in performance.

 

I'll generally suggest that anyone thinking of buying a new PC without wildly high expectations or budget should transplant the GPU from their old system if it's a 1070 Ti or higher. If they're still unhappy with the performance, they can simply buy a new card after.

Yep - I was using a launch Ryzen 1600 with the segfault issue and swapped to a 5600X last year; it was a huge upgrade compared to a GPU upgrade. I'd like to upgrade the CPU again once either Intel 14th gen or the Ryzen 8000 series comes out with a better memory controller.

The Vinyl Decal guy.

Celestial-Uprising  A Work In-Progress


7 minutes ago, MeDownYou said:

Yep - I was using a launch Ryzen 1600 with the segfault issue and swapped to a 5600X last year; it was a huge upgrade compared to a GPU upgrade. I'd like to upgrade the CPU again once either Intel 14th gen or the Ryzen 8000 series comes out with a better memory controller.

Mostly on the Intel side, but here's a simple yet somewhat alarming representation of how far behind older CPUs are.

 

Cinebench R23 single threaded scores:

 

4790k- 1069 (baseline)

8700k- 1209 (13.1% increase in 4 generations)

11700k- 1569 (29.8% increase in 3 generations)

12700k- 1939 (23.6% increase in a single generation)

13700k- 2126 (9.6% increase)

 

It goes from a relatively linear increase to a substantial jump. Note that the 4790K versus the 8700K is only a 1.5x difference in L3 cache, compared to the 12700K's 3.13x relative to the 4790K. That's not even considering the L2-cache-per-core increases across those generations, which may also factor into Cinebench R23's single-threaded scores.
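For reference, those percentages fall straight out of the listed scores; here's a quick sketch that reproduces them:

```python
# Reproducing the gen-over-gen percentages from the Cinebench R23
# single-threaded scores listed above.
scores = [
    ("4790K", 1069),
    ("8700K", 1209),
    ("11700K", 1569),
    ("12700K", 1939),
    ("13700K", 2126),
]

for (prev_name, prev), (name, score) in zip(scores, scores[1:]):
    print(f"{prev_name} -> {name}: {score / prev - 1:+.1%}")
# 4790K -> 8700K:   +13.1%
# 8700K -> 11700K:  +29.8%
# 11700K -> 12700K: +23.6%
# 12700K -> 13700K: +9.6%
```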

 

5800x- 1619

5800x3D- 1475

(Cinebench R23 does not seem to care about extra L3 cache.)

 

7950x- 2072

 

I grabbed this for evidence when discussing single threaded performance for a lot of games, especially MMO/multiplayer games. 

 

The conclusion of this thread is interesting when you consider that there's at least one scenario - a simulated single-player experience in this game - where the 4790k and the 7950x3D are both limited to 500 fps at 4K, yet at 1080p in the same environment the 7950x3D jumps to 1200 fps and the 4790k doesn't move.

 

Just some of the evidence I use to justify my conclusion about potentially deferring a GPU upgrade. I've got an Intel A380 I want to test in my main rig to get more data on the topic, since it's a literal $120 GPU and preliminary testing in my 3950x rig has already surprised me.

Ryzen 7950x3D Direct Die NH-D15, CCD1 disabled

RTX 4090 @133%/+230/+500

Builder/Enthusiast/Overclocker since 2012  //  Professional IT since 2017


1 hour ago, MeDownYou said:

Honestly, the performance is in line with what should be expected. 

How so? The 4090, maybe... maybe even the 4080 (price aside), but the rest of the stack? No way.

 

The 3070 was about 2080 Ti performance. The 2070 was about 1080 Ti performance. The 1070 was about 980 Ti performance. Etc. The 4070? Barely meeting a 3080. So the 4070 is higher in price AND doesn't perform as well as it should for its class. The 4070 Ti should be the "4070", as it meets the performance characteristic of the base non-Ti "70" cards of previous gens: matching the previous-gen flagship.

 

Now let's look at the 4060 Ti. This is just an embarrassment of a card for the price, and it has no business even being a 60-class card because of both the die used and its performance characteristics.

 

And then let's go back and look at the 4070 Ti. It absolutely matches the performance characteristics of a "70" card - notice, not a Ti. Yet it's $800? Absurd.

 

No, this gen absolutely does not match what should be expected performance-wise for any given "class" or "branding" of card below the very top end.

Zen 3 Daily Rig (2022 - Present): AMD Ryzen 9 5900X + Optimus Foundations AM4 | Nvidia RTX 3080 Ti FE + Alphacool Eisblock 3080 FE | G.Skill Trident Z Neo 32GB DDR4-3600 (@3733 c14) | ASUS Crosshair VIII Dark Hero | 2x Samsung 970 Evo Plus 2TB | Crucial MX500 1TB | Corsair RM1000x | Lian Li O11 Dynamic | LG 48" C1 | EK Quantum Kinetic TBE 200 w/ D5 | HWLabs GTX360 and GTS360 | Bitspower True Brass 14mm | Corsair 14mm White PMMA | ModMyMods Mod Water Clear | 9x BeQuiet Silent Wings 3 120mm PWM High Speed | Aquacomputer Highflow NEXT | Aquacomputer Octo

 

Test Bench: 

CPUs: Intel Core 2 Duo E8400, Core i5-2400, Core i7-4790K, Core i9-10900K, Core i3-13100, Core i9-13900KS

Motherboards: ASUS Z97-Deluxe, EVGA Z490 Dark, EVGA Z790 Dark Kingpin

GPUs: GTX 275 (RIP), 2x GTX 560, GTX 570, 2x GTX 650 Ti Boost, GTX 980, Titan X (Maxwell), x2 HD 6850

Bench: Cooler Master Masterframe 700 (bench mode)

Cooling: Heatkiller IV Pro Pure Copper | Koolance GPU-210 | HWLabs L-Series 360 | XSPC EX360 | Aquacomputer D5 | Bitspower Water Tank Z-Multi 250 | Monsoon Free Center Compressions | Mayhems UltraClear | 9x Arctic P12 120mm PWM PST

