What’s the best upgrade from 1080 11GB?

Tkeyfox

Hi, I am wondering what the best upgrade from a 1080 11GB would be?

New or used doesn’t matter to me.

Costing somewhere around $200-400.

 

Current system specs 

Intel i7 8700

16GB RAM

Corsair RM850 PSU

 

I am thankful for all recommendations. Thanks.


4 minutes ago, Tkeyfox said:


Assuming you're talking about a GTX 1080 Ti 11GB, it's a hard GPU to upgrade from even in 2023. You won't be able to upgrade it for $400 or less, though, since a 1080 Ti is comparable to an RTX 3060 Ti/4060 Ti, which are $400 cards.

 

I'd put that budget towards a new CPU + motherboard instead, something like a 13500/13600. Depending on the games you play, you could get a substantial performance boost from a newer CPU.

Ryzen 7950x3D Direct Die NH-D15

RTX 4090 @133%/+230/+500

Builder/Enthusiast/Overclocker since 2012  //  Professional since 2017


12 minutes ago, Agall said:


If they are willing to go used and are open to AMD, then there are plenty of good deals on used Radeon 6000 series GPUs that would be a decent upgrade over the 1080 Ti.


12 minutes ago, Agall said:

You won't be able to upgrade it for $400 or less, though

This isn't really true. The 6750 XT is an upgrade, maybe not quite enough to justify the purchase, but it's definitely an upgrade, and they're quite cheap used. Even a 6800 XT would be around $400 used, and that's a solid upgrade.


Also, there are some RTX 3080s that you can find for around $400, which is much faster than a 1080 Ti.


For new cards, a $400 budget could get you a 6750 XT, 3060 Ti, or 4060 Ti, but I'd say none of those is a worthwhile upgrade for the money. They come with some drawbacks, like less VRAM (in the case of the Nvidia options), and performance is largely not going to be a sizable increase over the 1080 Ti.

 

Used, you could find a 2080 Ti for around $350, give or take. A 3080 is probably going to be more like $450 from what I've seen, though you might get lucky and find one around $400. On the AMD side, you could probably find a 6800 or 6800 XT in the $400s that would give a good uplift in performance.

 

This was just from perusing eBay for 5 minutes. You can probably find better used deals locally or in other venues, which might extend your options.

Zen 3 Daily Rig (2022 - Present): AMD Ryzen 9 5900X + Optimus Foundations AM4 | Nvidia RTX 3080 Ti FE + Alphacool Eisblock 3080 FE | G.Skill Trident Z Neo 32GB DDR4-3600 (@3733 c14) | ASUS Crosshair VIII Dark Hero | 2x Samsung 970 Evo Plus 2TB | Crucial MX500 1TB | Corsair RM1000x | Lian Li O11 Dynamic | LG 48" C1 | EK Quantum Kinetic TBE 200 w/ D5 | HWLabs GTX360 and GTS360 | Bitspower True Brass 14mm | Corsair 14mm White PMMA | ModMyMods Mod Water Clear | 9x BeQuiet Silent Wings 3 120mm PWM High Speed | Aquacomputer Highflow NEXT | Aquacomputer Octo

 

Test Bench: 

CPUs: Intel Core 2 Duo E8400, Core i5-2400, Core i7-4790K, Core i9-10900K, Core i3-13100, Core i9-13900KS

Motherboards: ASUS Z97-Deluxe, EVGA Z490 Dark, EVGA Z790 Dark Kingpin

GPUs: GTX 275 (RIP), 2x GTX 560, GTX 570, 2x GTX 650 Ti Boost, GTX 980, Titan X (Maxwell), x2 HD 6850

Bench: Cooler Master Masterframe 700 (bench mode)

Cooling: Heatkiller IV Pro Pure Copper | Koolance GPU-210 | HWLabs L-Series 360 | XSPC EX360 | Aquacomputer D5 | Bitspower Water Tank Z-Multi 250 | Monsoon Free Center Compressions | Mayhems UltraClear | 9x Arctic P12 120mm PWM PST


1 minute ago, Brooksie359 said:


True, although that's mostly local; you might find something at that price on Hardwareswap.

Just now, Sir Beregond said:

Used, you could find a 2080 Ti for around $350, give or take.

That's a ripoff for a 2080 Ti; they shouldn't be over $300 shipped used, ideally closer to $275. And the 6750 XT would be a better buy at that price range anyway. UserBenchmark is just about the only site that puts the 2080 Ti over the 6750 XT, and everyone knows how Nvidia- and Intel-biased they are.


4 minutes ago, Brooksie359 said:


Single-digit to barely double-digit percentage upgrades for $400 on a 5+ year old CPU. OP is more likely to see a performance boost out of a new CPU than a new GPU at that point, especially if we're talking $200-$400.

 

If we were talking $600, then sure, but at that price point we're talking about nigh-equivalent performance. Anything beyond that used is going to be limited by an i7-8700, to the point where the OP is best off upgrading the CPU, in my opinion, especially if they play anything but single-player games with very well optimized multithreading.



6 minutes ago, NF-A12x25 said:


I agree, I wouldn't pay that. I did edit my post to say that was just what I saw perusing eBay super quickly. You can probably find much better deals locally and in other venues. It wouldn't be my first choice.



17 minutes ago, Agall said:


What are you even talking about? Even the 2080 Ti was 20% faster than the 1080 Ti, and now you can easily find cards better than the 2080 Ti for $400 or under. Also, the 8700K is a decent CPU, and unless you are playing at 1080p, I don't see it having much of an issue keeping up with a $400 GPU.


32 minutes ago, Brooksie359 said:


 

Take whatever CPU, like the 8700, and place it roughly in between the 4790K and the 7950X3D. In some cases it doesn't matter; in others it matters a heck of a lot. CPU benchmarks are largely academic and designed to show the most repeatable gap; they largely don't reflect a practical use case.

 

I created a thread as a demonstration of that, in a scenario where most people would say "it doesn't matter": 4K ultra. Yes, there was one scenario where it didn't matter, which was a unicorn circumstance where both the 4790K and the RTX 4090 were limited to 500 fps, as demonstrated when I changed the resolution to 1080p. Some might say "X fps is fine for them", and that's fine, if they're talking about that specific game.
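
To make the logic explicit, here's a toy sketch of that bottleneck model; the function and the numbers are purely illustrative, not benchmarks:

```python
# Toy model: the framerate you observe is roughly capped by whichever
# component runs out first. Illustrative numbers only, not measurements.
def effective_fps(cpu_cap: float, gpu_cap: float) -> float:
    """Observed fps is about the lower of the CPU-bound and GPU-bound caps."""
    return min(cpu_cap, gpu_cap)

# 4K ultra: even a 4090 can be the limit, so the CPU "doesn't matter"...
print(effective_fps(cpu_cap=500, gpu_cap=500))  # 500 -- both capped alike
# ...until the resolution drops to 1080p, the GPU cap rises, and the
# older CPU becomes the visible bottleneck.
print(effective_fps(cpu_cap=500, gpu_cap=900))  # 500 -- now CPU-bound
```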

 

@Tkeyfox What games do you actually play, and at what resolution? That's going to be a far better way to gauge not only whether you should buy a new graphics card, but which graphics card, or new CPU, to get, if any.

 



15 minutes ago, Agall said:

 


 

You are talking about a 4090, which even at 4K runs very high fps, especially in the garbage-tier-graphics games you are playing, to the point that it might as well be 1080p in a modern title. Also, the 8700K is vastly different from a 4790K, mostly because the huge issues with the 4790K are its 4 cores and its being on DDR3. And again, don't compare the 4090 to a $400 GPU. I doubt upgrading the CPU would give a higher performance increase than getting a faster GPU would.


14 hours ago, Brooksie359 said:


You're missing the point: an RTX 4090 eliminates any current GPU bottleneck. The scenario where the framerate is equal at 500 fps is a GPU-limited scenario, which can occur even with a 4790K at 500 fps. A 4790K being able to reach 500 fps in any game shows that it's still a capable CPU, but it obviously has its limitations.

 

This is why I'll specify "depending on the games you play" with any CPU or GPU upgrade recommendation. In some games and in some (or most) scenarios, an RTX 4090 is the bottleneck; at worst, you're losing performance well beyond your expectations.

 

I feel like you think too highly of the 8700, a CPU that was really only better than the previous generation because it upped the core count. Intel's 6th through 11th gens had relatively mediocre increases in performance, thanks to a simple lack of competition. The last two generations have been absurd in comparison, which is the only reason this is even an argument in 2023.

 

GPU rasterization performance, on the other hand, has fewer limitations for gaming. There's a certain expectation of performance for any graphics card, which can be realized with a new CPU; in contrast, a new GPU will be limited, sometimes substantially, by an older CPU. This is especially true in any game that involves CPU draw calls for player assets, which I generalize to MMO/multiplayer games.

 

The 'red pill' on this is simply looking at the difference in cache between the 8700K, 10900K, and 13900K. Intel has practically doubled the cache of its CPUs multiple times, with 10th to 13th gen being a substantial gap. On top of that there are IPC/architectural improvements, but 3D V-Cache has shown how much CPU cache can matter in 1:1 comparisons where two CPUs are almost identical except one has 3x the L3 cache.
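
To put rough numbers on that cache growth, a quick sketch; the L3 sizes are from public spec sheets and should be treated as approximate:

```python
# Rough illustration of the L3 growth described above.
# Sizes in MB, from public spec sheets; approximate, not authoritative.
l3_mb = {"8700K": 12, "10900K": 20, "13900K": 36}
base = l3_mb["8700K"]
for chip, mb in l3_mb.items():
    print(f"{chip}: {mb} MB L3 ({mb / base:.2f}x the 8700K)")
# 8700K: 12 MB (1.00x), 10900K: 20 MB (1.67x), 13900K: 36 MB (3.00x)
```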



43 minutes ago, Agall said:


If you don't understand the difference between a 6-core, high-frequency chip with mature DDR4 support and an old 4th-gen i7, then I don't know what to tell you. You said yourself that the issue with MMOs is mostly the amount of things that need to be calculated, and that is where 6 vs. 4 cores makes quite a big difference. I doubt you would see any significant gains in performance by upgrading the CPU rather than the GPU. Also, I will be honest, most MMOs today are garbage games, and I wouldn't waste my time on any of them.

Link to comment
Share on other sites

Link to post
Share on other sites

44 minutes ago, Brooksie359 said:


I think you're missing the entire point of the discussion. Core count doesn't matter in those sorts of games, simply because they're largely limited by single-threaded performance.

 

Here's a quantitative representation of this with Cinebench R23 single-threaded scores:

 

4790K - 1069 (baseline)

8700K - 1209 (13.1% increase in 4 generations)

11700K - 1569 (29.8% increase in 3 generations)

12700K - 1939 (23.6% increase in a single generation)

13700K - 2126 (9.6% increase)

 

It goes from a relatively linear increase to a substantial jump. Note that the 4790K versus the 8700K is only a 1.5x difference in L3 cache, compared to the 12700K's 3.13x relative to the 4790K. That's not even considering the L2-cache-per-core increases across those generations, though those may factor into Cinebench R23's single-threaded scores.

 

5800X - 1619

5800X3D - 1475

(Cinebench R23 does not seem to care about extra L3 cache)

 

Clearly there's been a non-linear increase in performance within the last couple of generations, even gauging by this metric alone.

 

Note that I pulled those specific CPUs to line up with the generational gaps above. Cinebench R23 also doesn't care much about the amount of cache, since it's primarily just an IPC/frequency benchmark.

 

Games like MMOs, with highly latency-sensitive CPU draw calls, in theory benefit substantially from being able to perform more operations locally in CPU cache rather than spending extra time fetching data from system RAM, as shown by the 5800X versus 5800X3D performance gaps, especially in MMOs.

 

Cinebench R23 scores were all taken from cpu-monkey.com.
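
If you want to check those percentages yourself, here's a quick script over the scores quoted above:

```python
# Sanity-check the generational gaps quoted above (Cinebench R23
# single-core scores as listed, from cpu-monkey.com).
scores = {"4790K": 1069, "8700K": 1209, "11700K": 1569,
          "12700K": 1939, "13700K": 2126}
chips = list(scores)
for prev, cur in zip(chips, chips[1:]):
    gain = (scores[cur] / scores[prev] - 1) * 100
    print(f"{prev} -> {cur}: {gain:+.1f}%")
# 4790K -> 8700K: +13.1%    8700K -> 11700K: +29.8%
# 11700K -> 12700K: +23.6%  12700K -> 13700K: +9.6%
```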



I think you are doing great GPU-wise!

 

Wouldn't it be better to upgrade your RAM, though?

 

I have noticed that some games can now use more than 16 GB, whether or not you run other programs in parallel.

 

There are several examples available on the Internet, but an unexpected one for me was Apex Legends.

 

Upgrading to either 24 GB (2x8 + 1x8, which mostly keeps dual-channel operation) or 32 GB (2x8 + 2x8) would be reasonable.


1 hour ago, Agall said:


Most MMOs have garbage netcode that wouldn't benefit very much from lower latency. I would even go so far as to say that MMOs are probably the games least affected by latency. Other games are either single-player, where there is no server limiting how much that latency matters, so you will actually notice very low latency while playing, or they are competitive games built with netcode in mind, with very good servers where low latency makes a huge difference, especially because most competitive games have very low TTK, so the person who hits first has a huge advantage and will win the 1v1 nine times out of ten. Compare that to most MMOs, where it takes quite a long time to finish off the other person because of the much larger health pools and other abilities, and you can see why latency in MMOs isn't nearly as important as in other games. Also, the performance increase between CPU generations doesn't mean anything if you have a GPU that can't take advantage of it. The 8700K is a fine CPU and can easily benefit from a faster GPU, while I doubt the 1080 Ti would see a significant uplift from a better CPU. I feel like you must only play MMOs, because almost everything you say revolves around the assumption that they are going to be exclusively playing MMOs, and most MMOs out today are bad anyway, so I'm not sure why anyone would bother wasting their time.


44 minutes ago, Brooksie359 said:


I played WoW for 15 years, up until about 10 months ago, across dozens of different hardware configurations, ending on a 5800X3D. I've played Planetside 2 and Warframe for the last decade with the same variety of hardware, minus my original Dell Vostro laptop with its GT 310M.

 

When I played WoW I was regularly playing at 2400+ rating in multiple brackets and 2700+ in 3v3s, where the difference in network latency can matter for kicking spells. I even ran a Razer Blade and Razer Core with a GTX 1080 for over a year, along with two other Thunderbolt-capable laptops. That setup demonstrated a ridiculous amount of input latency, which I was able to measure at ~50 ms. Not sure how I put up with that for over a year, but I did, and I realized it was what made it difficult to properly kick abilities while playing feral druid in arenas (at that time I was only playing at about 2200-2400 rating, so top 1-3%).

 

I won't claim to be an expert on system performance for MMOs, but I've got decades of experience in various MMOs with an unreasonable variety of hardware configurations, while also playing them in at least the top 1%, sometimes top 0.1%, of players, where latency and framerates can matter. I'm also the type who remembers the performance of a game like Planetside 2 on my old Asus gaming laptop with an i7-3610QM and GTX 670M back in 2013. And I've done my own testing with the 4790K and 7950X3D, which you likely saw above; that required fairly detailed knowledge of relative performance across over a dozen scenarios, developed over a decade of playing the game.

 

I can tell you, though, that you're wrong in your assumptions about system performance in MMOs. In a game like WoW, the difference between an 8700K and a 5800X3D, as examples, could be 30 fps versus 60+ fps in a 40-man raid, which is a substantially better experience and allows for better execution of play. That really wasn't possible until the 5800X3D, Intel 12th/13th gen, or Ryzen 7000 series, with the substantial performance gains those generations brought.



28 minutes ago, Agall said:


I never said it didn't matter; what I said was that it matters much less in MMOs, especially because PvP in MMOs is a small subset of an already niche MMO audience. While it may seem like a lot of people play MMOs based on some of their player numbers, the fact is that MMO players are a small minority of gamers, especially compared to, say, Fortnite, Warzone, Valorant, and other competitive PvP games. Those PvP games have better netcode and benefit way more from lower latency. When I went to 240 Hz back when I played a ton of Overwatch, I went from Gold to Masters in a couple of months, which is an incredible gap. Granted, with an 8700K you can easily hit 240 fps, so it's not really an issue; honestly, most competitive games run well even on an 8700K. As for MMOs, I don't play them, because honestly, why would I play an MMO for PvP when there are plenty of better games designed with PvP in mind rather than as an afterthought, like it is in pretty much all MMOs? Also, calling Planetside 2 an MMO is a bit of a stretch.


28 minutes ago, Brooksie359 said:


I prefer not to marginalize a community of gamers by making non-specific recommendations, which is why I always base my hardware recommendations for gaming systems on what games the user actually plays.

 

Someone who primarily plays MMOs is better off chopping even $100 off their GPU and putting it towards the best CPU within reason, something like a 7800X3D or 13700K (depending on their preference). Even then it depends entirely on the game. Even in Warframe, I tested CCD0 vs. CCD1 of my 7950X3D by running it as 8c/16t configured in the UEFI, and performance was noticeably smoother on the 3D V-Cache CCD (CCD0).

 

I noticed the same thing playing Rainbow Six Siege when I upgraded to a 4K 240 Hz display after being on 144 Hz all the way back from 2014, especially when you can overdrive and get the smoothest experience possible. Even now, playing Diablo 4 at 4K ultra with DLSS Quality and frame generation, the experience is noticeably smoother than with DLSS, FG, and BOOST disabled. I've done a minimal amount of testing in that game so far, on my main rig really only testing various configurations of those features.

 

It's all more nuanced than most people let on; my goal in this discussion is to open the conversation up to more than just generalizations based on arguably academic benchmarks run at 720p low settings when it comes to CPU performance in various games.



1 hour ago, Agall said:


I tend to give advice based on what the most common gamer would want, not on a very niche subset of the gaming community, without any indication that OP is even a part of that community. If they said they were, fair enough, but they didn't, so assuming they are part of this small minority is kinda weird, to be honest. Also, frame generation adds latency, which is not good when you are as concerned with latency as you said you were.


16 hours ago, Brooksie359 said:


I still wouldn't suggest that upgrading from an 8700K and 1080 Ti to a newer CPU is niche, or only an upgrade for MMO/multiplayer games. It's a disproportionately bigger upgrade for those games, but it's still substantial for almost any game, especially at 1080p/1440p, and it can also be substantial at 4K depending on the game.

 

It's really no different from getting an RTX 3060 Ti in 2023 while keeping an equivalently dated Intel 11th-gen or Ryzen 2000-or-earlier CPU and wondering why the performance isn't as good.

 

Have you tried frame generation btw? What's your experience with it personally?



21 minutes ago, Agall said:


You do realize that trying frame generation doesn't change the fact that it introduces latency. I don't bother with frame generation because I play at 1080p and already get 300 fps in most of the games I play, which are primarily esports titles. Also, even the 8700K is significantly faster than Ryzen 2000 series; Ryzen 3000 was the first generation that caught up with Intel in CPU performance. And upgrading the CPU wouldn't do much in most games, whereas a GPU upgrade would make a big difference, especially by allowing higher graphics settings.


14 minutes ago, Brooksie359 said:


Oh, so you haven't bothered to try the feature whose usability you're discussing? Generalizing about 'an increase in latency' doesn't quantify the feature at all. I'm still not a fan of it, but at least I'm testing its usability before discussing it so assertively. It does have its problems: in Witcher 3 it practically broke the game, and in Diablo 4, so far, it only causes occasional freezes (although the game still works after a 1-2 second freeze). Overall, it only seems to increase my framerate from ~210 to ~240.

 

Single-threaded R23 scores:

 

2700X - 1071

7700X - 2010

8700K - 1209

13700K - 2126

 

That doesn't look 'significantly faster than the 2000 series' to me when you put it relative to the newest generation. I'll agree that Ryzen 3000 was the first Ryzen generation competitive for gaming PCs, since that was when I finally upgraded from my 4790K to a 3950X while still on a GTX 1080. That 4790K started with a GTX 980, then moved to a GTX Titan X (Maxwell), and then I had that GTX 1080 until the RTX 3000 series.
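
Normalizing the same scores against the newest chip listed makes the point clearer:

```python
# Same R23 single-core scores as above, expressed relative to the 13700K.
scores = {"2700X": 1071, "7700X": 2010, "8700K": 1209, "13700K": 2126}
newest = scores["13700K"]
for chip, s in scores.items():
    print(f"{chip}: {s} ({s / newest:.0%} of a 13700K)")
# 2700X ~50%, 8700K ~57%, 7700X ~95% -- the 8700K's ~13% lead over the
# 2700X looks small next to the gap to current chips.
```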


