
CPU for Radeon RX 7900XT

WuWkie
Solved by Agall.

Hi, I'm planning to upgrade my current GPU to an RX 7900 XT and I was wondering if I should upgrade my CPU first or if it'll work fine. I'm currently using a Ryzen 7 3800X.


To get the absolute best out of it, I would upgrade to a 5800X3D. I also had a 3800X, and when I got my 3090 Ti (pretty cheap, for those who would question it) I upgraded to the 5800X3D, and it has been a huge improvement. It just needs a BIOS update and voila.


6 minutes ago, WuWkie said:

Hi, I'm planning to upgrade my current GPU to an RX 7900 XT and I was wondering if I should upgrade my CPU first or if it'll work fine. I'm currently using a Ryzen 7 3800X.

Either buy a 5800X3D if you already have a good motherboard+RAM combo for your 3800X, or jump onto AM5.


For AM5, I would either wait a week for the 7800X3D to come out at $450 or just buy an R5 7600(X) at half the price and wait for the 8800X3D. Resale on the R5 7600(X) should be good for a long time, since it's also a capable basic workstation CPU thanks to having an iGPU this time around.

 

Even at 4K, a 3800X would bottleneck a 7900 XTX in most games people actually play by a surprising amount.

Ryzen 7950x3D PBO +200MHz / -15mV curve CPPC in 'prefer cache'

RTX 4090 @133%/+230/+1000

Builder/Enthusiast/Overclocker since 2012  //  Professional since 2017


5 minutes ago, WuWkie said:

if I should upgrade my CPU first or if it'll work fine,

A very slow CPU still works with a very fast GPU; the only question is how much the slow CPU will bottleneck the fast GPU.
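
For illustration, here is a minimal sketch of that mental model in Python (a deliberate simplification with assumed numbers, not a measurement):

```python
# Toy model of the bottleneck idea: a frame only finishes as fast as the
# slowest stage of the pipeline, so the effective framerate is roughly
# min(CPU rate, GPU rate). All numbers below are assumed for illustration.

def effective_fps(cpu_fps: float, gpu_fps: float) -> float:
    """Approximate framerate when the CPU and GPU work as a pipeline."""
    return min(cpu_fps, gpu_fps)

def gpu_headroom_wasted(cpu_fps: float, gpu_fps: float) -> float:
    """Fraction of the GPU's potential left unused by a slower CPU."""
    return max(0.0, 1.0 - cpu_fps / gpu_fps)

# Example: a CPU that can prepare 90 frames/s feeding a GPU that could
# render 160 frames/s yields ~90 fps and leaves ~44% of the GPU idle.
print(effective_fps(90, 160))                        # 90
print(f"{gpu_headroom_wasted(90, 160):.0%} wasted")  # 44% wasted
```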

Note: Users receive notifications after Mentions & Quotes. 

Feel free to ask any questions regarding my comments/build lists. I know a lot about PCs but not everything.

PC:

Ryzen 5 5600 | 16GB DDR4 3200MHz | B450 | GTX 1080 Ti

PCs I used before:

Pentium G4500 | 4GB/8GB DDR4 2133MHz | H110 | GTX 1050

Ryzen 3 1200 3.5GHz (OC: 4GHz) | 8GB DDR4 2133MHz / 16GB 3200MHz | B450 | GTX 1050

Ryzen 3 1200 3.5GHz | 16GB 3200MHz | B450 | GTX 1080 Ti


Upgrade the CPU only if you're getting insufficient performance. In general, AMD cards don't struggle in CPU-limited scenarios nearly as much as NVIDIA cards, whose driver overhead can cut you down by 15-20%.


Just now, podkall said:

A very slow CPU still works with a very fast GPU; the only question is how much the slow CPU will bottleneck the fast GPU.

It will, and it does, more than most people would expect, especially in the games most people actually play. The suite of games used for academic CPU benchmarking doesn't translate to every game, especially MMO/multiplayer games, where draw calls for player assets are likely the thing the GPU is waiting for.
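
As a back-of-the-envelope illustration of the draw-call point (all numbers here are assumed, purely to show the shape of the problem):

```python
# Toy model of why MMO hubs get CPU-bound (assumed numbers): every visible
# player adds draw calls that the render thread must submit each frame,
# so CPU frame time grows with the crowd while GPU load barely changes.

BASE_MS = 4.0         # CPU frame time for an empty scene (assumed)
MS_PER_PLAYER = 0.15  # CPU cost of one player's draw calls (assumed)

def cpu_limited_fps(visible_players: int) -> float:
    frame_ms = BASE_MS + MS_PER_PLAYER * visible_players
    return 1000.0 / frame_ms

for n in (0, 40, 120):  # empty world, a raid group, a packed city
    print(f"{n:>3} players -> ~{cpu_limited_fps(n):.0f} fps (CPU-bound)")
# 0 -> 250 fps, 40 -> 100 fps, 120 -> ~45 fps: the GPU never got busier.
```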

Ryzen 7950x3D PBO +200MHz / -15mV curve CPPC in 'prefer cache'

RTX 4090 @133%/+230/+1000

Builder/Enthusiast/Overclocker since 2012  //  Professional since 2017


5 minutes ago, WereCat said:

Upgrade the CPU only if you're getting insufficient performance. In general, AMD cards don't struggle in CPU-limited scenarios nearly as much as NVIDIA cards, whose driver overhead can cut you down by 15-20%.

A 7900 XTX isn't going to speed up how fast an older-generation CPU can issue draw calls for player assets in a game like WoW. Most people are better off getting a new CPU than a new GPU in 2023, if they've already got at least a GTX 1070 Ti's worth of performance, assuming they're the typical gamer who mostly plays poorly optimized decade-old games, like most PC gamers.

 

Steam Charts - Tracking What's Played

Ryzen 7950x3D PBO +200MHz / -15mV curve CPPC in 'prefer cache'

RTX 4090 @133%/+230/+1000

Builder/Enthusiast/Overclocker since 2012  //  Professional since 2017


16 minutes ago, WuWkie said:

Hi, I'm planning to upgrade my current GPU to an RX 7900 XT and I was wondering if I should upgrade my CPU first or if it'll work fine. I'm currently using a Ryzen 7 3800X.

It'll likely be fine, especially if you're going to play more graphically intensive games at high resolutions. The CPU is secondary; as long as it hits a good enough framerate and the game doesn't feel sluggish, you should be fine.

 

The 5800X3D is a good option if you want to change only the CPU without buying anything else, but I would only consider it after a month or so of using what you already have, once you've seen for yourself whether it's enough.

Personal Desktop:

CPU: Intel Core i7 10700K @ 5GHz |~| Cooling: bq! Dark Rock Pro 4 |~| MOBO: Gigabyte Z490 UD ATX |~| RAM: 16GB DDR4 3333MHz CL16 G.Skill Trident Z |~| GPU: RX 6900XT Sapphire Nitro+ |~| PSU: Corsair TX650M 80Plus Gold |~| Boot: SSD WD Green M.2 2280 240GB |~| Storage: 1x 3TB HDD 7200rpm Seagate Barracuda + SanDisk Ultra 3D 1TB |~| Case: Fractal Design Meshify C Mini |~| Display: Toshiba UL7A 4K/60Hz |~| OS: Windows 10 Pro.

Luna, the temporary Desktop:

CPU: AMD R9 7950X |~| Cooling: bq! Dark Rock Pro 4 |~| MOBO: Gigabyte Aorus Master |~| RAM: 32GB Kingston HyperX |~| GPU: AMD Radeon RX 7900XTX (Reference) |~| PSU: Corsair HX1000 80+ Platinum |~| Windows Boot Drive: 2x 512GB (1TB total) Plextor SATA SSD (RAID0 volume) |~| Linux Boot Drive: 500GB Kingston A2000 |~| Storage: 4TB WD Black HDD |~| Case: Cooler Master Silencio S600 |~| Display 1 (leftmost): Eizo (unknown model) 1920x1080 IPS @ 60Hz |~| Display 2 (center): BenQ ZOWIE XL2540 1920x1080 TN @ 240Hz |~| Display 3 (rightmost): Wacom Cintiq Pro 24 3840x2160 IPS @ 60Hz 10-bit |~| OS: Windows 10 Pro (games / art) + Linux (distro: NixOS; programming and daily driver)

5 minutes ago, Agall said:

A 7900 XTX isn't going to speed up how fast an older-generation CPU can issue draw calls for player assets in a game like WoW. Most people are better off getting a new CPU than a new GPU in 2023, if they've already got at least a GTX 1070 Ti's worth of performance, assuming they're the typical gamer who mostly plays poorly optimized decade-old games, like most PC gamers.

 

Steam Charts - Tracking What's Played

Of course, there are games where the CPU is a massive bottleneck in some scenarios, in which case it does not matter how powerful your card is (Cities: Skylines, for example), but in general most games will still run faster on an AMD card in mostly CPU-limited scenarios. That's why I suggest not upgrading the CPU yet: decide whether you need the upgrade first, as the CPU will mostly perform fine, so it may just be a waste of money for now.


10 minutes ago, WereCat said:

Of course, there are games where the CPU is a massive bottleneck in some scenarios, in which case it does not matter how powerful your card is (Cities: Skylines, for example), but in general most games will still run faster on an AMD card in mostly CPU-limited scenarios. That's why I suggest not upgrading the CPU yet: decide whether you need the upgrade first, as the CPU will mostly perform fine, so it may just be a waste of money for now.

Cities: Skylines was fun to test on my system. Regardless of resolution, full-speed camera movement at ground level was a solid 60 fps experience on a tuned 7950X3D and an OC'd 4090.

 

I generally suggest the opposite. Someone's more likely to get a better experience in most games people actually play from a CPU upgrade than from a GPU upgrade in 2023, especially with how competitive the CPU market is and how trash the GPU market (still) is.

 

Where my recommendation comes in: anyone using a GTX 1070 Ti or better GPU in 2023 at 1080p/1440p/1440p UW is more likely to benefit from a CPU upgrade in most of the games people actually play than from a GPU upgrade. If the experience is still not enough after that, then buy a new GPU.

 

Either way, if there's a CPU bottleneck, the CPU upgrade is not only generally less expensive and better value in 2023, but also more likely to deliver a framerate increase in most games PC gamers actually play. To me it's a no-brainer.

 

Take my brother's system as an example: he mostly plays WoW, Warzone, Planetside 2, etc. (mostly MMO/multiplayer games). He could've upgraded from an R5 3600 to an R7 5800X3D and/or from an RTX 3070 to an RX 6900 XT. In the game he actually plays the most, WoW, going from the R5 3600 to the 5800X3D was a 100% framerate upgrade in the most limiting scenarios: raids, cities, aka high-population areas. The same happened in Warzone; his framerate doubled from just the CPU upgrade. He's still rocking the RTX 3070, and I ended up selling the RX 6900 XT (it was a gucci AIO-cooled Asus one, so it wouldn't fit in the mITX system he needs to have). The limiting factor by far in most of the games he actually played was the CPU; the RTX 3070 was more than sufficient for 1440p in those games.

 

OP has threads asking about VR performance; in those scenarios specifically, a new CPU probably won't do much, but I won't speak too much on that since I don't use VR nor plan on testing those games. Generally VR should come close to standard RPG games at a high resolution, which don't care much for a better CPU since they're usually quite well optimized for multithreading.

Ryzen 7950x3D PBO +200MHz / -15mV curve CPPC in 'prefer cache'

RTX 4090 @133%/+230/+1000

Builder/Enthusiast/Overclocker since 2012  //  Professional since 2017


2 minutes ago, Agall said:

Cities: Skylines was fun to test on my system. Regardless of resolution, full-speed camera movement at ground level was a solid 60 fps experience on a tuned 7950X3D and an OC'd 4090.

 

I generally suggest the opposite. Someone's more likely to get a better experience in most games people actually play from a CPU upgrade than from a GPU upgrade in 2023, especially with how competitive the CPU market is and how trash the GPU market (still) is.

 

Where my recommendation comes in: anyone using a GTX 1070 Ti or better GPU in 2023 at 1080p/1440p/1440p UW is more likely to benefit from a CPU upgrade in most of the games people actually play than from a GPU upgrade. If the experience is still not enough after that, then buy a new GPU.

 

Either way, if there's a CPU bottleneck, the CPU upgrade is not only generally less expensive and better value in 2023, but also more likely to deliver a framerate increase in most games PC gamers actually play. To me it's a no-brainer.

 

Take my brother's system as an example: he mostly plays WoW, Warzone, Planetside 2, etc. (mostly MMO/multiplayer games). He could've upgraded from an R5 3600 to an R7 5800X3D and/or from an RTX 3070 to an RX 6900 XT. In the game he actually plays the most, WoW, going from the R5 3600 to the 5800X3D was a 100% framerate upgrade in the most limiting scenarios: raids, cities, aka high-population areas. The same happened in Warzone; his framerate doubled from just the CPU upgrade. He's still rocking the RTX 3070, and I ended up selling the RX 6900 XT (it was a gucci AIO-cooled Asus one, so it wouldn't fit in the mITX system he needs to have). The limiting factor by far in most of the games he actually played was the CPU; the RTX 3070 was more than sufficient for 1440p in those games.

Yes, all the games you mentioned benefit from the extra V-Cache on the 5800X3D, and it boosts performance quite a bit. For those same games I upgraded my own 3900X to a 5800X3D. But there are plenty of other games where the benefit is very small or nonexistent, so it really comes down to what your expectations of acceptable performance are.

 

At the end of the day, a bottleneck does not matter as long as you're getting performance adequate to your needs. If the bottleneck gets in the way of your experience, then it's usually time to upgrade. If you play a lot of battle royale, strategy, MMO, etc., then yes, the upgrade may be a huge benefit, but if you mostly play single-player games and are used to cranking up the graphics for full immersion, then it does not really matter that much.


1 hour ago, WereCat said:

Yes, all the games you mentioned benefit from the extra V-Cache on the 5800X3D, and it boosts performance quite a bit. For those same games I upgraded my own 3900X to a 5800X3D. But there are plenty of other games where the benefit is very small or nonexistent, so it really comes down to what your expectations of acceptable performance are.

 

At the end of the day, a bottleneck does not matter as long as you're getting performance adequate to your needs. If the bottleneck gets in the way of your experience, then it's usually time to upgrade. If you play a lot of battle royale, strategy, MMO, etc., then yes, the upgrade may be a huge benefit, but if you mostly play single-player games and are used to cranking up the graphics for full immersion, then it does not really matter that much.

That's where my general recommendation comes in, shaped by the impressive gap in CPU performance over the recent two generations and by the current GPU market. Since GPU technology really hasn't developed, and with ray tracing still being a luxury feature, cards that came out in 2015 are still competitive today. CPUs, on the other hand, have seen quite the jump in gaming performance, even though there have only been lithographic and architectural upgrades.

 

To me it's been clear that CPU bottlenecks are more common, based on the testing I've done across a 4790K, R5 3600, 3950X, 5800X3D, and 7950X3D in a lot of games that major benchmarking outlets don't (and, for repeatability reasons, arguably can't) benchmark. It's a scenario where I hope LTT or someone takes a look at this: take a GTX 1080 and an equivalent CPU, then test more CPU-limited games like MMOs/multiplayer games to see how bottlenecked the CPU has been in those titles. It would help a lot of gamers make a more value-oriented upgrade decision for the games they actually play, instead of FC6, Hitman 3, etc.

 

I didn't expect a lot of gain at 4K with the 7950X3D, but I've seen far better minimums than I expected, presumably at the moments in the games I play where the extra IPC/frequency/cache helps raise my 1% low framerates.

 

I've just come to fundamentally disagree with the nature of modern CPU benchmarking and its utility. It's nice to show gains in fixed environments with titles that can replicate the same gains on a new product, but that doesn't help the average gamer. The average gamer is playing League, or CS:GO (which they do at least test), or WoW, or other MMOs. Yes, testing those can be difficult, but we're not talking margin-of-error performance gains here in my experience. It should be easy to show a 100% gain in minimum framerates.
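
For anyone wanting to try this at home, here's a minimal sketch of how average FPS and 1% lows could be computed from a frametime capture. It assumes a PresentMon-style CSV with an MsBetweenPresents column; the file names are hypothetical placeholders:

```python
# Sketch: average FPS and "1% low" FPS from a frametime capture.
import csv
import statistics

def fps_stats(path: str) -> tuple[float, float]:
    with open(path, newline="") as f:
        frame_ms = [float(row["MsBetweenPresents"]) for row in csv.DictReader(f)]
    avg_fps = 1000.0 / statistics.fmean(frame_ms)
    # One common "1% low" convention: mean fps of the slowest 1% of frames.
    slowest = sorted(frame_ms, reverse=True)
    worst = slowest[: max(1, len(slowest) // 100)]
    return avg_fps, 1000.0 / statistics.fmean(worst)

avg_before, low_before = fps_stats("wow_oribos_3950x.csv")  # hypothetical
avg_after, low_after = fps_stats("wow_oribos_5800x3d.csv")  # hypothetical
print(f"avg: {avg_before:.0f} -> {avg_after:.0f} fps")
print(f"1% low gain: {low_after / low_before - 1:.0%}")
```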

Ryzen 7950x3D PBO +200MHz / -15mV curve CPPC in 'prefer cache'

RTX 4090 @133%/+230/+1000

Builder/Enthusiast/Overclocker since 2012  //  Professional since 2017


18 minutes ago, Agall said:

That's where my general recommendation comes in, shaped by the impressive gap in CPU performance over the recent two generations and by the current GPU market. Since GPU technology really hasn't developed, and with ray tracing still being a luxury feature, cards that came out in 2015 are still competitive today. CPUs, on the other hand, have seen quite the jump in gaming performance, even though there have only been lithographic and architectural upgrades.

 

To me it's been clear that CPU bottlenecks are more common, based on the testing I've done across a 4790K, R5 3600, 3950X, 5800X3D, and 7950X3D in a lot of games that major benchmarking outlets don't (and, for repeatability reasons, arguably can't) benchmark. It's a scenario where I hope LTT or someone takes a look at this: take a GTX 1080 and an equivalent CPU, then test more CPU-limited games like MMOs/multiplayer games to see how bottlenecked the CPU has been in those titles. It would help a lot of gamers make a more value-oriented upgrade decision for the games they actually play, instead of FC6, Hitman 3, etc.

 

I didn't expect a lot of gain at 4K with the 7950X3D, but I've seen far better minimums than I expected, presumably at the moments in the games I play where the extra IPC/frequency/cache helps raise my 1% low framerates.

 

I've just come to fundamentally disagree with the nature of modern CPU benchmarking and its utility. It's nice to show gains in fixed environments with titles that can replicate the same gains on a new product, but that doesn't help the average gamer. The average gamer is playing League, or CS:GO (which they do at least test), or WoW, or other MMOs. Yes, testing those can be difficult, but we're not talking margin-of-error performance gains here in my experience. It should be easy to show a 100% gain in minimum framerates.

The problem is they need something repeatable for benchmarking. It's impossible to benchmark an MMO: you can go to the same places and do the same things, but everything around you is outside your control. Same with BR games.


2 hours ago, WereCat said:

The problem is they need something repeatable for benchmarking. It's impossible to benchmark an MMO: you can go to the same places and do the same things, but everything around you is outside your control. Same with BR games.

"Yes testing those can be difficult, but we're not talking margin of error level performance gains here in my experience. It should be easy to show a 100% gain in minimum framerates."

 

That's precisely why they don't do it; it's difficult data to get. But the differences in some of these are, in my experience, so insane that even with a 20% margin of error, it's still obvious.

 

What's pretty obvious, at least to me: within one hour (the time it took me to swap CPUs), I logged into WoW in Oribos and my average framerate went up 50%.

Same with Warframe: sitting in an Orokin defense map with 3 other players, the framerate went from 160 fps to 240 fps on average. Pretty definitive that in some games, specifically those I tested right before and after the swap, there's a substantial difference, disproportionate to what you would otherwise expect going from a (tuned) 3950X to a 5800X3D.
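
To put a number on the margin-of-error point above, a small sketch using those Warframe figures and an assumed ±20% run-to-run spread:

```python
# Sketch of the "obvious even with 20% noise" claim, using the Warframe
# numbers above (160 -> 240 fps average) and an assumed ±20% spread.
import random

random.seed(0)

def noisy_runs(true_fps: float, n: int = 10, spread: float = 0.20) -> list[float]:
    """Simulate n benchmark runs with uniform ±spread measurement noise."""
    return [true_fps * random.uniform(1 - spread, 1 + spread) for _ in range(n)]

before = noisy_runs(160)  # 3950X: worst case 160 * 1.2 = 192 fps
after = noisy_runs(240)   # 5800X3D: best-case floor 240 * 0.8 = 192 fps
print(f"before: {min(before):.0f}-{max(before):.0f} fps")
print(f"after:  {min(after):.0f}-{max(after):.0f} fps")
# The two ranges can only ever touch at 192 fps, so a 50% uplift stays
# visible even through 20% measurement noise.
```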

 

On my brother's system we only tested WoW in the new city in DF; compared with what I saw right before, his framerates doubled. His was a CPU and memory upgrade though, from a lesser 3200MHz kit to a 3600MHz kit as well (I transplanted my 5800X3D and 32GB kit into his system when I got my 7950X3D).

Ryzen 7950x3D PBO +200MHz / -15mV curve CPPC in 'prefer cache'

RTX 4090 @133%/+230/+1000

Builder/Enthusiast/Overclocker since 2012  //  Professional since 2017


  • 2 weeks later...

I hope no one feels like I am totally hijacking this thread. However, I would love some advice/opinions.

 

I just got an RX 7900 XT and slotted it into my old i7-6700K system (I previously upgraded to 3200MT/s CL16 RAM), and I'm wondering if I should also upgrade to an i5-13600K. That would cost about $500 including a decent motherboard.

 

The real issue is that my plan is for the 7900 XT to hold me over (plus it's an opportunity to try an AMD card) for 2-3 years before I do a full high-end build from scratch, targeting Intel 20A/18A and an RTX 5090/6090 at that time. So... do I stick it out on my 6700K (it has been an awesome CPU) or do the $500 upgrade to a 13600K for a 2-3 year usage period?

 

I'm guessing the uplift from the i5-13600K (EDIT: this originally said 12600K by mistake) will not be huge at 4K (5-15%) in average FPS, but perhaps I would get much better 1% and 0.1% lows? Just trying to decide whether I should pitch in the $500 and give my 6700K an early retirement, or stick with the 6700K all the way for the next few years knowing that I am bottlenecking the 7900 XT by a "decent amount" even at 4K. And I will downshift to 1440p to make sure I get a stable 60+ fps if a sufficiently demanding game presents itself. Of course, I am using a 144Hz monitor; 60 fps is the lowest "playable" framerate for me.

 

Appreciate any advice!


4 hours ago, solarus8 said:

I hope no one feels like I am totally hijacking this thread. However, I would love some advice/opinions.

 

I just got an RX 7900 XT and slotted it into my old i7-6700K system (I previously upgraded to 3200MT/s CL16 RAM), and I'm wondering if I should also upgrade to an i5-13600K. That would cost about $500 including a decent motherboard.

 

The real issue is that my plan is for the 7900 XT to hold me over (plus it's an opportunity to try an AMD card) for 2-3 years before I do a full high-end build from scratch, targeting Intel 20A/18A and an RTX 5090/6090 at that time. So... do I stick it out on my 6700K (it has been an awesome CPU) or do the $500 upgrade to a 13600K for a 2-3 year usage period?

 

I'm guessing the uplift from the i5-13600K will not be huge at 4K (5-15%) in average FPS, but perhaps I would get much better 1% and 0.1% lows? Just trying to decide whether I should pitch in the $500 and give my 6700K an early retirement, or stick with the 6700K all the way for the next few years knowing that I am bottlenecking the 7900 XT by a "decent amount" even at 4K. And I will downshift to 1440p to make sure I get a stable 60+ fps if a sufficiently demanding game presents itself. Of course, I am using a 144Hz monitor; 60 fps is the lowest "playable" framerate for me.

 

Appreciate any advice!

It really depends on the games you're actually playing, in my opinion. If you're mostly playing single-player RPGs and generally well-optimized games, you won't see $500 worth of framerate increase. If you're playing Hogwarts and/or multiplayer games, especially older ones, then a 6700K is pretty limiting, even up to 50%, especially in MMOs.

 

Usually that's independent of resolution, and a 7900 XT should be able to reach 4K 60 fps without issue at decent settings, so falling short of that is quite indicative of a CPU bottleneck. It's really something that needs to be covered more in 2023, since the GPU market is so terrible and the CPU market is as good as it's ever been. I would expect a company like LTT to be on the "don't upgrade your GPU when your CPU is really the limiter" message, which in my opinion applies to anything at GTX 1070 Ti performance and above.

Ryzen 7950x3D PBO +200MHz / -15mV curve CPPC in 'prefer cache'

RTX 4090 @133%/+230/+1000

Builder/Enthusiast/Overclocker since 2012  //  Professional since 2017


3 hours ago, Agall said:

It really depends on the games you're actually playing, in my opinion. If you're mostly playing single-player RPGs and generally well-optimized games, you won't see $500 worth of framerate increase. If you're playing Hogwarts and/or multiplayer games, especially older ones, then a 6700K is pretty limiting, even up to 50%, especially in MMOs.

 

Usually that's independent of resolution, and a 7900 XT should be able to reach 4K 60 fps without issue at decent settings, so falling short of that is quite indicative of a CPU bottleneck. It's really something that needs to be covered more in 2023, since the GPU market is so terrible and the CPU market is as good as it's ever been. I would expect a company like LTT to be on the "don't upgrade your GPU when your CPU is really the limiter" message, which in my opinion applies to anything at GTX 1070 Ti performance and above.

Thanks for the perspective. For some reason this is such a hard choice for me. I play a variety of games and do not have any recurring "productivity workloads". Afterburner/RivaTuner does show that my CPU is maxing out when I turn on Ultra FSR 4K in Horizon Zero Dawn. Most of the other games aren't showing 80-100% CPU utilization. However, I am betting that a good deal of performance can be gained even in the games that aren't screaming for more CPU in the overlay.

 

For right now I am leaning toward sticking it out with the 6700K. I agree that CPUs right now are pretty good; however, I am really looking forward to Intel 20A and all the actual device-level advances we will see. And of course, I'll also keep an eye on AMD CPUs and see where things land.
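
One caveat on reading those overlay numbers: a game can be fully CPU-bound while total utilization stays low, because a single pegged thread on an 8-thread 6700K shows up as only ~12% overall. A minimal sketch of the per-core view (psutil is a third-party package, and the thresholds are assumptions):

```python
# Per-core CPU sampling: a single maxed-out core with modest total usage
# usually means a single-thread bottleneck. Requires `pip install psutil`.
import psutil

per_core = psutil.cpu_percent(interval=1.0, percpu=True)  # one % per logical core
total = sum(per_core) / len(per_core)

print(f"total: {total:.0f}%, busiest core: {max(per_core):.0f}%")
if max(per_core) > 90 and total < 60:  # assumed thresholds, not gospel
    print("Likely single-thread (CPU) bottleneck despite low overall usage.")
```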


List of decent CPUs:

• Ryzen™ 9 7900X

• Ryzen™ 7 5800X3D

• (Used) Ryzen™ 9 7950X

• Core™ i7-13700KF

etc.

With these CPUs you're fine using the Radeon™ RX 7900 XT.

 


10 minutes ago, MootEndymion752 said:

List of decent CPUs:

• Ryzen™ 9 7900X

• Ryzen™ 7 5800X3D

• (Used) Ryzen™ 9 7950X

• Core™ i7-13700KF

etc.

With these CPUs you're fine using the Radeon™ RX 7900 XT.

 

Also 7800X3D

I edit my posts more often than not


9 hours ago, solarus8 said:

Thanks for the perspective. For some reason this is such a hard choice for me. I play a variety of games and do not have any recurring "productivity workloads". Afterburner/RivaTuner does show that my CPU is maxing out when I turn on Ultra FSR 4K in Horizon Zero Dawn. Most of the other games aren't showing 80-100% CPU utilization. However, I am betting that a good deal of performance can be gained even in the games that aren't screaming for more CPU in the overlay.

 

For right now I am leaning toward sticking it out with the 6700K. I agree that CPUs right now are pretty good; however, I am really looking forward to Intel 20A and all the actual device-level advances we will see. And of course, I'll also keep an eye on AMD CPUs and see where things land.

The recent jump in CPU gaming performance, especially in poorly optimized titles, is in part due to the competition AMD has finally put up. Between the 6700K and the 12700K there really weren't any major advancements from Intel. Even between 10th gen (11th doesn't count), 12th gen, and 13th gen, Intel increased the amount of L2+L3 cache by roughly 50% each generation. There was definitely a limitation associated with L2+L3 cache that was made obvious by the 5800X3D, so I'd expect Intel to keep going in that direction of more cache.

Ryzen 7950x3D PBO +200MHz / -15mV curve CPPC in 'prefer cache'

RTX 4090 @133%/+230/+1000

Builder/Enthusiast/Overclocker since 2012  //  Professional since 2017

