RTX 2080 Ti Needs a CPU for Gaming

It's always at 50-80% usage in games. I have an Asus Z390-E Gaming motherboard, but I am willing to upgrade it.

I was thinking of getting an i9 9900K.

RTX 2080 Ti

ROG Strix Z390-E Gaming motherboard

i7 8700K @ 5GHz - delidded

Corsair Vengeance 8GB x2 RAM

Corsair HX 750W PSU

 


Well then buy it. Is there something you need help with?

| If someone's post is helpful or solves your problem, please mark it as a solution 🙂 |

I am a human that makes mistakes! If I'm wrong please correct me and tell me where I made the mistake. I try my best to be helpful.

System Specs

<Ryzen 5 3600 3.5-4.2Ghz> <Noctua NH-U12S chromax.Black> <ZOTAC RTX 2070 SUPER 8GB> <16gb 3200Mhz Crucial CL16> <DarkFlash DLM21 Mesh> <650w Corsair RMx 2018 80+ Gold> <Samsung 970 EVO 500gb NVMe> <WD blue 500gb SSD> <MSI MAG b550m Mortar> <5 Noctua P12 case fans>

Peripherals

<Lepow Portable Monitor + AOC 144hz 1080p monitor> 

<Keymove Snowfox 61m>

<Razer Mini>


11 minutes ago, gtr2244 said:

It's always 50-80% usage for gaming.

Unless you're significantly off the FPS you should be getting, a component being at 80% load is not an issue. That's what it's made for.

Crystal: CPU: i7 7700K | Motherboard: Asus ROG Strix Z270F | RAM: GSkill 16 GB@3200MHz | GPU: Nvidia GTX 1080 Ti FE | Case: Corsair Crystal 570X (black) | PSU: EVGA Supernova G2 1000W | Monitor: Asus VG248QE 24"

Laptop: Dell XPS 13 9370 | CPU: i5 10510U | RAM: 16 GB

Server: CPU: i5 4690k | RAM: 16 GB | Case: Corsair Graphite 760T White | Storage: 19 TB


16 minutes ago, gtr2244 said:

i7 8700k @5ghz - delided

That's still a pretty decent chip for gaming and your 2080 Ti. But hey, who am I to say no to an upgrade 😛. The 9900K would also be a nice chip; if you have the disposable income kicking around, may as well, eh? You only YOLO once, right?


I went to a 3080 on my 5GHz 8700K and in most games there was not much, if any, bottleneck at 1440p. Most games were 95%+ GPU utilization.

 

I think in some games GPU utilization is going to be shit regardless of CPU anyway, like COD or other online battle royale games. So if those are the games you're referring to...a CPU upgrade won't help THAT much.

Before you reply to my post, REFRESH. 99.99% chance I edited my post. 

 

My System: i7-13700KF // Corsair iCUE H150i Elite Capellix // MSI MPG Z690 Edge Wifi // 32GB DDR5 G. SKILL RIPJAWS S5 6000 CL32 // Nvidia RTX 4070 Super FE // Corsair 5000D Airflow // Corsair SP120 RGB Pro x7 // Seasonic Focus Plus Gold 850w //1TB ADATA XPG SX8200 Pro/1TB Teamgroup MP33/2TB Seagate 7200RPM Hard Drive // Displays: LG Ultragear 32GP83B x2 // Royal Kludge RK100 // Logitech G Pro X Superlight // Sennheiser DROP PC38x


An i9 9900K does fine with a 2080 Ti, but in a lot of games it performs the same as a 5GHz Intel six-core.

 

Here is one game with the i9 9900K using MCE and an i7 8086K with a 5GHz all-core overclock; I have also added a stock 5800X.

Shadow of the Tomb Raider on Ultra, with an FTW3 Ultra 2080 Ti.

                 1080p      1440p      4K

i9 9900K         160fps     133fps     75fps

i7 8086K         153fps     130fps     74fps

5800X            160fps     129fps     73fps

Not even the 5800X is an upgrade with a 2080 Ti.

 

I put this in to give you an idea of when to upgrade. Same game with a 3080 Ti.

i9 9900K         169fps     154fps     96fps

i7 8086K         156fps     147fps     95fps

5800X            198fps     163fps     96fps
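A quick way to read the two tables is to compute each CPU's 1080p gain from the GPU swap; the chip that barely gains is the one holding the faster card back. A small sketch (fps values copied from the tables above):

```python
# 1080p averages from the two tables above: (with 2080 Ti, with 3080 Ti)
results = {
    "i9 9900K": (160, 169),
    "i7 8086K": (153, 156),
    "5800X":    (160, 198),
}

for cpu, (old, new) in results.items():
    gain = new / old - 1  # relative fps gain from the GPU upgrade
    print(f"{cpu}: +{gain:.0%}")  # 9900K: +6%, 8086K: +2%, 5800X: +24%
```

This suggests the 5800X was still GPU-bound with the 2080 Ti, while the Intel chips were already near their CPU limit.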

 

 

 

 

RIG#1 CPU: AMD, R 7 5800x3D| Motherboard: X570 AORUS Master | RAM: Corsair Vengeance RGB Pro 32GB DDR4 3200 | GPU: EVGA FTW3 ULTRA  RTX 3090 ti | PSU: EVGA 1000 G+ | Case: Lian Li O11 Dynamic | Cooler: EK 360mm AIO | SSD#1: Corsair MP600 1TB | SSD#2: Crucial MX500 2.5" 2TB | Monitor: ASUS ROG Swift PG42UQ

 

RIG#2 CPU: Intel i9 11900k | Motherboard: Z590 AORUS Master | RAM: Corsair Vengeance RGB Pro 32GB DDR4 3600 | GPU: EVGA FTW3 ULTRA  RTX 3090 ti | PSU: EVGA 1300 G+ | Case: Lian Li O11 Dynamic EVO | Cooler: Noctua NH-D15 | SSD#1: Corsair MP600 1TB | SSD#2: Crucial MX300 2.5" 1TB | Monitor: LG 55" 4k C1 OLED TV

 

RIG#3 CPU: Intel i9 10900kf | Motherboard: Z490 AORUS Master | RAM: Corsair Vengeance RGB Pro 32GB DDR4 4000 | GPU: MSI Gaming X Trio 3090 | PSU: EVGA 1000 G+ | Case: Lian Li O11 Dynamic | Cooler: EK 360mm AIO | SSD#1: Crucial P1 1TB | SSD#2: Crucial MX500 2.5" 1TB | Monitor: LG 55" 4k B9 OLED TV

 

RIG#4 CPU: Intel i9 13900k | Motherboard: AORUS Z790 Master | RAM: Corsair Dominator RGB 32GB DDR5 6200 | GPU: Zotac Amp Extreme 4090  | PSU: EVGA 1000 G+ | Case: Streacom BC1.1S | Cooler: EK 360mm AIO | SSD: Corsair MP600 1TB  | SSD#2: Crucial MX500 2.5" 1TB | Monitor: LG 55" 4k B9 OLED TV


Not worth the upgrade - you're better off going with a 12600K or 12700K, or just sticking with what you have. In my experience the 9900K runs hot as hell even at stock. My 12600K is a good 15 degrees cooler in the same setup.

CPU i7 14700K | CPU Cooler Noctua NH-U12A | Motherboard MSI Pro Z690-A | GPU Zotac Airo RTX 4080 | RAM 32 GB GSkill Ripjaws V 4400MHz | Monitor Alienware AW2721D / Gigabyte M28U | PSU ASUS ROG Strix 850G


If you are not on AM4, consider every "upgrade path" dead. Sell your old stuff, keep the RAM, buy Alder Lake.

CPU: Ryzen 7 5800x3D | MoBo: MSI MAG B550 Tomahawk | RAM: G.Skill F4-3600C15D-16GTZ @3800CL16 | GPU: RTX 2080Ti | PSU: Corsair HX1200 | 

Case: Lian Li 011D XL | Storage: Samsung 970 EVO M.2 NVMe 500GB, Crucial MX500 500GB | Soundcard: Soundblaster ZXR | Mouse: Razer Viper Mini | Keyboard: Razer Huntsman TE Monitor: DELL AW2521H @360Hz |

 


5 hours ago, jones177 said:

Shadow of the Tomb Raider on Ultra

 

 


 


I play Battlefield games (BF4, BF1, BF2042), which run with the GPU and CPU around 60% usage, unless I turn the resolution up, which makes the image too distorted for me (144Hz). I would like the GPU to be utilized fully, and thought maybe the i7 8700K was holding it back somehow. Guess I'll stick with this for now.


52 minutes ago, gtr2244 said:

I play Battlefield games, BF 4, BF 1, BF 2042, which run with the gpu and cpu around 60% usage, unless I change the resolution up which makes the image too distorted for me (144hz). I would like the gpu to be utilized fully, and thought maybe the i7 8700k was holding it back somehow. Guess I'll stick with this for now.

Those games benefit greatly from faster RAM. Maybe your problem is there.

Also, don't look at total CPU % usage in Task Manager; single-core utilisation is what counts.
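To illustrate the point about Task Manager: the headline CPU% is an average across all cores, so one pegged core (which is usually what limits a game) gets hidden. A minimal Python sketch with made-up per-core numbers:

```python
# Illustrative per-core utilization sampled during a game (values are made up):
# one core runs the game's main/render thread flat out, the rest are mostly idle.
per_core = [98, 35, 30, 22, 18, 15, 12, 10]

aggregate = sum(per_core) / len(per_core)  # the headline number Task Manager shows
hottest = max(per_core)                    # the core that actually limits frame rate

print(f"aggregate CPU usage: {aggregate:.0f}%")  # 30% - looks like plenty of headroom
print(f"hottest core:        {hottest}%")        # 98% - actually CPU-bound
```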


 


3 hours ago, DarkSmith2 said:

Snip.

 

Testing at ultra makes sense. Playing at ultra doesn't.

When I used my own presets it was a mini nightmare repeating them months later. Ultra is just easy to remember.



21 minutes ago, jones177 said:

Testing at ultra makes sense. Playing at ultra doesn't.

When I used my own presets it was a mini nightmare repeating them months later. Ultra is just easy to remember.

You shouldn't compare results to results from months prior either way, because of updates and Windows corruption.

Also, I'm sure you could expand testing to both Ultra and Low settings without much hassle.

 

Using Ultra settings to determine CPU performance isn't accurate; even at 720p with Ultra settings you get wrong impressions or "bottlenecked/filtered" results.


 


7 minutes ago, DarkSmith2 said:

you shouldnt compare results to results from months prior either way. because of updates and windows corruption.

Also im sure, you could just expand testing to Ultra and Low settings without much of a hassle.

 

Ultra settings to determine CPU performance isnt accurate, even at 720p with Ultra settings you get wrong impressions or "bottlenecked/filtered" results.

The tests are for me to see what gain I get in upgrades. That is why they are months and years apart. 

 

My upgrades are not to improve the frames on the vanilla games I play. They are mainly for my building games since at some point I build past what my hardware can support.

With them there is no way to tell if a component is an upgrade or not unless I buy and build.

 

 

RIG#1 CPU: AMD, R 7 5800x3D| Motherboard: X570 AORUS Master | RAM: Corsair Vengeance RGB Pro 32GB DDR4 3200 | GPU: EVGA FTW3 ULTRA  RTX 3090 ti | PSU: EVGA 1000 G+ | Case: Lian Li O11 Dynamic | Cooler: EK 360mm AIO | SSD#1: Corsair MP600 1TB | SSD#2: Crucial MX500 2.5" 2TB | Monitor: ASUS ROG Swift PG42UQ

 

RIG#2 CPU: Intel i9 11900k | Motherboard: Z590 AORUS Master | RAM: Corsair Vengeance RGB Pro 32GB DDR4 3600 | GPU: EVGA FTW3 ULTRA  RTX 3090 ti | PSU: EVGA 1300 G+ | Case: Lian Li O11 Dynamic EVO | Cooler: Noctua NH-D15 | SSD#1: SSD#1: Corsair MP600 1TB | SSD#2: Crucial MX300 2.5" 1TB | Monitor: LG 55" 4k C1 OLED TV

 

RIG#3 CPU: Intel i9 10900kf | Motherboard: Z490 AORUS Master | RAM: Corsair Vengeance RGB Pro 32GB DDR4 4000 | GPU: MSI Gaming X Trio 3090 | PSU: EVGA 1000 G+ | Case: Lian Li O11 Dynamic | Cooler: EK 360mm AIO | SSD#1: Crucial P1 1TB | SSD#2: Crucial MX500 2.5" 1TB | Monitor: LG 55" 4k B9 OLED TV

 

RIG#4 CPU: Intel i9 13900k | Motherboard: AORUS Z790 Master | RAM: Corsair Dominator RGB 32GB DDR5 6200 | GPU: Zotac Amp Extreme 4090  | PSU: EVGA 1000 G+ | Case: Streacom BC1.1S | Cooler: EK 360mm AIO | SSD: Corsair MP600 1TB  | SSD#2: Crucial MX500 2.5" 1TB | Monitor: LG 55" 4k B9 OLED TV


1 hour ago, jones177 said:

The tests are for me to see what gain I get in upgrades. That is why they are months and years apart. 

 

My upgrades are not to improve the frames on the vanilla games I play. They are mainly for my building games since at some point I build past what my hardware can support.

With them there is no way to tell if a component is an upgrade or not unless I buy and build.

 

 

OK, I understand, but there is still an aspect I don't get: you only test for the GPU. How would you tell that a CPU is an upgrade? I mean, your 5800X is obviously bottlenecked by a 3080 Ti at 1080p Ultra too.

[Screenshot: SotTR benchmark at maxed settings]

 

Even just turning TSAA on/off makes a 6% performance difference in this benchmark, while still "0% GPU Bound" (with settings set to the lowest preset in both runs).
[Screenshot: SotTR benchmark, 5GHz, 1080p lowest settings]

13 hours ago, jones177 said:

I put this in to give you an idea when to upgrade. Same game with a 3080 ti.

i9 9900k         169fps     154fps     96fps

i7 8086k         156fps     147fps     95fps

5800x             198fps     163fps     96fps

See, if I analyze this,

I can potentially tell where the performance differences come from. Fun fact: 8th and 9th gen were exactly the same gaming-performance-wise, despite 9th gen having more L3 cache from the extra cores. Further, the 5800X has even more L3 cache. So comparing these, I would assume you ran into a bandwidth bottleneck, which doesn't necessarily tell you whether your CPU is faster or slower, but more likely tells you that you aren't feeding it enough bandwidth to reach its maximum performance per clock. The higher your core clock, the more bandwidth you need, especially with small amounts of L3 cache.

Under "normal" (non-GPU-bound) circumstances a 5800X should only be around 10-15% faster than a well-tuned 8700K in gaming. In your case it was 27% faster, which is quite unrealistic.
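For reference, the 27% figure follows from the 1080p column of the 3080 Ti table quoted above:

```python
# 1080p averages with the 3080 Ti, from the table quoted above
fps_8086k = 156  # i7 8086K @ 5GHz all-core
fps_5800x = 198  # stock 5800X

speedup = fps_5800x / fps_8086k - 1
print(f"5800X over 8086K at 1080p: +{speedup:.0%}")  # +27%
```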
 

Edit: @Mister Woof

And for those who wonder how much faster Alderlake can already get:
[Screenshot: Alder Lake benchmark result]


 


5 hours ago, DarkSmith2 said:

ok i understand it, but you know, even tho, there is an aspect i dont get, you only test for GPU. how would ya tell that a CPU is an upgrade? I mean, your 5800x is obviously bottlenecked by a 3080ti in 1080p Ultra too.

[Screenshot: SotTR benchmark at maxed settings]

 

even just turning TSAA on/off makes a 6% performance difference in this benchmark, while still "0% GPU Bound". (with settings set to lowest preset on both)
[Screenshot: SotTR benchmark, 5GHz, 1080p lowest settings]

see if i analyze this, 

I potentially know where the performance differences comes from. Fun fact, 8th and 9th gen where both exactly same gaming performance wise despite 9th gen having more L3 cache because of more cores. Further, 5800x has even more L3 cache. So when comparing this i would assume you ran into a Bandwidth bottleneck, which doesnt tell you necessarily if your CPU is faster or slower but more likely tells you that you dont feed it enough Bandwidth to reach its max. performance per clock. The higher your coreclock is the more bandwidth you need especially when using something with low amounts of L3-Cache.

Under "normal" (non-gpubound) circumstances a 5800x should only be around 10-15% faster than a "well tuned" 8700k in gaming. in your case it was 27% faster, thats quite unrealistic. 
 

Edit: @Mister Woof

And for those who wonder how much faster Alderlake can already get:
[Screenshot: Alder Lake benchmark result]

The problem with testing a game like SotTR at 1080p lowest settings is that you've gone completely outside the bounds of reality. No one is going to play at 1080p lowest in that type of game unless they are struggling to stay over 30fps.

 

You can use super low settings to highlight CPU differences, sure, but at some point it stops mattering. When I upgraded from the 9600K to a 5900X, I benchmarked several games with a 2060 Super. For the settings I actually use, there was no difference.

 

But I tested other settings, too. One of them was Tomb Raider 2013. I tested it both at 1440p High (the settings I actually used to play the game) and 720p lowest. With the latter, the 9600K averaged about 450fps and peaked at over 600. But the 5900X? It averaged nearly 700fps and peaked at almost 1100.
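Working out the ratios from those (approximate) numbers:

```python
# Tomb Raider 2013 at 720p lowest; approximate fps quoted above
avg_9600k, peak_9600k = 450, 600    # i5 9600K average / peak
avg_5900x, peak_5900x = 700, 1100   # Ryzen 5900X average / peak

print(f"average: {avg_5900x / avg_9600k:.2f}x")   # 1.56x
print(f"peak:    {peak_5900x / peak_9600k:.2f}x")  # 1.83x
```

So even the "almost twice as fast" reading only holds for the peaks, and only at settings nobody actually plays at.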

 

Does that mean that the 5900X is almost twice as fast as the 9600K for gaming?

 

No.

 

If we're talking about this in 15 years, and someone on the forums is on a budget and is looking at a used 9600K system or a used 5900X system, and is going to pair it with an RTX 9050 Ti for a budget rig, then sure, the 5900X is probably going to be twice as fast or more, but this isn't the year 2036.

 

And, for most games, the "normal" way to play is GPU bound. The exception, not the rule, is eSports games.


5 hours ago, DarkSmith2 said:

ok i understand it, but you know, even tho, there is an aspect i dont get, you only test for GPU. how would ya tell that a CPU is an upgrade? I mean, your 5800x is obviously bottlenecked by a 3080ti in 1080p Ultra too.

I am a 4K gamer, and at that resolution none of the CPUs are an upgrade over the 2018-vintage i7 8086K. I can even add my i9 10900KF to that list.

The higher core count CPUs are noticeably smoother, but that may be due to the extra cache.

 

For CPUs, I use old saves in modded games/sims that have become unplayable.

I started doing this in 2000 with MS Flight Simulator, so it has been a long time. I add content like upgraded airports, weather and aircraft until the game becomes unplayable. Then I shelve it and test the save when I upgrade hardware.

Now I use a Space Engineers save and my modded Fallout 4.

This scene is unplayable, with stutter and pauses, on the i7 8086K and i9 9900K without a 5GHz overclock. With the overclock it is playable but not smooth. It is totally smooth with a stock 5800X/5900X.

[Screenshot: Space Engineers flyby scene]

5 hours ago, DarkSmith2 said:

even just turning TSAA on/off makes a 6% performance difference in this benchmark, while still "0% GPU Bound". (with settings set to lowest preset on both)

It is a good way to visualize the gap between CPUs, but it has little value to a 4K gamer. It is a bit shocking how good these settings look on an OLED, even at 1080p. At 4K it just looks like a slightly older game. I did get a 207fps average at 4K, but not 0% GPU bound.

[Screenshot: SotTR 4K lowest settings, 5900X + FTW3]

5 hours ago, DarkSmith2 said:

see if i analyze this, 

I potentially know where the performance differences comes from. Fun fact, 8th and 9th gen where both exactly same gaming performance wise despite 9th gen having more L3 cache because of more cores. Further, 5800x has even more L3 cache. So when comparing this i would assume you ran into a Bandwidth bottleneck, which doesnt tell you necessarily if your CPU is faster or slower but more likely tells you that you dont feed it enough Bandwidth to reach its max. performance per clock. The higher your coreclock is the more bandwidth you need especially when using something with low amounts of L3-Cache.

I think it is the L3 cache, since even in games that use one core the 8-core CPUs still feel smoother.

5 hours ago, DarkSmith2 said:

Under "normal" (non-gpubound) circumstances a 5800x should only be around 10-15% faster than a "well tuned" 8700k in gaming. in your case it was 27% faster, thats quite unrealistic. 

Unrealistic or not, the scores are taken from screens like yours, but with settings closer to what I actually use to play.

 

5 hours ago, DarkSmith2 said:

Edit: @Mister Woof

And for those who wonder how much faster Alderlake can already get:

The 5900X at stock is about 120 frames down, so very impressive. I will be doing a build with an i9 12900K in the new year, unless AMD changes my plans.



3 hours ago, YoungBlade said:

The problem with testing a game like SotTR at 1080p lowest settings is that you've gone completely outside the bounds of reality. No one is going to play at 1080p lowest in that type of game unless they are struggling to stay over 30fps.

This doesn't matter, because it reflects how the CPUs can/will perform with future GPUs. The CPU is the hard limit on your PC's performance with future GPU upgrades. We are just simulating a GPU upgrade by lowering settings; in other words, measuring "latent performance".
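The "latent performance" idea can be put as a toy model: delivered frame rate is capped by whichever of the CPU and GPU is slower, so a low-settings CPU test measures the ceiling a future GPU will run into. A sketch with hypothetical numbers (a 160fps CPU ceiling and a 100fps GPU-bound game):

```python
# Toy model: delivered fps is limited by the slower of the two components.
def projected_fps(cpu_ceiling, gpu_fps, gpu_speedup):
    """cpu_ceiling: fps the CPU can feed (measured at low settings);
    gpu_fps: current GPU-bound fps; gpu_speedup: factor for a future GPU."""
    return min(cpu_ceiling, gpu_fps * gpu_speedup)

print(projected_fps(160, 100, 1.0))  # GPU-bound today: 100.0
print(projected_fps(160, 100, 2.0))  # a 2x faster GPU hits the CPU ceiling: 160
```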

 

3 hours ago, YoungBlade said:

And, for most games, the "normal" way to play is GPU bound. The exception, not the rule, is eSports games.

Yeah, and this makes CPU performance discussions irrelevant unless you experience stutter / core limits / cache limits.

 

2 hours ago, jones177 said:

This scene is unplayable with stutter and pauses with the i7 8086k and i9 9900k without a 5ghz overclock. With the overclock it is playable but not smooth. It is totally smooth with a stock 5800x/5900x.  

Yeah, more cores and L3 cache help there, I know, but a lot can be done with fine tuning too.

 

3 hours ago, YoungBlade said:

If we're talking about this in 15 years, and someone on the forums is on a budget and is looking at a used 9600K system or a used 5900X system, and is going to pair it with an RTX 9050 Ti for a budget rig, then sure, the 5900X is probably going to be twice as fast or more, but this isn't the year 2036.

Well, with current rumors of next-gen GPUs being about twice as fast as a 3090, I don't think it will take that much time. Take Ryzen 3000, for example: people still don't get that it's about 20% slower than 8th gen Intel. It took only one generation of GPUs to show everyone what people who tested at low settings or 720p already knew years ahead.


 


13 hours ago, DarkSmith2 said:

Those games benefit greatly from faster ram. Maybe your problem is here.

Also dont lookup CPU% usage in task manager, single core utilisation counts. 

I watched Linus's video on whether RAM matters for gaming. The result was "not really", so I never bothered. If you think faster RAM would really help, I'll look into it.


16 minutes ago, gtr2244 said:

I watched Linus' video on does ram matter for gaming. The result was not really, so I never bothered. You think faster ram would really help I'll look into it. 

It's the way techtubers test this: in a GPU bottleneck, RAM almost doesn't matter, and your CPU performance almost doesn't matter either. You are limited by your GPU.

Besides, they mostly only compare XMP profiles against XMP profiles, which often aren't actually faster just because there is a higher number on the kit.

 

If you benefit from a faster CPU, you will benefit from actually faster RAM, because that's what faster RAM does: it makes your CPU faster. How much faster depends on how well the application can utilize cache.

 

Battlefield titles, PUBG, Warzone - all those BR games benefit greatly from faster RAM, especially if your GPU is no slouch and you're not playing at max graphics settings or very high resolutions.

 

Anyway, I can only recommend getting into RAM OC. It's a bit hard at first, but it's very worthwhile if you learn how to do it, test around with it, and see the differences for yourself. XMP profiles aren't really fast; they are tuned for maximum stability across as many systems as possible. So of course there is a significant performance difference compared to manually tuned RAM, because of that overhead.


 


11 minutes ago, DarkSmith2 said:

This doesnt matter, because it reflects how the CPUs can/will perform with future GPUs. CPUs are the hardlimit of your PCs performance with future GPU upgrades. We are just simulating a GPU upgrade by lowering settings. Or in other words "latent performance". 

 

yea and this makes CPU performance discussions irrelevant unless you experience stutter / core limits / cache limits.

 

yea more core's and L3 cache help there, i know but with fine tuning alot can be done too.

 

well with current rumors of next gen GPUs being about twice as fast as a 3090 already i dont think it will take this much time. Take Ryzen3000 f.e. people still dont get that its just 20% slower than 8th gen Intel, it took only 1GEN of GPUs to show everyone what people that tested on low settings or 720p already knew years ahead.

Testing at low settings for CPU comparison is fine, especially if you're looking at forward upgradability, as it can help predict how CPUs will perform with better GPUs, as you've said.

 

However, all that said, I still feel it's largely academic and not worth losing much sleep over; and generally speaking, people who are considering future GPUs strong enough to matter won't be too upset, and are even more likely to want to upgrade their CPU anyway.



1 minute ago, Mister Woof said:

However all that said I still feel its largely academic and not worth losing that much sleep over; and generally speaking people who are consider future GPUs that are strong enough to matter won't be too upset or are even more likely to want to upgrade a CPU anyway

Well, times have changed; people blow their whole budget on GPUs. It's normal nowadays to not have anything left to upgrade the whole PC at once. So it might be interesting for "step by step" upgrades.


 


3 minutes ago, DarkSmith2 said:

well with current rumors of next gen GPUs being about twice as fast as a 3090 already i dont think it will take this much time.

It is going to be interesting.

My main goal with the 30 series was HDMI 2.1 for 4K 120Hz on the OLEDs. The other goal was having enough VRAM for 8K video editing, and I bought a 3090 for that.

 

I am expecting it to take almost a year for the cards to reach MSRP, like it did with the 30 series, so there's plenty of time to upgrade CPUs.

 



1 minute ago, jones177 said:

It is going to be interesting.

My main goal with the 30 series was HDMI 2.1 for 4k 120hz on the OLEDs. The other goal was having enough vram for 8k video editing and I bought a 3090 for that.

 

I am expecting it to take almost a year to get the cards at MSRP like it did with the 30 series so plenty of time to upgrade CPUs.

 

I'm also really tempted by Alder Lake because, let's be real, my 8700K is maxed out with a 2080 Ti. So next-gen GPUs won't do much for me unless I up my resolution.
4K 240Hz would be a thing, though, even for esports. There is so much visual clutter in shooter games nowadays that upping the pixel count can be quite beneficial.
So I'm pretty sure 1080p gaming is dying for everything that's not CS:GO or Valorant.


 


24 minutes ago, DarkSmith2 said:

well times have changed, poeple blow all their budget for GPUs. Its normal nowdays to not have anything left to upgrade the whole PC at once. So it might be interessting for "step by step upgrades". 

In a way yeah, but I think the more accurate reality is they just aren't buying GPUs.


