Ryzen upgrade for 1440P Ultrawide

I run i5 4670K @ 4.5GHz / Vega 64 Nitro+

 

I game 3440x1440 UW.

 

Will upgrading give me a massive boost in FPS? It's a £300+ upgrade to switch platform. Finding a buyer for my i5 system is gonna be hard.

 

I googled around and it seems the 2600 isn't gonna be worthwhile at £100. I can wait for the 3600 to drop.

 

 

 

At my resolution, I know I am GPU bound, but will a better CPU get me a 20% boost?


27 minutes ago, trojanhorse said:

?

You can figure that out on your own: use something like the RivaTuner overlay to see your CPU and GPU usage while gaming. If your CPU usage is consistently higher than your GPU usage, or if you see the i5 pinned at 100%, then upgrading to an R5 3600 will give you a great boost; however, if your GPU is the one being hammered, the performance boost will be smaller.
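A rough way to script that same check: the utilisation samples would come from whatever overlay/logger you use (RivaTuner can export logs), and `likely_cpu_bound` plus its thresholds are made-up names for illustration, not part of any tool:

```python
# Heuristic for the check described above: a core pinned near 100% while the
# GPU still has headroom suggests the CPU is the limiter. Thresholds are
# arbitrary illustrative choices, not an established standard.

def likely_cpu_bound(cpu_samples, gpu_samples, pin=95.0):
    """cpu_samples: per-sample max core utilisation (%),
    gpu_samples: per-sample GPU utilisation (%)."""
    cpu_pinned = sum(c >= pin for c in cpu_samples) / len(cpu_samples)
    gpu_avg = sum(gpu_samples) / len(gpu_samples)
    # CPU-bound if a core is pinned most of the time while the GPU sits lower
    return cpu_pinned > 0.5 and gpu_avg < 90.0

# A pinned i5 with a coasting GPU -> CPU upgrade likely helps:
print(likely_cpu_bound([99, 100, 98, 97], [70, 65, 72, 68]))   # True
# GPU maxed while CPU coasts -> CPU upgrade helps far less:
print(likely_cpu_bound([60, 55, 62, 58], [99, 98, 100, 97]))   # False
```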

Personal Desktop:

CPU: Intel Core i7 10700K @5ghz |~| Cooling: bq! Dark Rock Pro 4 |~| MOBO: Gigabyte Z490UD ATX|~| RAM: 16gb DDR4 3333mhzCL16 G.Skill Trident Z |~| GPU: RX 6900XT Sapphire Nitro+ |~| PSU: Corsair TX650M 80Plus Gold |~| Boot:  SSD WD Green M.2 2280 240GB |~| Storage: 1x3TB HDD 7200rpm Seagate Barracuda + SanDisk Ultra 3D 1TB |~| Case: Fractal Design Meshify C Mini |~| Display: Toshiba UL7A 4K/60hz |~| OS: Windows 10 Pro.

Luna, the temporary Desktop:

CPU: AMD R9 7950XT  |~| Cooling: bq! Dark Rock 4 Pro |~| MOBO: Gigabyte Aorus Master |~| RAM: 32G Kingston HyperX |~| GPU: AMD Radeon RX 7900XTX (Reference) |~| PSU: Corsair HX1000 80+ Platinum |~| Windows Boot Drive: 2x 512GB (1TB total) Plextor SATA SSD (RAID0 volume) |~| Linux Boot Drive: 500GB Kingston A2000 |~| Storage: 4TB WD Black HDD |~| Case: Cooler Master Silencio S600 |~| Display 1 (leftmost): Eizo (unknown model) 1920x1080 IPS @ 60Hz|~| Display 2 (center): BenQ ZOWIE XL2540 1920x1080 TN @ 240Hz |~| Display 3 (rightmost): Wacom Cintiq Pro 24 3840x2160 IPS @ 60Hz 10-bit |~| OS: Windows 10 Pro (games / art) + Linux (distro: NixOS; programming and daily driver)

On 7/15/2019 at 11:54 AM, trojanhorse said:

I run i5 4670K @ 4.5GHz / Vega 64 Nitro+

 

I game 3440x1440 UW.

 

Will upgrading give me a massive boost in FPS? It's a £300+ upgrade to switch platform. Finding a buyer for my i5 system is gonna be hard.

 

I googled around and it seems the 2600 isn't gonna be worthwhile at £100. I can wait for the 3600 to drop.

 

 

 

At my resolution, I know I am GPU bound, but will a better CPU get me a 20% boost?

 


17 minutes ago, Lightsilent said:

I think the Vega 64 is holding you back. An RTX GPU will give you much more FPS, and to be honest that i5 4670K is gonna give better gaming performance than the Ryzen 3000 series.

Even overclocked, the 4670K is getting on a bit now and won't be able to keep up with the new Ryzen chips, even in single-core stuff. Also, the Vega 64 was pretty much neck and neck with the Nvidia GTX 1080, especially at 2K+ resolutions.

An upgrade to any of the new AMD CPUs would be a good call, really.

 

 


1 hour ago, Princess Luna said:

You can figure that out on your own: use something like the RivaTuner overlay to see your CPU and GPU usage while gaming. If your CPU usage is consistently higher than your GPU usage, or if you see the i5 pinned at 100%, then upgrading to an R5 3600 will give you a great boost; however, if your GPU is the one being hammered, the performance boost will be smaller.

Damn, didn't know RivaTuner could show CPU and GPU usage in-game!

 

Right, but is the 3600, a £300+ upgrade, gonna give me a bigger boost than something like a 2080 or a 2080 Ti, which would cost about the same after selling the Vega 64?


1 hour ago, Lightsilent said:

I think the Vega 64 is holding you back. An RTX GPU will give you much more FPS, and to be honest that i5 4670K is gonna give better gaming performance than the Ryzen 3000 series.

This is what I thought. I must have strong enough single-core perf, because even though I max my CPU out in every game, I get the FPS I want. It's just that in the next couple of years I want to upgrade to an ultrawide 144Hz FreeSync monitor; there is currently only one that is capable, and it has a lot of issues.


1 hour ago, caldrin said:

Even overclocked, the 4670K is getting on a bit now and won't be able to keep up with the new Ryzen chips, even in single-core stuff. Also, the Vega 64 was pretty much neck and neck with the Nvidia GTX 1080, especially at 2K+ resolutions.

An upgrade to any of the new AMD CPUs would be a good call, really.

 

 

At my resolution the FPS difference is pretty small though; it was only like 2-3 FPS for a £200-300 cost. I will grab the charts when I finish work.

 

EDIT:

 

Here are the charts - https://www.gamersnexus.net/hwreviews/2875-amd-r5-1600x-1500x-review-fading-i5-argument/page-4

 

As you can see, a comparable CPU is the stock 4690K (131.9 FPS) vs the 1700X (136 FPS).

THAT is at 1080p. 1440p UW is ~139% more pixels and FAR more GPU bound.
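A quick Python check of the pixel-count arithmetic (only the standard resolution figures are assumed; nothing thread-specific):

```python
# Pixel counts for the resolutions compared in this thread.
resolutions = {
    "1080p":    (1920, 1080),
    "1440p UW": (3440, 1440),
    "4K":       (3840, 2160),
}
pixels = {name: w * h for name, (w, h) in resolutions.items()}

# 1440p ultrawide vs 1080p: ~2.39x the pixels, i.e. ~139% more.
uw_vs_1080 = pixels["1440p UW"] / pixels["1080p"]
# 4K vs 1440p ultrawide: ~1.67x the pixels, i.e. ~67% more.
fourk_vs_uw = pixels["4K"] / pixels["1440p UW"]

print(round(uw_vs_1080, 2), round(fourk_vs_uw, 2))  # prints: 2.39 1.67
```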

 

At my resolution that difference shrinks from 4 FPS to something like 1-2.

 

EVEN if the 3600 were 50% more powerful in games than the 1700X (it's not), I would see maybe 4-5 FPS?

 

That's a £300+ platform upgrade for 4-5 FPS at MOST. Spending that on a GPU would yield far more than 5 FPS.

 

Any opinions?


12 minutes ago, trojanhorse said:

Right, but is the 3600, a £300+ upgrade, gonna give me a bigger boost than something like a 2080 or a 2080 Ti, which would cost about the same after selling the Vega 64?

Your system should be balanced; pairing a 2080 Super or 2080 Ti with a 4670K will cause terrible bottlenecking that could lead to an even worse experience due to stuttering and whatnot.

 

You should never have a great disparity in your hardware setup. A 2080 Super or 2080 Ti I would personally pair with a Ryzen 7 3700X at minimum, not even the 3600 to be honest... otherwise you'll have inconsistent performance, leading to a worse experience and under-utilization of the high-end GPU.

 

Upgrading the CPU alone in this case will help, because the latest games are fully capable of utilizing at least 4 cores. With a CPU that only has 4 cores / 4 threads, your game will be fighting for resources with your OS and background applications, so you should indeed see a performance gain with a newer CPU in the R5 3600 to R7 3700X range. But nobody can tell you these gains exactly... these things aren't that black and white.

 

1 hour ago, Lightsilent said:

to be honest that i5 4670K is gonna give better gaming performance than the Ryzen 3000 series.

The Ryzen 5 3600 will outperform the i5 4670K across the board. It has roughly the same single-thread performance as Skylake cores, effectively making it a 6c/12t i7 6700K. There is no instance where the i5 4670K would be better at gaming, and this can easily be seen in any legit YouTube benchmark.


53 minutes ago, Princess Luna said:

Your system should be balanced; pairing a 2080 Super or 2080 Ti with a 4670K will cause terrible bottlenecking that could lead to an even worse experience due to stuttering and whatnot.

 

You should never have a great disparity in your hardware setup. A 2080 Super or 2080 Ti I would personally pair with a Ryzen 7 3700X at minimum, not even the 3600 to be honest... otherwise you'll have inconsistent performance, leading to a worse experience and under-utilization of the high-end GPU.

  

Upgrading the CPU alone in this case will help, because the latest games are fully capable of utilizing at least 4 cores. With a CPU that only has 4 cores / 4 threads, your game will be fighting for resources with your OS and background applications, so you should indeed see a performance gain with a newer CPU in the R5 3600 to R7 3700X range. But nobody can tell you these gains exactly... these things aren't that black and white.

  

The Ryzen 5 3600 will outperform the i5 4670K across the board. It has roughly the same single-thread performance as Skylake cores, effectively making it a 6c/12t i7 6700K. There is no instance where the i5 4670K would be better at gaming, and this can easily be seen in any legit YouTube benchmark.

 

Please illustrate with benchmarks and numbers the disparity between a 4670K or similar with a 5700 XT against a 3600/3700X. Where are these YouTube benchmarks?


I found this one that seems to be somewhat accurate:

Gonna have to do a bit of approximation there; you ought to understand that the older a CPU gets, it's only natural that people won't cover it as much any more.


7 minutes ago, Princess Luna said:

I found this one that seems to be somewhat accurate:



Gonna have to do a bit of approximation there; you ought to understand that the older a CPU gets, it's only natural that people won't cover it as much any more.

 

Mate, great find! I don't know how I missed this.

 

Right, here's the problem. Go ahead and view that video PAST 1080p. As soon as that happens, the difference is gone. Look at Witcher 3 at 4K (1080 Ti): that is a legit margin-of-error 1 FPS difference. It isn't there.

 

 

Now here's the thing. Once you go past a 1080 Ti (I have a Vega, which is on par with a 1080) and grab a 2080 Ti, the difference is there: 20 FPS in most titles.

 

That's a card that costs 5x more (£200 vs £1200) than mine, and ONLY then does a meagre i5 begin bottlenecking the system.

 

 

Conclusion: up to the 1080 Ti, the i5 performs on par with the 3600X at 1440p and above.

 

Please prove me wrong. Otherwise, it looks like I'll be better off saving the £300+.


5 minutes ago, trojanhorse said:

Conclusion: up to the 1080 Ti, the i5 performs on par with the 3600X at 1440p and above.

The part that makes it wrong is that even if you're GPU bound, with a weak CPU, like I said, you'll experience stuttering and lower 1% and 0.1% lows that cause those micro-freezes... this happens whenever the CPU has no headroom left and you get the "competing for resources" scenario I described.

 

For the sake of convenience, let's broaden our CPU list here:

[Chart: Far Cry 5 1080p CPU benchmarks, from an i5-9600K review]

 

As you can see, looking only at the averages, the i5 8600K and i5 9600K appear to perform on par with the more expensive CPUs, so if we were to consider only this part of the story you could go i5 instead of i7, right?

 

However, because cores and threads are limited to only 6c/6t (it gets worse with 4c/4t parts like the 4670K), that fight for resources will cause larger random dips in performance, hammering down the 1% and 0.1% lows and causing the annoying "micro-freezes" that can be awfully immersion-breaking.

 

It all comes down to the necessity of keeping things balanced. Using the GPU-bound scenario at 4K to justify keeping an outdated 4c/4t CPU might backfire on you... also remember that 21:9 1440p is a lot of pixels already, but still significantly fewer than genuine 4K.
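For anyone unfamiliar with the metric: the 1% and 0.1% lows come from frame-time logs. A minimal sketch of one common way to compute them (exact methodology varies between reviewers, so treat this definition as an assumption):

```python
# 1% / 0.1% "lows": average FPS over the slowest pct% of frames in a capture.
# One common definition; some tools use a straight percentile cut instead.

def percentile_low(frametimes_ms, pct=1.0):
    worst = sorted(frametimes_ms, reverse=True)   # slowest frames first
    n = max(1, int(len(worst) * pct / 100))       # how many frames count
    avg_ms = sum(worst[:n]) / n                   # mean of the worst n
    return 1000.0 / avg_ms                        # convert ms -> FPS

# Example: a mostly-60 FPS capture with ten 50 ms stutter frames mixed in.
times = [16.7] * 990 + [50.0] * 10
print(round(percentile_low(times, 1.0), 1))          # 1% low: 20.0
print(round(1000.0 / (sum(times) / len(times)), 1))  # average FPS: 58.7
```

This is exactly the "micro-freeze" situation: the average barely moves, while the 1% low collapses.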


4 minutes ago, Princess Luna said:

The part that makes it wrong is that even if you're GPU bound, with a weak CPU, like I said, you'll experience stuttering and lower 1% and 0.1% lows that cause those micro-freezes... this happens whenever the CPU has no headroom left and you get the "competing for resources" scenario I described.

 

For the sake of convenience, let's broaden our CPU list here:

[Chart: Far Cry 5 1080p CPU benchmarks, from an i5-9600K review]

 

As you can see, looking only at the averages, the i5 8600K and i5 9600K appear to perform on par with the more expensive CPUs, so if we were to consider only this part of the story you could go i5 instead of i7, right?

 

However, because cores and threads are limited to only 6c/6t (it gets worse with 4c/4t parts like the 4670K), that fight for resources will cause larger random dips in performance, hammering down the 1% and 0.1% lows and causing the annoying "micro-freezes" that can be awfully immersion-breaking.

  

It all comes down to the necessity of keeping things balanced. Using the GPU-bound scenario at 4K to justify keeping an outdated 4c/4t CPU might backfire on you... also remember that 21:9 1440p is a lot of pixels already, but still significantly fewer than genuine 4K.

Yes, 4K is ~67% more pixels than 1440p UW.

 

I think those are a mistake, as the CPU comparable to mine, the 4790K, is outperforming those 1% lows. Yes, by the next cycle I will be a bit more screwed as 8-core consoles come into play. I know I have to upgrade; I just want to know if I can hold on for 1-2 more years until the AM5/whatever boards are out (so that I can upgrade the CPU and not the board later).

 

Also, the 2600 is literally on par with the 4790K. Stock. My OC should bring mine up to that level. I don't see 1% lows being affected there, and this is for a new game at 1080p with a 2080 Ti, so the difference is exaggerated further.

 

A Vega 64 / 5700 XT on an i5 at my resolution is going to show a smaller gap. If you can find any 1440p benchmarks (I have tried), I would love to see them with a realistic GPU. No one is pairing a £150 CPU with a £1200 GPU.


I feel like the arguments @Princess Luna made are pretty clear. A 4c/4t CPU like the 4670K is starting to reach the end of its life, with games slowly shifting to 6 cores. The point, however, is that if you're running a game on a 4670K you're going to see CPU usage hit 100% every once in a while, as the CPU cannot offer more performance to the game while it also has to balance the demands of the OS and other things. And it can definitely not do that as well as a 6c/12t CPU like the 3600, whose gaming performance is comparable to that of a 9600K. If I were you I'd upgrade, with the new Ryzen 3000 CPUs showing so much gaming performance improvement.

Ryzen build -  CPU: Ryzen 7 3700X Cooler: Corsair H115i Platinum RGB | GPU: RTX 2070 FE | RAM: 2x8GB Corsair Dominator Platinum RGB DDR4-3200MHz | PSU: EVGA SuperNOVA P2 750W | Motherboard: MSI X570 MEG Ace | Storage: Samsung 970 EVO 500 GB - Seagate Barracuda 2TB 7200 RPM | Case: Lian Li PC-O11 Dynamic

 

Intel build - CPU: i5-9600k @ 4.9 GHz - 1.28v Cooler: NZXT Kraken X62 rev 2 | GPU: GTX 980 Ti FE | RAM: 2x8GB Corsair Vengeace LPX DDR4-3200MHz | PSU: Corsair RM650x  | Motherboard: Gigabyte Z390 Aorus Ultra | Storage: Crucial MX500 500GB - Western Digital Blue 1TB 5400RPM | Case: NZXT H700 Black

 

Laptop - HP Pavillion; CPU: Core i5-7200U RAM: 8GB DDR4-2133MHz | GPU: Intel HD 620 | Storage: Samsung 128GB SSD - Western Digital 1TB HDD


I had an older 3440x1440 60Hz monitor, so CPU bottlenecking was not a problem since my i7 2600K could handle it. It did take a GTX 1080 to keep everything over 60 FPS, which was something my GTX 980 Ti could not do. With a Vega 64 I would struggle to maintain 60 FPS at the settings I like.

 

When I decided to upgrade to a 100Hz/120Hz 3440x1440 monitor, I ran some tests.

I had replaced my i7 2600K with an i7 8700K and the 1080 with a 1080 Ti. With that setup I could average 100 FPS with the settings I like in all the AAA games I tested. I could not get close to 120 FPS in most games.

 

I did use two GTX 1080 Tis in SLI, and that got all the games that supported SLI into the 144 FPS range.

I replaced the two 1080 Tis with one RTX 2080 Ti. Bottlenecking still was not an issue, but when I tested at 2560x1440 it was a very big issue; big enough for me to get rid of my i7 8700K. The i7 8700K did not fail in core count or even IPC. It failed because of frequency.

 

So for a 100Hz monitor I would go with an RTX 2080 or equivalent and a 6 to 8 core i7 or an R5 3600X/R7 3700X. At 120Hz I would go with an R9 or an 8-core Intel CPU. For 3440x1440 at 144Hz I would only get an i9 9900K. At this point the i9 9900K is a one-trick pony, and that is the trick.

RIG#1 CPU: AMD, R 7 5800x3D| Motherboard: X570 AORUS Master | RAM: Corsair Vengeance RGB Pro 32GB DDR4 3200 | GPU: EVGA FTW3 ULTRA  RTX 3090 ti | PSU: EVGA 1000 G+ | Case: Lian Li O11 Dynamic | Cooler: EK 360mm AIO | SSD#1: Corsair MP600 1TB | SSD#2: Crucial MX500 2.5" 2TB | Monitor: ASUS ROG Swift PG42UQ

 

RIG#2 CPU: Intel i9 11900k | Motherboard: Z590 AORUS Master | RAM: Corsair Vengeance RGB Pro 32GB DDR4 3600 | GPU: EVGA FTW3 ULTRA  RTX 3090 ti | PSU: EVGA 1300 G+ | Case: Lian Li O11 Dynamic EVO | Cooler: Noctua NH-D15 | SSD#1: SSD#1: Corsair MP600 1TB | SSD#2: Crucial MX300 2.5" 1TB | Monitor: LG 55" 4k C1 OLED TV

 

RIG#3 CPU: Intel i9 10900kf | Motherboard: Z490 AORUS Master | RAM: Corsair Vengeance RGB Pro 32GB DDR4 4000 | GPU: MSI Gaming X Trio 3090 | PSU: EVGA 1000 G+ | Case: Lian Li O11 Dynamic | Cooler: EK 360mm AIO | SSD#1: Crucial P1 1TB | SSD#2: Crucial MX500 2.5" 1TB | Monitor: LG 55" 4k B9 OLED TV

 

RIG#4 CPU: Intel i9 13900k | Motherboard: AORUS Z790 Master | RAM: Corsair Dominator RGB 32GB DDR5 6200 | GPU: Zotac Amp Extreme 4090  | PSU: EVGA 1000 G+ | Case: Streacom BC1.1S | Cooler: EK 360mm AIO | SSD: Corsair MP600 1TB  | SSD#2: Crucial MX500 2.5" 1TB | Monitor: LG 55" 4k B9 OLED TV

