Yeah, are the newer 4-core CPUs good enough? Meaning for gaming.

I have a 10th Gen i3-10100F with a GTX 1660 on a Gigabyte B460M DS3H V2 with memory at 2666MHz. It seems to do very well for an entry-level CPU. Seems like quad cores are enough for a lot of games; GTA 5 keeps up well on this chip. I am just wondering how the higher-tier 10th and 11th gen parts do.


GTA V is an old game at this point; it came out when 4 cores were king.

There are diminishing returns, but 6c/12t CPUs are the best mid-to-high-end gaming CPUs right now.

I could use some help with this!

Please PM me if you would like to contribute to my GPU BIOS database (it includes overclocking BIOSes, stock BIOSes, and upgrades to GPUs via modding).

BIOS database



Upgradeability! I will consider an i5-10600KF in the future. It's meeting my needs at 1920x1080 60Hz right now. Remember, I said entry level. It's also a nice way to get into Windows 11 with the i3-10100F.


That is a great CPU for the $/performance.  

Ryzen 7 5800X w/Noctua NH-D15S / Gigabyte RTX 2070 Super / 32Gb Vengeance 3200 (4x8) / MSI X470

Sabrent Rocket 512Gb m.2 / 2x1Tb T-Force Vulcan SATA SSD / Seasonic Focus GX-750

Dell S3220DGF - curved 32" 1440p 165Hz

 

Ryzen 5 3600 w/Stealth / MSI RTX 2060 Gaming Z / 16Gb Vengeance 2666 / MSI X470

Intel 660p 1tb / 1Tb T-Force Vulcan SATA SSD / Thermaltake Smart 600W 80+

HP 32Q - 32" 1440p 60hz


The things the higher-tier CPUs bring are higher core counts and higher core clocks. A lot of games still don't use more than 4 cores, and the ones that do rarely use more than 6-8. If you're going to buy a new CPU, I usually recommend the higher core count parts because of the way games are trending, as well as how they help with multitasking (i.e. running a bunch of Chrome tabs and stuff in the background), though for the most part, unless you're doing something that needs it, anything above 6 cores isn't useful.
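As a side note, you can see how many logical processors the OS exposes straight from Python's standard library (a minimal sketch; note that `os.cpu_count()` counts threads, not physical cores, so a 4c/8t chip shows up as 8):

```python
import os

# Logical processor count (physical cores x threads per core) as the OS sees it.
# A 4c/8t chip like the i3-10100F reports 8; a 10c/20t 10900K reports 20.
print(f"Logical processors: {os.cpu_count()}")
```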

 

That being said, if it does what you need it to do, it's a fine CPU. I'm not trying to say that chip is bad; if I needed a cheap system, it would be one of my first choices.


In most situations, yes, a modern 4c/8t CPU is enough for gaming. You won't be able to push max FPS in the latest and greatest games, but I've never seen a benchmark where a modern quad-core with HT can't get you at least 60fps on average with enough GPU horsepower. Honestly, your GTX 1660 is probably holding you back more than the i3 is.

 

And honestly, it isn't necessarily about the cores. Hardware Unboxed did a recent series of videos looking at core and cache scaling and found that, for gaming at least, cache tends to have more impact on FPS than cores do. Up to a point. The final video looked at quad-core performance and the worst performance by the 10105F was 73fps in SotTR. That's perfectly playable. And for some modern games, it doesn't matter at all. Shockingly, in AC:Valhalla, there was functionally no difference between the 10900K and 10105F, even when all 10 cores were left enabled on the i9.

 

40 minutes ago, YoungBlade said:

And honestly, it isn't necessarily about the cores. Hardware Unboxed did a recent series of videos looking at core and cache scaling and found that, for gaming at least, cache tends to have more impact on FPS than cores do. Up to a point. The final video looked at quad-core performance and the worst performance by the 10105F was 73fps in SotTR. That's perfectly playable. And for some modern games, it doesn't matter at all. Shockingly, in AC:Valhalla, there was functionally no difference between the 10900K and 10105F, even when all 10 cores were left enabled on the i9.

 

I am not a fan of that video.

I recently got back my i7 6700k/1080 rig that was loaned out as a "work at home computer" and it is rough in games compared to a 6 core. The 6 core feels rough compared to an 8 core.

Just running a bench won't do it. You have to have your hands on the mouse and keyboard to feel the difference.  

 

AC:Valhalla is just strange. It is the only game where my Intel 6 core beats my Intel 10 core, and it hates my 5800X, since it scores lower than both.

 

As even HUB says in some videos, it is not all about the frames.

RIG#1 CPU: AMD, R 7 5800X| Motherboard: X570 AORUS Master | RAM: Corsair Vengeance RGB Pro 32GB DDR4 3200 | GPU: EVGA RTX 3080 ti XC3 ULTRA | PSU: EVGA 1000 G+ | Case: Cooler Master H500P Mesh White | Cooler: EK 360mm AIO | SSD#1: Corsair MP600 1TB | SSD#2: Crucial MX500 2.5" 2TB | Monitor: LG 55" 4k B9 OLED TV

 

RIG#2 CPU: Intel i9 10900kf | Motherboard: Z490 AORUS Master | RAM: Corsair Vengeance RGB Pro 32GB DDR4 4000 | GPU: MSI Gaming X Trio 3080 ti | PSU: EVGA 1000 G+ | Case: Lian Li O11 Dynamic | Cooler: SilverStone PF360-ARGB AIO | SSD#1: Crucial P1 1TB | SSD#2: Crucial MX500 2.5" 1TB | Monitor: LG 55" 4k B9 OLED TV

 

RIG#3 CPU: Intel i9 9900k | Motherboard: AORUS Z390 Ultra | RAM: Ripjaws V Series 32GB DDR4 3200 | GPU: ASUS  RTX 3080 White OC  | PSU: EVGA 1000 G+ | Case: Cooler Master H500 | Cooler: Noctua NH-D15 | SSD: Crucial P2 1TB  | SSD#2: Samsung 860 QVO 2TB | Monitor: LG 49" NanoCell 85

6 minutes ago, jones177 said:

I am not a fan of that video.

I recently got back my i7 6700k/1080 rig that was loaned out as a "work at home computer" and it is rough in games compared to a 6 core. The 6 core feels rough compared to an 8 core.

Just running a bench won't do it. You have to have your hands on the mouse and keyboard to feel the difference.  

 

AC:Valhalla is just strange. It is the only game where my Intel 6 core beats my Intel 10 core, and it hates my 5800X, since it scores lower than both.

 

As even HUB says in some videos, it is not all about the frames.

Were you targeting 60fps gaming or higher?

1 minute ago, YoungBlade said:

Were you targeting 60fps gaming or higher?

The only games where I target 60fps are old Bethesda games.

All my frame rates are over 60 with my setups now, unless I use ray tracing at 4K.

 

When I bought the i7 6700K in 2016, I did target 60fps at 4K with GTX 980 Tis in SLI.

The 4 core had drops in games that did not exist once I upgraded to an i7 8086K.

The drops were in the 40s, so not as severe as the ones I got with the i7 2600K, which were in the 30s, but I still felt them. Going to 6 cores was a big jump in immersion in games, and 8 cores is so good that I can't play on a 6 core anymore.

 


13 minutes ago, jones177 said:

The only games where I target 60fps are old Bethesda games.

All my frame rates are over 60 with my setups now, unless I use ray tracing at 4K.

 

When I bought the i7 6700K in 2016, I did target 60fps at 4K with GTX 980 Tis in SLI.

The 4 core had drops in games that did not exist once I upgraded to an i7 8086K.

The drops were in the 40s, so not as severe as the ones I got with the i7 2600K, which were in the 30s, but I still felt them. Going to 6 cores was a big jump in immersion in games, and 8 cores is so good that I can't play on a 6 core anymore.

 

It's a bit surprising that you had drops that low, but that could also be a consequence of SLI.

 

As for the HUB video, I don't see why you have any issues with it. It does show that 4 cores struggle to get above 100fps in modern games. Since you target higher than that, you can see that you have to get at least a 6 core. It also shows that this isn't necessarily caused by the cores themselves. It looks like the Intel Skylake through Comet Lake cores want 3-4MB of L3 cache each. Your 6700K only has 8MB shared. If it had 12-16MB, it would probably perform noticeably better in games, especially for the 1% lows.
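To put rough numbers on the cache-per-core idea, here's a quick sketch dividing each chip's shared L3 by its core count (L3 sizes from the public spec sheets; the 3-4MB "sweet spot" is HUB's observation, not an official figure):

```python
# Shared L3 cache (MB) and physical core count for the CPUs discussed in this thread.
cpus = {
    "i7-6700K":      (8, 4),
    "i3-10105F":     (6, 4),
    "i5-11600K":     (12, 6),
    "i9-10900K":     (20, 10),
    "Ryzen 7 5800X": (32, 8),
}

for name, (l3_mb, cores) in cpus.items():
    # The 6700K and 10900K both land at 2.0 MB/core; the 5800X gets 4.0.
    print(f"{name}: {l3_mb / cores:.1f} MB of L3 per core")
```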

18 minutes ago, YoungBlade said:

It's a bit surprising that you had drops that low, but that could also be a consequence of SLI.

I did it with a single GTX 980 Ti, GTX 1080, and GTX 1080 Ti.

I also had GTX 1080 Tis in SLI with the i9 9900K that I am using now, and only dropped it when it got the RTX 2080 Ti. For games that did not like SLI, it was off.

18 minutes ago, YoungBlade said:

As for the HUB video, I don't see why you have any issues with it. It does show that 4 cores struggle to get above 100fps in modern games. Since you target higher than that, you can see that you have to get at least a 6 core. It also shows that this isn't necessarily caused by the cores themselves. It looks like the Intel Skylake through Comet Lake cores want 3-4MB of L3 cache each. Your 6700K only has 8MB shared. If it had 12-16MB, it would probably perform noticeably better in games, especially for the 1% lows.

I have an issue with the video because the CPUs with more cache win. How many 4 core CPUs have big caches? The i7 7700K had 8MB as well, and the Ryzens have less. Why were the Ryzen 4 cores not shown as well?

 

Going from an i7 6700K to an i7 8700K was not a big jump in frames at 1440p and 4K. It was a big jump in immersion.

 

If the i7 6700K had a large cache, it probably would perform better, but in my tests it was Windows and background apps that were mainly causing the drops.

 

My experience has been more like the Tech Deals videos when it comes to cores. He thinks the 5900X is ideal.

 

I did give the video a thumbs up, since I loved my i9 10900K, and after giving it to my son for video work I missed it so much that I built another one with an i9 10900KF.



I am thinking an i5-11600K or KF in the future, to eventually get the computer to keep up with 4K 60. 11th gen is also an option; it might be enough of a boost, plus I have the board already. The CPU is on the support list as of BIOS F20 for my Gigabyte B460M DS3H V2 mainboard, and I have BIOS F21 installed for a great range of CPUs. In fact I would like an i5-11600K or KF for this system with a beefy heatsink: 3.9 to 4.9GHz, 6 cores, 12 threads. I'd probably see 4.7GHz with that. It can support 2933 and 3200MHz memory, though I'll find out if this board is limited to 2666. Of course, an i5-11600K has a 125 watt TDP vs the standard i5-11600's 65 watt TDP.


Yes, quad cores with hyperthreading seem to be enough for gaming. My i3-10100F with a GTX 1660 and 16GB@2666MHz is enough. It will struggle in newer games, but will still run them at medium settings.

14 hours ago, Eric Kazer said:

I am thinking an i5-11600K or KF in the future, to eventually get the computer to keep up with 4K 60.

The higher you go with your resolution, the less important the CPU gets. You'll probably not even notice a big difference between the 10100F and the 11600K in that use case.
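One way to see why: as a toy model, treat each frame as taking max(CPU time, GPU time). Raising the resolution inflates only the GPU side, so a faster CPU stops mattering once the GPU is the bottleneck (the millisecond figures below are made up for illustration, not benchmarks):

```python
def fps(cpu_ms, gpu_ms):
    # Simplified pipeline model: the slower stage gates the frame rate.
    return 1000 / max(cpu_ms, gpu_ms)

# At "4K" the GPU stage dominates, so halving CPU time changes nothing...
print(fps(cpu_ms=8.0, gpu_ms=25.0))  # -> 40.0
print(fps(cpu_ms=4.0, gpu_ms=25.0))  # -> 40.0

# ...while at "1080p" the CPU becomes the bottleneck and the upgrade shows.
print(fps(cpu_ms=8.0, gpu_ms=5.0))   # -> 125.0
print(fps(cpu_ms=4.0, gpu_ms=5.0))   # -> 200.0
```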

7 hours ago, Eric Kazer said:

Yes, quad cores with hyperthreading seem to be enough for gaming. My i3-10100F with a GTX 1660 and 16GB@2666MHz is enough. It will struggle in newer games, but will still run them at medium settings.

If you're CPU limited you can typically use higher settings without losing much - if any - performance.

 

7 hours ago, Eric Kazer said:

An 11600K or KF is better.

The KF is exactly the same CPU as the K, minus the integrated graphics. The iGPU can be a huge value add if you're without a graphics card, but since you have a 1660, you don't really need it, so go for whichever is cheaper.

Current Specs:

CPU: AMD Ryzen 5 5600X - Motherboard: ASUS ROG Strix B550-E - GPU: PNY RTX 3080 XLR8 Epic-X - RAM: 4x8GB (32GB) G.Skill TridentZ RGB 3600MHz CL16 - PSU: Corsair RMx (2018) 850W - Storage: 500 GB Corsair MP600 (Boot) + 2 TB Sabrent Rocket Q (Storage) - Cooling: EK, HW Labs & Alphacool custom loop - Case: Lian-Li PC O11 Dynamic - Fans: 6x Noctua NF-A12x25 - AMP/DAC: FiiO K5 Pro - OS: Windows 11 preview - Monitor: ASUS ROG Swift PG35VQ - Mouse: Logitech G Pro + Powerplay - Keyboard: Logitech G915 TKL - Headphones: Beyerdynamic Amiron Home - Microphone: Antlion ModMic

 

Temperatures @ steady state: Furmark + Cinebench R23 running for 1 hour. Fans @850RPM. Pump @1600RPM.

Water: 37°C

CPU: 73°C

GPU: 54°C

16 hours ago, Eric Kazer said:

I will still go with an i5-10600K or KF in the future.

Pretty good idea


It's a tough argument, because GPUs are unaffordable for me and many of us, while CPUs kinda still are affordable. Also, higher clocks matter: the 10100 is a slower chip than the 10600 or 11600. And caches matter.

On 9/19/2021 at 8:44 PM, Eric Kazer said:

YoungBlade,

CPUs are less in cost than GPUs. 

But not by much... 

Here's my honest answer: yes, 4 cores is "good" enough, but generally the more cores the better (at similar speed); it's also more future-proof...

Lots of people are saying now that 6 cores is enough, but this is IMO quite shortsighted, as for multitasking more cores surely is better, not just in the future but already currently.

 

So tl;dr: 4 cores are good enough, with exceptions, but it's surely not great, and I would advise getting a modern 6c/12t processor at minimum.

 

 

AMD stands for Advanced Micro Machines

-ColdFusion, 2021

On 9/17/2021 at 4:08 PM, YoungBlade said:

Shockingly, in AC:Valhalla, there was functionally no difference between the 10900K and 10105F, even when all 10 cores were left enabled on the i9.

 

 

On 9/17/2021 at 5:00 PM, jones177 said:

AC:Valhalla is just strange. It is the only game that my Intel 6 core beats my Intel 10 core and it hates my 5800x since it scores lower than both.

 

As even HUB say in some videos. It is not all about the frames.  

 

I was playing Valhalla without SMT enabled for a bit. My average framerate was around 60-63, maxing out at 80. With SMT enabled now, my FPS averages around 72, and I've seen a maximum of 113. I'm at 3440x1440 on a Samsung Odyssey G5 with a 165Hz refresh rate. I was expecting higher framerates at this point, but multi-core doesn't show much of an improvement. I just think that Assassin's Creed: Valhalla isn't properly optimised for PC.

4 hours ago, chrome_wheels said:

 

 

I was playing Valhalla without SMT enabled for a bit. My average framerate was around 60-63, maxing out at 80. With SMT enabled now, my FPS averages around 72, and I've seen a maximum of 113. I'm at 3440x1440 on a Samsung Odyssey G5 with a 165Hz refresh rate. I was expecting higher framerates at this point, but multi-core doesn't show much of an improvement. I just think that Assassin's Creed: Valhalla isn't properly optimised for PC.

I don't think it is optimized for high core counts at all.

 

Here are my results with a stock i9 10900K and an i7 8086K at 5GHz on all cores.

The game does not use the preferred Intel cores on my i9 10900K to feed the GPU. Cores 1 and 2 are preferred and will hit 5.3GHz, but AC:Valhalla uses core 6 (thread 11), which runs at 4.9GHz, every time the game is loaded.

[Screenshots: AC:Valhalla benchmark results at 4K, 1440p, and 1080p on the i9 10900K and i7 8086K]
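If anyone wants to experiment with which cores a game (or any process) gets scheduled on, the OS exposes a per-process affinity mask. A minimal sketch with Python's standard library (the `sched_*` calls are Linux-only; on Windows you'd use Task Manager's "Set affinity" instead):

```python
import os

# Linux-only: query and modify which logical CPUs this process may run on.
if hasattr(os, "sched_getaffinity"):
    allowed = os.sched_getaffinity(0)  # 0 = the current process
    print(f"Allowed CPUs: {sorted(allowed)}")

    # Pin the process to a single core, then restore the original mask.
    os.sched_setaffinity(0, {min(allowed)})
    print(f"Pinned to: {sorted(os.sched_getaffinity(0))}")
    os.sched_setaffinity(0, allowed)
```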

 

 

 


