
CPU frequency

Hello, I would like to ask how much impact CPU clock/frequency has on gaming. What is a standard core count/clock for 720p/1080p/1440p gaming? I'm not talking about entry-level hardware, but mid to high tier. It would be great if some of you could try some benchmarks at different clocks.

 

I have tried the AC Valhalla benchmark with graphics set to Ultra, at 720p/1080p/1440p resolution.

My PC specs: X470, R7 2700X, 32GB DDR4-3200, RX 6700 XT, W11, M.2 NVMe SSD

Looking at average FPS, 720p showed some difference; CPU usage was at 50-70% max.

Do you guys know of any other title or scenario where clock speed has a huge impact on gaming?

[attachment: summary.png]

In order for frequency to really matter, you need to be CPU bound. If you want to test this in a realistic way, I'd try something where you'd play with low settings, like a competitive shooter. If you just want to test it for the sake of it, drop settings to low in AC Valhalla and put it on 720p again. You'll see a difference.

It's not clock speed so much as single-core performance, and to a lesser extent multi-core performance. Clock speed is just one factor in that, along with IPC (instructions per clock cycle), branch prediction, etc.

CPU: AMD Ryzen 9 5900X · Cooler: Arctic Liquid Freezer II 280 · Motherboard: MSI MEG X570 Unify · RAM: G.Skill Ripjaws V 2x16GB 3600MHz CL16 (2Rx8) · Graphics Card: ASUS GeForce RTX 3060 Ti TUF Gaming · Boot Drive: 500GB WD Black SN750 M.2 NVMe SSD · Game Drive: 2TB Crucial MX500 SATA SSD · PSU: Corsair White RM850x 850W 80+ Gold · Case: Corsair 4000D Airflow · Monitor: MSI Optix MAG342CQR 34” UWQHD 3440x1440 144Hz · Keyboard: Corsair K100 RGB Optical-Mechanical Gaming Keyboard (OPX Switch) · Mouse: Corsair Ironclaw RGB Wireless Gaming Mouse

1 hour ago, Kean01 said:

Hello, I would like to ask how much impact CPU clock/frequency has on gaming. What is a standard core count/clock for 720p/1080p/1440p gaming? I'm not talking about entry-level hardware, but mid to high tier. It would be great if some of you could try some benchmarks at different clocks.



It'll depend on the workload you toss at it and the rest of your system.

For laughs, let's assume you have an RTX 3090, 32GB of fast dual-rank-per-channel RAM, and an 8-core Alder Lake CPU with the E-cores disabled (only performance cores running).

Between 1 and 2 GHz, you'll likely see near 2x the frame rate in many titles. Between 2 and 4 GHz, you'll probably get something like a 30-100% gain depending on the title/settings. Going from 4 GHz to 5 GHz on this CPU won't do much.


For most people, if the use case is gaming and you're not multitasking, you don't need to worry about the CPU. The CPU mainly matters if you run a FAST GPU at relatively low settings.

3900x | 32GB RAM | RTX 2080

1.5TB Optane P4800X | 1 TB Adata XPG Pro | 2TB Micron 1100 SSD
QN90A | Emotiva B1+, ELAC OW4.2, PB12-NSD, HD800
 

Tried the Low preset; CPU usage was 50-60%, overall slightly lower.

I can see a 10-15% difference now, but nothing I would call a game changer or CPU-limited.

[attachment: low.png]

3 minutes ago, cmndr said:



It'll depend on the workload you toss at it and the rest of your system.

For laughs, let's assume you have an RTX 3090, 32GB of fast dual-rank-per-channel RAM, and an 8-core Alder Lake CPU with the E-cores disabled (only performance cores running).

Between 1 and 2 GHz, you'll likely see near 2x the frame rate in many titles. Between 2 and 4 GHz, you'll probably get something like a 30-100% gain depending on the title/settings. Going from 4 GHz to 5 GHz on this CPU won't do much.


For most people, if the use case is gaming and you're not multitasking, you don't need to worry about the CPU. The CPU mainly matters if you run a FAST GPU at relatively low settings.

I wanted to find some sort of baseline, something like: an 8-core/16-thread CPU at a 3.0+ GHz clock will never leave you CPU bound in gaming.

2 minutes ago, Kean01 said:

I wanted to find some sort of baseline, something like: an 8-core/16-thread CPU at a 3.0+ GHz clock will never leave you CPU bound in gaming.

It'll depend on the game.

Also "not CPU bound" is arguably a poor goal. Ideally you're more concerned with some sort of minimum level of performance for a very specific use case. Think "above 120FPS 99% of the time in CS GO on X map and above 60FPS 99.9% of the time."
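A target like that can be checked directly from a frame-time log. A minimal sketch in Python (the frame times below are made-up numbers purely to show the calculation, not measurements):

```python
# Check a '99% of frames at or above a target FPS' style goal from frame times (ms).
# These frame times are invented for illustration; one 25 ms stutter is included.
frame_times_ms = [7.5, 8.0, 8.3, 7.9, 9.1, 8.0, 25.0, 8.2, 7.8, 8.1]

def hits_target(times_ms, min_fps, fraction):
    """True if at least `fraction` of frames render within the min_fps time budget."""
    budget_ms = 1000.0 / min_fps                      # e.g. 120 FPS -> 8.33 ms budget
    fast = sum(1 for t in times_ms if t <= budget_ms)
    return fast / len(times_ms) >= fraction

print(hits_target(frame_times_ms, min_fps=120, fraction=0.99))  # False: only 80% make the 8.3 ms budget
print(hits_target(frame_times_ms, min_fps=100, fraction=0.80))  # True: 90% make the 10 ms budget
```

In practice you would feed this a real capture (e.g. from a frame-time logging tool) rather than a hand-typed list.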

 

Also how much clock speed matters will vary by the CPU and RAM you use.

There are a lot of moving parts. The CPU is usually NOT an area to obsess over, though. There are 10-year-old CPUs (the 2600K) that are serviceable today for many use cases. The same can't be said of the fastest GPUs of 2012 (GTX 680).

1 hour ago, cmndr said:

It'll depend on the game.

Also "not CPU bound" is arguably a poor goal. Ideally you're more concerned with some sort of minimum level of performance for a very specific use case. Think "above 120FPS 99% of the time in CS GO on X map and above 60FPS 99.9% of the time."

 

Also how much clock speed matters will vary by the CPU and RAM you use.

There are a lot of moving parts. The CPU is usually NOT an area to obsess over, though. There are 10-year-old CPUs (the 2600K) that are serviceable today for many use cases. The same can't be said of the fastest GPUs of 2012 (GTX 680).

So technically, looking at the % decrease/increase around some baseline, it's a relative loss or gain. In theory, taking a baseline of, say, 4.0 GHz, decreasing the clock by 17% (to 3.3 GHz) results in an equal or smaller % FPS drop (14% in this case), and vice versa, which would trace out some sort of curve. That would show numbers like "going below 3.0 GHz will hurt performance", etc. Yes, it would differ in every scenario/setup, but it would give some broad insight.

With a few more tests, looking at the MHz/FPS ratio, it seems that for this 8-core/16-thread CPU the sweet spot is 2700 MHz (1% lows above 60 FPS); going higher or lower gives a worse ratio.

I wanted to know if there is a similar pattern for, say, 4/8 or 6/12 core CPUs, or perhaps Intel CPUs.

[attachment: summ2.png]

5 hours ago, Kean01 said:

So technically, looking at the % decrease/increase around some baseline, it's a relative loss or gain. In theory, taking a baseline of, say, 4.0 GHz, decreasing the clock by 17% (to 3.3 GHz) results in an equal or smaller % FPS drop (14% in this case), and vice versa, which would trace out some sort of curve. That would show numbers like "going below 3.0 GHz will hurt performance", etc. Yes, it would differ in every scenario/setup, but it would give some broad insight.

I would think of gains in terms of Amdahl's law:

https://en.wikipedia.org/wiki/Amdahl's_law

Imagine you have 3 main parts in your system.
1. CPU
2. RAM
3. Videocard

The time spent rendering a frame that takes 16 ms to calculate (so ~60 FPS) might break down something like:
75% GPU, 20% CPU, 5% RAM
so 12 ms, 3.2 ms, 0.8 ms respectively.

Making the GPU 2x as fast brings the frame rendering time down to 10 ms overall (since half of the 12 ms goes away).
10 ms corresponds to 100 FPS.
 

 

Now do another example where it's the CPU getting 2x as fast.
The CPU calculation time goes from 3.2 ms to 1.6 ms, so the overall frame time is 14.4 ms, or around 70 FPS.
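The arithmetic above can be sketched in a few lines of Python. This is a toy model; the 75/20/5 split is the assumption from this example, not a measured number:

```python
# Toy Amdahl-style frame-time model: speed up one component, see the overall FPS.
def fps_after_speedup(total_ms, fractions, speedups):
    """fractions: each component's share of frame time; speedups: factor per component."""
    new_ms = sum(total_ms * f / s for f, s in zip(fractions, speedups))
    return 1000.0 / new_ms

split = (0.75, 0.20, 0.05)  # GPU, CPU, RAM shares of a 16 ms (~60 FPS) frame

print(fps_after_speedup(16.0, split, (2, 1, 1)))  # GPU 2x: 16 ms -> 10 ms -> 100.0 FPS
print(fps_after_speedup(16.0, split, (1, 2, 1)))  # CPU 2x: 16 ms -> 14.4 ms -> ~69.4 FPS
```

The model makes the non-linearity obvious: doubling the component that only takes 20% of the frame time can never buy more than ~25% FPS, no matter how fast it gets.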


Now if you look at it from a rough GHz perspective: we don't have 10 GHz CPUs, and in that example going from 5 GHz to 10 GHz wouldn't really matter anyway.


To give a rough feel for what this looks like...
https://www.techpowerup.com/review/intel-core-i9-12900k-alder-lake-tested-at-various-power-limits/6.html

[image: relative performance in games, 3840x2160 (from the linked TechPowerUp review)]

Basically ALL of the CPUs are almost the same with a 3080 at 4K. If your GPU is half as powerful, the difference will be even smaller. If you have something like a 1080 and you play at 1080p, this will ROUGHLY be the difference between one CPU and another.

----

 

In general, expect any gains from speeding up the CPU to be non-linear (going from 1 GHz to 2 GHz gives a HUGE boost; going from 4 GHz to 5 GHz barely matters). Quantifying the overall experience is trickier, though, because there will be moments where the CPU matters more than the GPU, but those moments are infrequent. Think 10% of the frames. And the gap between CPUs is NOTHING like the gap between GPUs.


[chart: Average Gaming FPS, 3840x2160]

 

The fastest GPU gives 10x the overall FPS of the slowest (the 1060).

7 hours ago, Kean01 said:

With a few more tests, looking at the MHz/FPS ratio, it seems that for this 8-core/16-thread CPU the sweet spot is 2700 MHz (1% lows above 60 FPS); going higher or lower gives a worse ratio.

I wanted to know if there is a similar pattern for, say, 4/8 or 6/12 core CPUs, or perhaps Intel CPUs.

[attachment: summ2.png]

GHz and cores are completely dependent on the architecture. If you got an old 8-core Xeon from 12 years ago, it would clock above 2.7 GHz, but it would also struggle to give you a solid 60 FPS. A 4-core i3-12100 downclocked to 2.7 GHz, on the other hand, would probably give you better average FPS than you're seeing there.

 

You can only compare cores and frequency within an architectural family, so your results are only relevant for Zen+ desktop chips that use chiplets. The 3400G, for example, despite being Zen+, is not comparable because it is monolithic, and any other architecture (be it Zen 3, Skylake, Haswell, Bulldozer, etc.) will give you different results.
