
Correlation between frame rate and refresh rate


Hello everyone,

So recently a friend and I got into a discussion about whether to cap your frame rate at the same value as your monitor's refresh rate, or to just leave it unlimited, or to limit it to something high like 300.

I was saying that a frame rate higher than 144 on a 144 Hz screen would make no difference and would just be a waste of hardware performance, while my friend was saying that 300 frames per second would be much better because it shows more recent frames.


So it really got me wondering: is there actually a difference?
I set up a quick example for my friend to prove a point. I said: imagine you had a 24 Hz screen running at 24 frames per second. The catch was that the frame rate had to be one hundred percent stable, which is not realistic.
Anyway, I made a quick illustration showing that having a higher fps than your monitor's refresh rate would not show more recent frames. And when I made it, I saw something that really got me wondering.

 

Illustration: [image]

OK, so I proved my point: it does not show more recent frames than if you match the refresh rate and frame rate perfectly, because the screen cannot show frames that have not been created yet (that would be time travel). But I noticed that having a higher frame rate would show delayed frames!

 

So I opened a spreadsheet and started doing the math. I converted everything into milliseconds, which is an easy equation, (1000 * 1/x), with x being the refresh rate or frame rate. That quickly turns the hertz or frames per second into how many milliseconds each update takes.
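To make that concrete, here is a tiny Python sketch of the same formula (using the 144 Hz and 300 fps numbers from my spreadsheet):

```python
# Convert a refresh rate (Hz) or frame rate (fps) into milliseconds per update,
# i.e. the (1000 * 1/x) formula from the spreadsheet.
def ms_per_update(x):
    return 1000.0 / x

print(ms_per_update(144))  # ~6.944 ms between monitor refreshes
print(ms_per_update(300))  # ~3.333 ms between rendered frames
```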

 

Spreadsheet screenshot: [image] (full spreadsheet attached)

All the grey rows are 300 fps frames that would never be shown on a 144 Hz screen, because they would be replaced before the screen had a chance to show them, so they are not relevant.

So I noticed that, with a one hundred percent stable frame rate, having a frame rate higher than your monitor's refresh rate would create a delay.

 

It shows that with 300 frames per second on a 144 Hz screen, the delay would grow from ~0.2 milliseconds up to ~3 milliseconds, and after 25 frames it would hit zero delay and start growing toward ~3 milliseconds again.
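For anyone who prefers code to a spreadsheet, here is a rough Python sketch of the same logic, under my (unrealistic) assumption that the frame rate is perfectly stable and starts exactly in phase with the refresh clock; it reproduces the same sawtooth delay pattern:

```python
# For each monitor refresh, find the newest frame that has finished rendering
# and how old it is at that moment. Assumes a perfectly stable 300 fps that
# starts in phase with the 144 Hz refresh clock (the same idealized setup as
# the spreadsheet).
FPS = 300
HZ = 144

frame_time = 1000.0 / FPS      # ms per rendered frame
refresh_time = 1000.0 / HZ     # ms per monitor refresh

for r in range(1, 13):                          # 12 refreshes = 25 frames, then it repeats
    refresh_at = r * refresh_time               # when this refresh happens (ms)
    frames_done = (r * FPS) // HZ               # frames finished by then (exact integer math)
    newest_frame_at = frames_done * frame_time  # when the newest finished frame completed (ms)
    delay = refresh_at - newest_frame_at        # how old that frame is when it gets shown
    print(f"refresh {r:2d} at {refresh_at:7.3f} ms -> frame {frames_done:2d}, delay {delay:.3f} ms")
```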

 

(Please note that I did not take monitor response time or fluctuations in frame rate into consideration.)

 

And it really confused me, because it created delays of up to three whole milliseconds; when we started our discussion I didn't even think this was the case. I must admit that while I know a lot about computers, this is not my specialty. I tried to do a little reading but couldn't find anything on this specific topic. I am pretty sure this is not the case in reality, both because of fluctuations in frame rate and because professional e-sports players would otherwise be unintentionally creating delay, which I am pretty sure they would not want.

 

So I was wondering if any smart heads who know a lot about these things could tell me whether this is actually happening, or whether I have just misunderstood the whole principle.

 

I attached the full spreadsheet, which shows all frames and delays for one whole second, i.e. the delays for all 300 frames (minus 162 because they are replaced).

 

And please do correct me if I am wrong; that is why I am posting this :)

 

Thanks in advance!

//Magnus5405

refresh rate and frame rate.xlsx


The computer pushes frames to the monitor's buffer based on the frame rate.

If the frame rate is higher than the refresh rate, then the next time the monitor updates the screen there is a higher probability that a newer frame has been pushed to the buffer, which reduces the delay from the time the frame was produced to the time it is shown on screen.

This is because the monitor and computer aren't synced in any way, so there is no way to guarantee that the computer renders a frame and pushes it to the monitor just before the monitor refreshes.

 

Of course, the computer isn't pushing out whole frames in a single instant; it progressively scans each frame out to the monitor, so if the monitor refreshes while part of a new frame is being written to the buffer, it will show part of the old frame and part of the new frame. That is screen tearing.
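As a rough sketch of where such a tear line would land (the resolution and arrival time below are made-up example numbers, assuming a standard top-to-bottom scan-out):

```python
# Rough illustration of a tear line: the monitor scans rows top to bottom over one
# refresh interval, so if a new frame lands in the buffer partway through the scan,
# the rows already scanned still show the old frame and the rest show the new one.
# The resolution and the 2.5 ms arrival time are made-up example numbers.
ROWS = 1080                      # vertical resolution (assumed)
refresh_time_ms = 1000.0 / 144   # one 144 Hz refresh takes ~6.94 ms
new_frame_at_ms = 2.5            # new frame arrives 2.5 ms into the scan-out (assumed)

tear_row = int(ROWS * new_frame_at_ms / refresh_time_ms)
print(f"rows 0-{tear_row - 1}: old frame, rows {tear_row}-{ROWS - 1}: new frame")
```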

 

If you want the most responsive "feel" when gaming, your fps should be as high as possible regardless of the monitor's refresh rate.

Even though screen tearing may occur, it means part of the image on the screen is a newer frame, which is helpful in fast-paced FPS games.
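As a back-of-the-envelope sketch of why higher fps helps on average (a toy model only: no sync at all, and scan-out and response time are ignored):

```python
# Toy model: frames finish every 1/fps seconds, the monitor refreshes every 1/hz
# seconds, and the two clocks have a random, unsynced offset. For each refresh we
# record how old the newest finished frame is; higher fps gives a lower average age.
import math, random

def average_age_ms(fps, hz, seconds=5.0, trials=100):
    frame_time = 1.0 / fps
    refresh_time = 1.0 / hz
    total, count = 0.0, 0
    for _ in range(trials):
        offset = random.uniform(0.0, frame_time)   # unsynced clocks -> random phase
        refreshes = int(seconds / refresh_time)
        for r in range(1, refreshes + 1):
            t = r * refresh_time                              # time of this refresh
            newest = math.floor((t - offset) / frame_time)    # index of newest finished frame
            if newest >= 0:
                total += t - (offset + newest * frame_time)   # age of that frame at refresh
                count += 1
    return 1000.0 * total / count

print(average_age_ms(144, 144))   # roughly 3.5 ms average age
print(average_age_ms(300, 144))   # roughly 1.7 ms average age
```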



1 minute ago, Enderman said:

The computer pushes frames to the monitor's buffer based on the frame rate.

If the frame rate is higher than the refresh rate, then the next time the monitor updates the screen there is a higher probability that a newer frame has been pushed to the buffer, which reduces the delay from the time the frame was produced to the time it is shown on screen.

This is because the monitor and computer aren't synced in any way, so there is no way to guarantee that the computer renders a frame and pushes it to the monitor just before the monitor refreshes.

 

Of course, the computer isn't pushing out whole frames in a single instant; it progressively scans each frame out to the monitor, so if the monitor refreshes while part of a new frame is being written to the buffer, it will show part of the old frame and part of the new frame. That is screen tearing.

 

If you want the most responsive "feel" when gaming, your fps should be as high as possible regardless of the monitor's refresh rate.

Amazing :)

So what about G-Sync and FreeSync? You say that the monitor and computer aren't synced in any way. Does this sync them so the computer renders a frame and pushes it to the monitor just as it refreshes?


The idea would be that, if you're running a much higher fps than your monitor is able to output, and with VSync disabled, you're trying to maximize the chance of having THE most recent frame shown to you by your monitor. Even if you experience screen tearing, you might end up with 3/4 of the screen showing a frame that is newer than the other 1/4, which would give you a very minor advantage.

 

Now, with G-Sync the monitor displays a frame as soon as possible. If your fps drops, there might be a scenario where the monitor displays an old frame a little longer because the new one hasn't been sent from the graphics card yet, but with the benefit that you won't experience screen tearing or stutter.
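A toy sketch of that variable-refresh behaviour (the frame times are made up, and it ignores the monitor's minimum/maximum refresh window):

```python
# Toy model of the variable-refresh idea described above: the monitor refreshes
# when a new frame arrives, so each refresh shows the newest frame with no tearing,
# but a slow frame simply holds the previous image on screen for longer.
# The frame times below are made-up numbers, including one fps drop in the middle.
frame_times_ms = [7.0, 7.0, 7.0, 20.0, 7.0, 7.0]

t = 0.0
for i, ft in enumerate(frame_times_ms):
    t += ft   # frame i finishes rendering at time t
    print(f"frame {i} ready at {t:5.1f} ms -> refresh now; previous image was held {ft:.1f} ms")
```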


9 minutes ago, Mag5405 said:

Amazing :)

So what about G-Sync and FreeSync? You say that the monitor and computer aren't synced in any way. Does this sync them so the computer renders a frame and pushes it to the monitor just as it refreshes?

Yeah, G-Sync and FreeSync are different.

Here is some testing on the delay:

It's basically very confusing.

 

The thing is, if you want competitive gaming or the most responsiveness, G-Sync and FreeSync are not the thing for you.

Variable refresh rate is intended to prevent stuttering and lag spikes when your framerate drops below the monitor's refresh rate.

In fast-paced games you pretty much always turn the settings down to get something like 300 fps, in which case G-Sync/FreeSync stop working and become useless, and the monitor just runs at its normal maximum refresh rate like any other monitor.


