
The Relationship between Refresh Rate and Response Time: Basic Question

Shog
Solved by aithos


 

My questions are quite simple, but they quickly get bound up in the fact that I don't really understand, off the top of my head, what a monitor does to correct for this in real life. I read through the stickied FAQ, but I'd like a little more clarity on this situation. To keep things from getting confusing, please try to use the right technical terminology and correct me if I slip up; like most of us, I have a habit of occasionally messing it up. This conversation isn't about input lag or total-system numbers, but please feel free to bring them up if you feel they're important to the concept(s) below.

The incident that instigated this post was people talking about overclocking a monitor that had a relatively high response time. 

It was my original belief that running a monitor at a refresh rate higher than [a certain percentage of] the rate its response time [for average colors] can keep up with does nothing meaningful [opinion].

 

For example, let's say your three monitors' GtG response times are

Monitor A 5ms,

Monitor B 10ms,

Monitor C 30ms.

 

This means your monitors can fully complete that specific color transition at a maximum frequency of 200hz, 100hz, and ~33hz respectively.
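That arithmetic is just the reciprocal of the response time. A quick sketch (the monitor names and figures are the hypothetical ones from above, not real products):

```python
# Convert a GtG response time (ms) to the highest refresh rate (Hz)
# at which that transition can fully complete within one refresh cycle.
def max_full_transition_rate(response_ms: float) -> float:
    return 1000.0 / response_ms

for name, ms in [("Monitor A", 5), ("Monitor B", 10), ("Monitor C", 30)]:
    print(f"{name}: {ms} ms GtG -> {max_full_transition_rate(ms):.0f} Hz")
# Monitor A: 5 ms GtG -> 200 Hz
# Monitor B: 10 ms GtG -> 100 Hz
# Monitor C: 30 ms GtG -> 33 Hz
```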

If you reach the point where your monitor is refreshing significantly faster than it can physically change colors (a 30ms response time @ 120hz, i.e. a 30ms response time vs. an 8.33ms refresh interval), is that where ghosting occurs? I've seen the images of the go-kart racers flying across the screen at 120hz vs 60hz at different resolutions and response times. Would setting a refresh rate significantly faster than the screen can change colors just lead to incomplete color changes, resulting in blurry or lackluster images in fast-changing scenes?

I understand each of us has a different opinion of acceptable rates and of what we can see with regard to quality, blurriness, and ghosting; still, I would appreciate your opinions or experiences with this concept.

I haven't needed to purchase a monitor myself, but whenever I judge the quality of the monitors imaginary-me buys, I typically use this rule of thumb to decide whether a response time or refresh rate will be worthwhile. When we sit down and talk about overclocking or replacing a 60hz monitor with a 144hz one (a 16.6ms refresh interval down to 6.9ms) and the response time is 10ms, I think it'd be wiser to get a lower-response-time monitor, or to only overclock to 100hz (a 10ms interval), instead of trying to squeeze out those extra 20 or 44hz, which would push the refresh rate faster than the screen can change colors.
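The rule of thumb above can be written down directly: flag any refresh rate whose interval is shorter than the panel's GtG response time. This is just an illustration of the poster's heuristic, not an established formula:

```python
def refresh_interval_ms(rate_hz: float) -> float:
    return 1000.0 / rate_hz

def outpaces_panel(rate_hz: float, response_ms: float) -> bool:
    # True when the refresh interval is shorter than the GtG response time,
    # i.e. the panel is asked to refresh faster than pixels can finish changing.
    return refresh_interval_ms(rate_hz) < response_ms

response = 10.0  # ms GtG, as in the 60hz -> 144hz example above
for hz in (60, 100, 120, 144):
    verdict = "outpaces the panel" if outpaces_panel(hz, response) else "ok"
    print(f"{hz} Hz ({refresh_interval_ms(hz):.1f} ms interval): {verdict}")
```

With a 10 ms response time, 100 Hz is the last rate that passes; 120 Hz and 144 Hz both refresh faster than the pixels can fully transition, which is exactly the cutoff the rule of thumb picks.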


Is this a concept I entirely made up in my head? Or does this play a minor role in display technology and user experience? 

 


Ok, I'm just going to cover the basics:

 

First of all, talking about overclocking a monitor is mostly pointless, because the vast majority of monitors can't achieve a "true" overclock.  What I mean is this: people talk about refresh rate, response time, and input lag, but they forget to take into account the physical limits of their displays, i.e. how many frames per second the PCB/scaler can process.  It doesn't do you any good to overclock to 120hz if your scaler can only process 80 frames per second; those other 40 will just be dropped, i.e. you aren't actually getting an overclock.

 

Ignoring monitors like the Asus PG279Q that can "overclock" out of the box, there are only a handful of monitors available with a PCB capable of processing enough frames to reach the overclocked refresh rate.  For those who aren't familiar with them, they are the single-input, no-scaler Korean panels from companies like QNIX, X-Star and Yamakasi.  They also come with an 8ms response time (IIRC), so you're going to deal with a fair amount of motion blur, and they also have moderate input lag.

 

The other factor I'd say is relevant here is that a high refresh rate doesn't really do that much for you once you get above 120hz, and here's why (in my opinion):

 

1) Having hardware capable of running 1440p+ with a *consistent* 120fps and decent settings is expensive.

2) The jump from 60hz to 120hz is a stark contrast, going from 120hz to 165hz just isn't a big change.

3) A higher refresh rate doesn't address the "real" issues with displays: input lag and frame persistence.
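Point 2 falls straight out of the arithmetic: each step up in refresh rate shaves less off the frame time than the last. A quick illustration:

```python
def frame_time_ms(rate_hz: float) -> float:
    return 1000.0 / rate_hz

# Diminishing returns: compare how much frame time each upgrade saves.
for lo, hi in [(60, 120), (120, 165)]:
    saved = frame_time_ms(lo) - frame_time_ms(hi)
    print(f"{lo} -> {hi} Hz: {frame_time_ms(lo):.1f} ms -> "
          f"{frame_time_ms(hi):.1f} ms per frame (saves {saved:.1f} ms)")
```

Going from 60hz to 120hz cuts the frame time by about 8.3 ms; going from 120hz to 165hz only saves about 2.3 ms more, which is why the second jump feels so much smaller.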

 

I would personally rather play at 120hz with ULMB enabled (like I do on the aforementioned PG279q) than to play at 165hz or higher.  Why?  Because having the backlight flash after each frame to clear the persistence virtually eliminates motion blur, and makes for a picture clarity that we haven't gotten since the days of CRT monitors.  For gaming it is absolutely unparalleled in "feel" and adding that was for me a bigger improvement in quality than going from 60hz to 120hz.  Now don't get me wrong, the jump from 60hz to 120hz is HUGE and is immediately noticeable for anyone who games, I'm not downplaying that at all... I'm saying that ULMB is *THAT* important. 

 

As for response time, you NEVER want it to be longer than your effective refresh interval.  That results in what I mentioned above: motion blur and input lag.  I have both an X-Star overclocked to 110hz and the Asus I mentioned above, and they are night and day, despite the fact that the Asus is matte and I *HATE* matte finishes.  I was willing to sacrifice my glorious glossy panel to get a low-input-lag, low-response-time, native-120hz ULMB IPS at 1440p.  It was ridiculously expensive and worth EVERY penny.

 


1 hour ago, aithos said:

As for response time, you NEVER want it to be longer than your effective refresh interval.  That results in what I mentioned above: motion blur and input lag.

 

Thank you, I appreciated your comment! I quoted this part in case someone didn't read the whole post. This is the first time I've actually heard the term ULMB; typically I've heard "low motion blur" and other phrases.


41 minutes ago, Shog said:

Thank you, I appreciated your comment! I quoted this part just in case someone didn't read the whole post. This is the first time I actually heard the term ULMB, typically I've heard low motion blur and other phrases. 

Well, previously people called it "LightBoost", and it was intended to counteract some issues with displaying 3D content.  Some clever people hacked their drivers to enable it for non-3D content because they found it was extremely effective at simulating a CRT-style monitor (which is superior in many ways to LCD).  Recently some of the monitor companies (like Asus) have started implementing it natively in their firmware, because it is AWESOME for gaming and overall clarity, and rather than call it "LightBoost" they renamed it ULMB (Ultra Low Motion Blur).

 

Usually "low motion blur" and similar phrases actually refer to the response time and refresh rate, and that's not what I'm talking about.  So here's how a normal LCD works:

 

- A frame is drawn

- It remains on the screen until a new frame is drawn

- The new frame draws, causing motion blur as the pixels transition

 

Here's how ULMB works:

- A frame is drawn while the backlight is off

- Once the frame is fully drawn, the backlight flashes briefly to display it

- The screen then goes dark again while the next frame's pixels transition, so your eye never sees the smeared in-between states and motion blur is virtually eliminated
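One rough way to see why strobing helps: the perceived smear of a moving object scales with how long each frame stays lit (its persistence) times how fast the object moves. This is a simplified back-of-the-envelope model; the 2 ms pulse width is an assumed, illustrative figure, not any manufacturer's spec:

```python
def blur_width_px(persistence_ms: float, speed_px_per_s: float) -> float:
    # Perceived smear of a tracked moving object is roughly its speed
    # multiplied by how long each frame remains illuminated.
    return speed_px_per_s * persistence_ms / 1000.0

speed = 1920.0                # object crossing a 1920px-wide screen in one second
hold_120hz = 1000.0 / 120.0   # sample-and-hold: lit for the full 8.33 ms frame
strobe_pulse = 2.0            # assumed ~2 ms backlight flash (illustrative only)

print(f"sample-and-hold 120 Hz: ~{blur_width_px(hold_120hz, speed):.0f} px of smear")
print(f"strobed, 2 ms pulse:    ~{blur_width_px(strobe_pulse, speed):.0f} px of smear")
```

Under these assumptions the strobed backlight cuts the smear from roughly 16 px to roughly 4 px for the same refresh rate, which is the CRT-like clarity being described.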

 

As an aside: TV companies do the same thing, and it's part of how they "fake" their refresh rates.  The 240hz "clearmotion" and all that nonsense is largely this same backlight-strobing trick inserting "clear frames".  It does help with motion blur, but calling them "frames" is shady as hell and I have never liked it.  It's just mimicking a CRT-style method of refreshing.


3 hours ago, Shog said:

If you reach the point where your monitor is refreshing significantly faster than it can physically change colors (30ms response time @ 120hz), is that where ghosting occurs? ... Is this a concept I entirely made up in my head? Or does this play a minor role in display technology and user experience?

 

Ghosting comes directly from slow response time; it isn't really related to the response time being longer than the refresh period. If the response time is longer than the refresh period (say, 10 ms on a 120 Hz monitor), this doesn't lead to heavier ghosting than 10 ms on a 60 Hz monitor. The fact that color transitions will be interrupted before they are complete may complicate things, but the effects of this are not really well studied. In any case, it won't "suddenly" amplify ghosting; as with most things related to response time, it's a smooth curve. Even if you had a 10 ms transition and the display interrupts it after 8.33 ms, the transition is mostly complete, and it's unlikely you could even tell the difference versus a transition that completed in exactly 8.33 ms. The idea that the response time passing above the refresh period will cause everything to suddenly be terrible is, I think, just a presumption based on the fact that it sounds like a very theoretically bad situation; it has no factual support right now, it's just theorycrafting.
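The "smooth curve" point can be shown with a toy model. Suppose a pixel transition decays exponentially toward its target level (a simplifying assumption; real GtG curves vary, and spec-sheet response times are usually measured to a 90% threshold, which this model adopts). Interrupting the transition at various refresh intervals then degrades completion gradually, with no cliff at the point where response time exceeds the refresh period:

```python
import math

def completion(t_ms: float, response_ms: float) -> float:
    # Toy model: exponential approach to the target level, where
    # "response time" is defined as the time to reach 90% completion.
    tau = response_ms / math.log(10.0)
    return 1.0 - math.exp(-t_ms / tau)

resp = 10.0  # ms GtG (to 90%), as in the example above
for cutoff in (6.94, 8.33, 10.0, 16.7):  # 144/120/100/60 Hz refresh intervals
    print(f"interrupted at {cutoff:5.2f} ms: "
          f"{completion(cutoff, resp) * 100:.0f}% complete")
```

Under this model a 10 ms transition cut off at 8.33 ms (120 Hz) is already around 85% complete, versus 90% if left to run its full spec time; the difference shrinks smoothly rather than jumping, which matches the "it's a smooth curve" argument.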

