
Does halving resolution affect refresh rate?

VirtusGraphics

I am trying to educate myself about monitors, and while doing so I have stumbled upon some comments suggesting that doubling or halving the resolution affects the refresh rate.

For example, a 2560 x 1440 monitor at 30 Hz, when downscaled (or something) to 1280 x 720, gave 60 Hz, and vice versa on other occasions.

Do these things have a direct relationship, does it even work like that (even if only in some cases), and how does it work?


To my understanding, if you have a 4K 60 Hz monitor, you will always get 60 Hz, even at 720p. These numbers are two separate specifications.


Not at all. I think you're mistaking FPS for Hz, since rendering something at a lower resolution will allow you to have more frames. The Hz of your monitor won't change between resolutions unless you're being bottlenecked by your cable.



On 3/15/2017 at 5:04 AM, Froody129 said:

Not at all. I think you're mistaking FPS for Hz, since rendering something at a lower resolution will allow you to have more frames. The Hz of your monitor won't change between resolutions unless you're being bottlenecked by your cable.

I know the difference between FPS and Hz.
This is also regarding a high-end screen that has the option to change its refresh rate (EIZO CG277).

What makes me wonder is why they would lower the Hz to increase the resolution (as they said it "allowed" them to). Could it just be a very rare case?

 

I would cite them, but unfortunately I cannot seem to find those places.


On 3/15/2017 at 4:56 AM, VirtusGraphics said:

I am trying to educate myself about monitors, and while doing so I have stumbled upon some comments suggesting that doubling or halving the resolution affects the refresh rate.

For example, a 2560 x 1440 monitor at 30 Hz, when downscaled (or something) to 1280 x 720, gave 60 Hz, and vice versa on other occasions.

Do these things have a direct relationship, does it even work like that (even if only in some cases), and how does it work?

Running at a lower resolution affects frame rate in games, because the graphics card can render frames more quickly when there are fewer pixels in each frame. But this is separate from the refresh frequency of the monitor, which doesn't change in normal operation. If you have a 60 Hz monitor, it always operates at 60 Hz, whether you are getting 40 fps or 100 fps in-game.
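To put a rough number on the "fewer pixels" point, here is a quick back-of-the-envelope comparison (a sketch only; the assumption that GPU work scales linearly with pixel count is a ballpark, since real games scale differently):

```python
# Rough intuition for why lower resolutions render faster: per-frame GPU work
# scales (very roughly) with the number of pixels shaded.
resolutions = {
    "1280x720":  1280 * 720,
    "2560x1440": 2560 * 1440,
    "3840x2160": 3840 * 2160,
}
base = resolutions["2560x1440"]
for name, pixels in resolutions.items():
    print(f"{name}: {pixels / base:.2f}x the pixels of 2560x1440")
# 1280x720:  0.25x -> roughly 4x fewer pixels to render than 1440p
# 3840x2160: 2.25x -> over twice the work per frame
```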


On 3/15/2017 at 5:09 AM, Glenwing said:

Running at a lower resolution affects frame rate in games, because the graphics card can render frames more quickly when there are fewer pixels in each frame. But this is separate from the refresh frequency of the monitor, which doesn't change in normal operation. If you have a 60 Hz monitor, it always operates at 60 Hz, whether you are getting 40 fps or 100 fps in-game.

I found one of the places where I read about this.
 

Quote

Here is how I was able to use the EIZO CG277 and an Apple 27 in Thunderbolt display together: I connected all my Thunderbolt devices to the one Thunderbolt port on my Mac Mini. I then connected the EIZO CG277 to the Mac Mini with an HDMI cable. At first I was only able to get 1080p on the CG277. EIZO Tech Support walked me through how to use the menu on the CG277 to set the input to be "PC" instead of "Video", and then to turn my refresh rate from 60 Hz to 30 Hz for the CG277. I could then achieve the full resolution of the EIZO at 2560 x 1440. 30 Hz refresh rate is fine for my purposes of photography post processing. ..

This monitor should also be able to be set down to 24 Hz so it doesn't cause interpolation when editing films shot at 24 fps.
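As a quick illustration of why 24 fps content judders at 60 Hz but not at 24 Hz, here is a minimal sketch; snapping frame times to the refresh grid is a simplification of what real scalers do:

```python
def hold_pattern(refresh_hz, film_fps, n_frames=8):
    """How many refreshes each film frame occupies, with frame times snapped
    to the refresh grid (a simplification of real scaler behaviour)."""
    ticks = [int(i * refresh_hz / film_fps) for i in range(n_frames + 1)]
    return [b - a for a, b in zip(ticks, ticks[1:])]

print(hold_pattern(60, 24))  # [2, 3, 2, 3, ...] -> uneven cadence, visible judder
print(hold_pattern(24, 24))  # [1, 1, 1, 1, ...] -> every frame held equally
```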


On 3/15/2017 at 5:08 AM, VirtusGraphics said:

I know the difference between FPS and Hz.
This is also regarding a high-end screen that has the option to change its refresh rate (EIZO CG277).

What makes me wonder is why they would lower the Hz to increase the resolution (as they said it "allowed" them to). Could it just be a very rare case?

 

I would cite them, but unfortunately I cannot seem to find those places.

Video interfaces have a maximum amount of data per second they can transmit; for example, HDMI 1.4 can only transmit 8.16 Gbit/s. Resolution and refresh frequency both increase the amount of data required for transmission, so if you have a monitor that supports 3840×2160 and 60 Hz, you cannot use both of those at the same time over HDMI 1.4, because combined it is too much data. You have to choose one: you can operate at 3840×2160 at 30 Hz, or you can lower the resolution to 2560×1440, which will require less data per frame and therefore fit more frames per second (60 Hz) in the available data stream. This is somewhat unusual though; most of the time the interface in question is really only a backup, and most 4K monitors will have DisplayPort 1.2 or HDMI 2.0 for 4K 60 Hz, not only HDMI 1.4 alone.
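To get a feel for the arithmetic, here is a minimal sketch; the 24 bits per pixel (8 bpc RGB) and flat ~20% blanking overhead are simplifying assumptions standing in for the real CVT timing formulas, so treat the numbers as ballpark only:

```python
def mode_data_rate_gbps(width, height, refresh_hz, bpp=24, blanking=0.20):
    """Approximate uncompressed data rate of a video mode in Gbit/s."""
    return width * height * refresh_hz * (1 + blanking) * bpp / 1e9

HDMI_1_4_GBPS = 8.16  # the figure cited above

for w, h, hz in [(3840, 2160, 60), (3840, 2160, 30), (2560, 1440, 60)]:
    rate = mode_data_rate_gbps(w, h, hz)
    verdict = "fits" if rate <= HDMI_1_4_GBPS else "does NOT fit"
    print(f"{w}x{h} @ {hz} Hz ~ {rate:.2f} Gbit/s -> {verdict} in HDMI 1.4")
# 3840x2160 @ 60 Hz ~ 14.33 Gbit/s -> does NOT fit
# 3840x2160 @ 30 Hz ~  7.17 Gbit/s -> fits
# 2560x1440 @ 60 Hz ~  6.37 Gbit/s -> fits
```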


1 minute ago, Glenwing said:

Video interfaces have a maximum amount of data per second they can transmit; for example, HDMI 1.4 can only transmit 8.16 Gbit/s. Resolution and refresh frequency both increase the amount of data required for transmission, so if you have a monitor that supports 3840×2160 and 60 Hz, you cannot use both of those at the same time over HDMI 1.4, because combined it is too much data. You have to choose one: you can operate at 3840×2160 at 30 Hz, or you can lower the resolution to 2560×1440, which will require less data per frame and therefore fit more frames per second (60 Hz) in the available data stream.

Aha okay, I understand. This is the answer I was looking for. Thank you! :)

If you have any further reading to link me to on this matter, going more in depth, it would be much appreciated!


EDIT: there were some new messages I hadn't read yet.


3 minutes ago, VirtusGraphics said:

I found one of the places where I read about this.
 

This monitor should also be able to be set down to 24 Hz so it doesn't cause interpolation when editing films shot at 24 fps.

In that case it sounds like he's running two monitors split off a single Thunderbolt connection; if that's Thunderbolt 1, then it would not have enough bandwidth for two monitors both at 2560×1440 at 60 Hz; they would need to be run at 30 Hz.
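Applying the same rough estimation method as the sketch above to the two-monitor case (the ~10 Gbit/s per-channel figure for Thunderbolt 1 is an assumption here, as are the 24 bpp and ~20% blanking values):

```python
def stream_gbps(w, h, hz, bpp=24, blanking=0.20):
    """Approximate uncompressed data rate of one video stream in Gbit/s."""
    return w * h * hz * (1 + blanking) * bpp / 1e9

TB1_GBPS = 10.0  # assumed per-channel budget for Thunderbolt 1

two_at_60 = 2 * stream_gbps(2560, 1440, 60)  # ~12.7 Gbit/s -> over budget
two_at_30 = 2 * stream_gbps(2560, 1440, 30)  # ~6.4 Gbit/s  -> fits
print(f"2x 1440p60: {two_at_60:.1f} Gbit/s, 2x 1440p30: {two_at_30:.1f} Gbit/s")
```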


On 3/15/2017 at 5:17 AM, Glenwing said:

In that case it sounds like he's running two monitors split off a single Thunderbolt connection; if that's Thunderbolt 1, then it would not have enough bandwidth for two monitors both at 2560×1440 at 60 Hz; they would need to be run at 30 Hz.

Is there any way to know exactly how much is going through, or do you just have to find out by testing?


Furthermore, I am currently awaiting my build to arrive, and have one screen ready. I am planning to have a total of four screens, running on one card.
However, due to the resolution of two of the main screens, and the refresh rate on one, I am concerned that it will be too much for the card (EVGA 1080 Ti Founders Edition).


Main screen: Eizo ColorEdge CG277, running at 60Hz — 2560 x 1440 (27")
Second screen: Asus ROG Swift PG279Q, running 165Hz — 2560 x 1440 (27")
Third screen: Some cheap 1920x1080 screen at expected 60Hz (22", maybe)
Fourth screen: Wacom Cintiq 12WX at 1280x800. Refresh rate unknown.

 

I both play heavy games and work professionally with photos and design, which is what this build and monitors will be used for.

 

It could be possible for me to extend by throwing in a Quadro, although I am unsure if a Quadro is necessary. It has some features I require that I am unsure are unlocked on the 1080 Ti, the main feature being 10-bpc colour depth. If so, the question is: how well will that work? Not thinking of SLI.


Refresh rate basically equates to the maximum fps you can get with V-Sync enabled on that monitor. If you go over that (i.e. 61 fps on a 60 Hz monitor), you will miss frames (1 frame per second in this example). At least that's my understanding.
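A small sketch of that idealised behaviour, assuming strict double-buffered V-Sync (triple buffering and adaptive sync relax this, so it is an approximation, not how every setup behaves):

```python
import math

def displayed_fps(refresh_hz, render_fps):
    """Displayed frame rate under strict double-buffered V-Sync: a finished
    frame waits for the next refresh, so each frame is held a whole number
    of refresh intervals."""
    if render_fps >= refresh_hz:
        return refresh_hz                      # capped at the refresh rate
    held = math.ceil(refresh_hz / render_fps)  # refreshes each frame is held
    return refresh_hz / held

print(displayed_fps(60, 120))  # 60.0 (capped)
print(displayed_fps(60, 59))   # 30.0 (barely missing a refresh halves the rate)
```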



No it doesn't

 

LCDs are actually all fixed resolution.  The number of "dots" doesn't change.  If your computer is sending a 1080p signal to a 4k monitor, the monitor's internal video processing will upscale the video.  

 

Also, the monitor's TCON chip provides the timing of the video, which is synchronized with the video card. This timing is also constant. If your video card can only output 30 fps, most likely duplicate frames will be shown in place of the extra frames in the monitor's 60 Hz refresh cycle. Likewise, a monitor cannot actually display more fps than its refresh rate.
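A minimal sketch of that duplicate-frame behaviour, assuming each refresh simply repeats the most recent completed frame:

```python
def frame_shown_at_each_refresh(refresh_hz, render_fps, n_refreshes=8):
    """Index of the most recent completed frame at each monitor refresh."""
    return [int(tick * render_fps / refresh_hz) for tick in range(n_refreshes)]

print(frame_shown_at_each_refresh(60, 30))  # [0, 0, 1, 1, 2, 2, 3, 3]
# At 30 fps on a 60 Hz monitor, every rendered frame is displayed twice.
```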

 

The only time I can recall resolution affecting refresh rate was in the days of the CRT. On a CRT the resolution is not fixed, so the electron beam would have to scan more slowly to display a higher-resolution image. This caused noticeable flickering when using very high resolutions.


On 3/15/2017 at 5:25 AM, VirtusGraphics said:

Is there any way to know exactly how much is going through, or do you just have to find out by testing?


Furthermore, I am currently awaiting my build to arrive, and have one screen ready. I am planning to have a total of four screens, running on one card.
However, due to the resolution of two of the main screens, and the refresh rate on one, I am concerned that it will be too much for the card (EVGA 1080 Ti Founders Edition).


Main screen: Eizo ColorEdge CG277, running at 60Hz — 2560 x 1440 (27")
Second screen: Asus ROG Swift PG279Q, running 165Hz — 2560 x 1440 (27")
Third screen: Some cheap 1920x1080 screen at expected 60Hz (22", maybe)
Fourth screen: Wacom Cintiq 12WX at 1280x800. Refresh rate unknown.

 

I both play heavy games and work professionally with photos and design, which is what this build and monitors will be used for.

 

It could be possible for me to extend by throwing in a Quadro, although I am unsure if a Quadro is necessary. It has some features I require that I am unsure are unlocked on the 1080 Ti, the main feature being 10-bpc colour depth. If so, the question is: how well will that work? Not thinking of SLI.

You can look up the maximum bandwidth of each interface here:

You don't need to worry about adding up all your monitors, though; every port has a separate limit, and a 1080 Ti will handle all those monitors just fine.


You can change the Hz of a monitor: right-click your desktop, click Display Settings, then Advanced Display Settings, then Display Adapter Properties, then the Monitor tab. This will help you understand it better, I hope.

I.e., you can buy a 60 Hz 8K panel, but at 720p you would still only get 60 Hz.



  • 4 months later...

I have the same question, but about films. It seems that if a display is not set to its highest resolution, frames sit on the display for more refreshes, thus losing cuts and seemingly slowing the frame rate. Does anyone know if this is true?


  • 4 months later...

I have an older 1080p 60 Hz monitor that does 75 Hz at some lower resolutions (I believe 720p and lower). 75 Hz at lower resolutions should be a pretty safe bet on a 4K 60 Hz monitor, since the limitation is generally bandwidth, not the display controller or the panel itself. TBH I don't see the point of anything above 60 Hz for gaming, but for browsing, for example, it makes more sense, especially on larger screens. Text is sometimes hard to read while scrolling on a 60 Hz monitor; at 120 Hz it's much easier to read.

 

4K 120 Hz "DIY" monitors exist, and they can do 480 Hz at 480p I believe, so if your display controller allows it, you might even be able to do 240 Hz at 480p with a 4K monitor. But usually 75 Hz is the most any 60 Hz monitor will do through its OSD.

