
HDR10 vs refresh rate discussion.


I bought a Samsung CRG90 monitor. The weird thing I noticed is that at 5120×1440 it only supports 8-bit color when running at 120Hz or higher (overclocked to 144Hz), but at 100Hz or lower it will use 10-bit color. I personally don't mind losing 20Hz for more color, since I generally play games geared more towards looks and story than speed and skill. I mostly bought this monitor because it supports 32:9 1440p, HDR at 1000 nits, and a higher refresh rate for smoother video/gameplay. Would you take the higher refresh rate with 8-bit color, or the 10-bit color at the lower refresh rate?


It does that because of bandwidth constraints. 100Hz or lower uses less bandwidth, so the link can support 10-bit color; at 120Hz and above you're basically maxing out what the cable can support.
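To put rough numbers on it, here's a quick Python sketch comparing the uncompressed data rate of 5120×1440 at 100Hz vs 120Hz against the roughly 25.92 Gbit/s usable payload of a DisplayPort 1.4 (HBR3) link after 8b/10b encoding. The ~6% blanking overhead is just an assumption about the timings the monitor uses, so treat the figures as ballpark.

```python
# Back-of-the-envelope check of why 5120x1440 drops from 10-bit to 8-bit
# above 100 Hz on a DisplayPort 1.4 link without DSC.
# The blanking overhead is an assumed approximation, not the monitor's real timings.

DP14_HBR3_PAYLOAD_GBPS = 25.92  # 32.4 Gbit/s raw minus 8b/10b encoding overhead

def required_gbps(width, height, refresh_hz, bits_per_channel, blanking_overhead=1.06):
    """Approximate uncompressed RGB video data rate in Gbit/s."""
    bits_per_pixel = bits_per_channel * 3        # RGB, no chroma subsampling
    pixel_rate = width * height * refresh_hz * blanking_overhead
    return pixel_rate * bits_per_pixel / 1e9

for hz in (100, 120):
    for bpc in (8, 10):
        need = required_gbps(5120, 1440, hz, bpc)
        verdict = "fits" if need <= DP14_HBR3_PAYLOAD_GBPS else "exceeds link"
        print(f"{hz:>3} Hz @ {bpc}-bit: ~{need:5.1f} Gbit/s -> {verdict}")
```

With those assumptions, 10-bit only fits at 100Hz, while 120Hz 10-bit lands above the link's payload, which matches the behavior you're seeing.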

