
Is it possible to run 165 Hz, 1440p, 10-bit on DisplayPort 1.2a/HDMI 2.0, and is that much better than 8-bit?

Victorvis
I'm looking to buy a monitor and saw that on DisplayPort 1.2a you need to run 8-bit to get 165 Hz at 1440p, even though the monitor supports 10-bit (8-bit + FRC). Is this true, and is it bad?
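
For context, a rough back-of-the-envelope bandwidth check. The horizontal/vertical totals below are an assumption (approximate CVT reduced-blanking timings; the exact figures come from the monitor's EDID and will shift the numbers slightly), but the conclusion is robust: 8-bit fits within DP 1.2's effective bandwidth and 10-bit doesn't.

```python
# Rough bandwidth check: 2560x1440 @ 165 Hz over DP 1.2 vs HDMI 2.0.
# H_TOTAL/V_TOTAL are assumed reduced-blanking totals; exact timings
# depend on the monitor's EDID.

H_TOTAL, V_TOTAL, REFRESH = 2640, 1559, 165

pixel_clock = H_TOTAL * V_TOTAL * REFRESH        # pixels per second

for name, bits_per_pixel in (("8-bit RGB ", 24), ("10-bit RGB", 30)):
    gbps = pixel_clock * bits_per_pixel / 1e9
    print(f"{name}: {gbps:5.2f} Gbit/s required")

# Effective bandwidth after 8b/10b encoding overhead:
print("DP 1.2 HBR2: 17.28 Gbit/s available")   # 8-bit fits, 10-bit doesn't
print("HDMI 2.0   : 14.40 Gbit/s available")   # too slow for 165 Hz here
```

So on DP 1.2/1.2a you can drive 1440p at 165 Hz in 8-bit but not 10-bit, and HDMI 2.0 typically tops out around 144 Hz at this resolution.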

10-bit gives you more colors than 8-bit, but 8-bit already has lots of colors. Even 165 Hz vs, say, 120 Hz may not matter to you. It depends on what you're playing and what kind of visual acuity you have.
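
The raw per-pixel color counts are easy to check (three channels, so three times the bits per pixel):

```python
# Colors per pixel at each bit depth (three channels: R, G, B)
print(f"{2 ** (3 * 8):,}")    # 8-bit:  16,777,216
print(f"{2 ** (3 * 10):,}")   # 10-bit: 1,073,741,824
```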

Not a pro, not even very good.  I’m just old and have time currently.  Assuming I know a lot about computers can be a mistake.

 

Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.


You don't need 10-bit for most applications. SDR content is mastered for 8-bit, so 10-bit won't really do anything there. Even HDR, which is mastered for 10-bit, can be displayed at 8-bit with next to no artifacts. The main thing 10-bit does is reduce banding a bit.
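
On the 8-bit + FRC point from the original question: FRC is temporal dithering, where the panel rapidly alternates between two adjacent 8-bit levels so the time-average approximates a 10-bit level. A minimal sketch of the idea (the level 513 is just a hypothetical example):

```python
# FRC / temporal dithering sketch: an 8-bit panel alternates between two
# adjacent 8-bit levels so the time-average approximates a 10-bit level.
target = 513                           # hypothetical 10-bit level (0..1023)
low, frac = divmod(target, 4)          # four 10-bit steps per 8-bit step
frames = [low + (1 if i % 4 < frac else 0) for i in range(8)]
avg = sum(frames) / len(frames)        # perceived (time-averaged) level
print(frames, avg * 4)                 # avg * 4 recovers the 10-bit target
```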

If someone did not use reason to reach their conclusion in the first place, you cannot use reason to convince them otherwise.

