Does the mainstream GeForce/Radeon lineup support 10-bit color output?

Tech_Dreamer

Does the mainstream GeForce GTX / Radeon RX (later iterations) GPU lineup support 10-bit deep color output, or is it strictly limited to professional-grade GPUs?

Details separate people.


4 minutes ago, nerdslayer1 said:

nope, only pro cards support it. 

Yep, that's why I bought one.

The geek himself.


18 minutes ago, nerdslayer1 said:

nope, only pro cards support it. 

Are you sure? What do you make of this? They listed their consumer GPUs (the 300 series, not the FirePro series) as supporting 10 bits per color (bpc) over HDMI 1.4b and DisplayPort 1.2. I think this conclusively proves that those cards at least DO support 10-bit color.

[Attached slide: RTG Tech Summit, Visual Tech Session, December 8, 2015, page 17]

Why is the God of Hyperdeath SO...DARN...CUTE!?

 

Also, if anyone has their mind corrupted by an anthropomorphic black latex bat, please let me know. I would like to join you.


10 bpc has been a selectable option for output color depth for years on both AMD and NVIDIA consumer cards (even my 780 Ti can select it), so there is a lot of misinformation out there about consumer cards "supporting 10-bit color". Although you can select "10 bpc" in the control panel, this only dithers the image to 10-bit at the output; the original image is still rendered at 8 bpc. It's a bit like fake TV refresh rates, where the "120 Hz" is actually generated from a 60 Hz source by interpolating what the in-between frames might look like.
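To make the distinction concrete, here is a minimal sketch (my own illustration, not anything from the drivers): expanding an 8-bit source channel to a 10-bit output just relabels the same 256 levels, so no new intermediate shades appear unless the frame was actually rendered at 10 bpc.

```c
#include <stdio.h>

/* Sketch: expand every 8-bit channel value to 10 bits by bit
 * replication (a common scaling method) and count how many distinct
 * 10-bit levels actually get used. */
int main(void) {
    int seen[1024] = {0};
    int distinct = 0;
    for (int v8 = 0; v8 < 256; v8++) {
        /* 8-bit -> 10-bit: shift left by 2, copy the top 2 bits into the gap */
        int v10 = (v8 << 2) | (v8 >> 6);
        if (!seen[v10]) { seen[v10] = 1; distinct++; }
    }
    printf("Distinct 10-bit levels from an 8-bit source: %d of 1024\n", distinct);
    return 0;
}
```

This prints 256 of 1024; the other three quarters of the levels are what output-stage dithering approximates rather than actually delivers.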

 

Last I heard (which was a while ago, before the current generation of cards), pro cards were still required to enable true 10-bit rendering, meaning 10-bit OpenGL buffers, which is what an actual 10-bit workflow in applications like Photoshop depends on. I don't know if this is still true, though: NVIDIA has advertised Pascal as their first consumer generation to support 10-bit color, and since 10 bpc has been selectable as an output option since before Pascal, I assume they mean actual 10-bit rendering support has been added. Same with AMD saying the 400 series has 10-bit support. You may want to verify this, though; unfortunately it's a finer detail that is not widely known or tested.
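For what it's worth, the way an application gets true 10-bit rendering is by asking for a deep-color framebuffer when it creates its OpenGL context, and historically the driver would only grant that on Quadro/FirePro cards. A rough sketch of such a test using GLFW (my own illustration; I haven't run this across different drivers, so treat it as an assumption about how you'd check):

```c
#include <stdio.h>
#include <GLFW/glfw3.h> /* also pulls in the system OpenGL header by default */

int main(void) {
    if (!glfwInit()) return 1;

    /* Request 10 bits per color channel (2 bits of alpha fills out the
     * 32-bit pixel). The driver is free to ignore this and hand back
     * an 8 bpc framebuffer instead. */
    glfwWindowHint(GLFW_RED_BITS, 10);
    glfwWindowHint(GLFW_GREEN_BITS, 10);
    glfwWindowHint(GLFW_BLUE_BITS, 10);
    glfwWindowHint(GLFW_ALPHA_BITS, 2);

    GLFWwindow *win = glfwCreateWindow(640, 480, "10 bpc test", NULL, NULL);
    if (!win) { glfwTerminate(); return 1; }
    glfwMakeContextCurrent(win);

    /* Ask the (compatibility-profile) context what it actually granted. */
    GLint red_bits = 0;
    glGetIntegerv(GL_RED_BITS, &red_bits);
    printf("Red channel bits granted: %d\n", red_bits);

    glfwDestroyWindow(win);
    glfwTerminate();
    return 0;
}
```

If the card only does 10 bpc at the scan-out stage, I would expect this to report 8 on a consumer card and 10 on a pro card with 30-bit color enabled.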

 

This is my understanding, anyway; I'm going by what other people have explained to me, and I haven't tested it myself.

