
RGB 8-bit vs. 4:2:2 12-bit?

Skatedudeguy

So, I'm a bit new to color depth and how I can improve it. I have a Samsung JU7100 that I'm still tuning to find the perfect image for games and movies (but separately!). The options I have are:

 

RGB full range at 8-bit output color depth

OR

YCbCr 4:2:2 (or 4:2:0, but isn't 4:2:2 better than 4:2:0?) at up to 12-bit output color depth.

 

I also have YCbCr 4:4:4 with 8-bit color depth, but I heard that RGB Full is better than 4:4:4 overall.

 

What is the best option for games, and what is the best option for movies? Will I even notice games going to 12-bit on PC right now? I do see HDR options on my original PS4, but on PC there's more tweaking to do, of course.

 

Thanks!


YCbCr 4:4:4 and RGB are the same thing. Ultimately, YCbCr is just a transform of RGB that the display translates back to RGB. If there's no subsampling (4:4:4 = no subsampling), it converts back to the exact RGB values.

 

4:2:2 is said to be very difficult to distinguish from 4:4:4, and even 4:2:0 is considered pretty high quality, although text and other things that require fine precision tend to suffer with subsampling enabled. I really doubt that 12-bit color depth will be of any value though. I think RGB would be the best option.
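A minimal numeric sketch of that round trip. The full-range BT.601 coefficients below are the standard JPEG-style matrix, not something specified in this thread, so treat the exact numbers as an assumption; the point is that with 4:4:4 the conversion is reversible up to rounding.

```python
# RGB -> YCbCr -> RGB round trip with full-range BT.601 coefficients.
# With 4:4:4 every pixel keeps its own Cb/Cr pair, so the inverse
# transform recovers the original RGB values (up to rounding).

def rgb_to_ycbcr(r, g, b):
    y  =  0.299 * r + 0.587 * g + 0.114 * b
    cb = 128 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 128 + 0.5 * r - 0.418688 * g - 0.081312 * b
    return y, cb, cr

def ycbcr_to_rgb(y, cb, cr):
    r = y + 1.402 * (cr - 128)
    g = y - 0.344136 * (cb - 128) - 0.714136 * (cr - 128)
    b = y + 1.772 * (cb - 128)
    return r, g, b

pixel = (200, 30, 90)
back = ycbcr_to_rgb(*rgb_to_ycbcr(*pixel))
print([round(c) for c in back])  # recovers [200, 30, 90]
```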


RGB 8-bit will have a higher effective resolution than 4:2:2 or 4:2:0 (to be precise, the resolution of the color information will be higher).

8-bit RGB is visually the same as 4:4:4 at 8-bit color depth.

 

Unless you are watching content with more than 8 bits of color depth, use RGB or 4:4:4 at 8-bit.
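To put numbers on "higher resolution of the color information": in J:a:b notation, a 4-pixel-wide, 2-row block keeps `a` chroma samples in the top row and `b` extra samples in the bottom row. A small sketch of the sample budget per block (my own tallying, not from this thread):

```python
# Count luma and chroma samples in a 4x2 reference block for the
# common subsampling schemes. Every pixel always keeps its own luma;
# only the chroma (Cb + Cr) gets thinned out.

def samples_per_4x2_block(scheme):
    j, a, b = scheme
    luma = j * 2            # 8 pixels, 8 luma samples
    chroma = (a + b) * 2    # Cb and Cr each sampled (a + b) times
    return luma, chroma

for scheme in [(4, 4, 4), (4, 2, 2), (4, 2, 0)]:
    luma, chroma = samples_per_4x2_block(scheme)
    total = luma + chroma
    print(scheme, "->", total, "samples,", f"{total / 24:.0%} of 4:4:4")
```

4:2:2 carries two thirds of the data of 4:4:4, and 4:2:0 only half, which is exactly the chroma resolution being discussed.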


Most movies are encoded in some YCbCr format, so leave it on YCbCr for movies. Games are usually RGB. However, to use an HDR format you must output at least 10-bit color: the dominant HDR spec, HDR10, requires it, and the other one, Dolby Vision, requires 12-bit color.
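For a sense of what those bit depths buy you, here's a quick sketch of the distinct values per channel at each depth (simple arithmetic, assumed rather than quoted from any spec):

```python
# Distinct levels one color channel can represent at a given bit depth.
# HDR's much wider brightness range needs the extra levels to keep
# gradients smooth instead of banding.

def channel_levels(bits):
    return 2 ** bits

for bits in (8, 10, 12):
    levels = channel_levels(bits)
    print(f"{bits:>2}-bit: {levels:>4} levels/channel, {levels ** 3:,} total colors")
```

Going from 8-bit to 10-bit quadruples the levels per channel (256 to 1024), which is why HDR10 sets 10-bit as its floor.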


Like others said, RGB and YCbCr 4:4:4 are basically the same thing. Just make sure it's not Studio/Limited-range YCbCr.

YCbCr 4:2:2 subsamples the color channels a bit, so theoretically there's a decrease in color quality, but the degradation would be pretty difficult to notice.

YCbCr 4:2:0 is the format movies are encoded in (Blu-ray, DVD, etc.); if you play such content, you shouldn't notice any visual difference. However, in games and on the Windows desktop you would definitely notice the loss of color information, especially in text: it slightly breaks font antialiasing, and colored text on colored backgrounds (e.g. blue text on a red background) gets visibly smeared.
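The blue-text-on-red case can be sketched numerically. In 4:2:0 every 2x2 pixel block shares one Cb/Cr pair, so at a text edge the chroma of both colors gets averaged into a single value that matches neither. (Full-range BT.601 coefficients assumed below; the thread doesn't specify a matrix.)

```python
# Why 4:2:0 smears colored text: a 2x2 block straddling a blue/red
# edge gets ONE averaged chroma pair instead of per-pixel chroma,
# producing a purplish fringe.

def rgb_to_ycbcr(r, g, b):
    y  =  0.299 * r + 0.587 * g + 0.114 * b
    cb = 128 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 128 + 0.5 * r - 0.418688 * g - 0.081312 * b
    return y, cb, cr

blue, red = (0, 0, 255), (255, 0, 0)
block = [blue, red, blue, red]                 # 2x2 block at the text edge
chromas = [rgb_to_ycbcr(*p)[1:] for p in block]
avg_cb = sum(cb for cb, _ in chromas) / len(chromas)
avg_cr = sum(cr for _, cr in chromas) / len(chromas)
print(f"blue chroma:   Cb={chromas[0][0]:.0f} Cr={chromas[0][1]:.0f}")
print(f"red chroma:    Cb={chromas[1][0]:.0f} Cr={chromas[1][1]:.0f}")
print(f"shared chroma: Cb={avg_cb:.0f} Cr={avg_cr:.0f}")  # matches neither color
```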

 

As for color depth, 8-bit and 10-bit are the most commonly used standards; 12-bit is overkill and not well supported by anything. For example, when compressing movies to H.264 or HEVC there's a noticeable increase in quality per bitrate when going from 8-bit to 10-bit, but only a very small increase from 10-bit to 12-bit, so little content is encoded in 12-bit (encode times are also much longer, because it's harder to optimize software to work with 12 bits per pixel).
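A rough way to see the diminishing returns: quantize a smooth 0..1 brightness ramp at each depth and count the steps that survive. (This is an illustrative sketch of quantization only; it says nothing about encoder efficiency itself.)

```python
# Quantize a dense 0..1 ramp to n-bit precision and count distinct
# steps. 8 -> 10 bit quadruples the steps (visibly less banding);
# 10 -> 12 bit adds precision few displays or encoders can use.

def quantize(x, bits):
    levels = 2 ** bits - 1
    return round(x * levels) / levels

for bits in (8, 10, 12):
    steps = len({quantize(i / 9999, bits) for i in range(10000)})
    print(f"{bits:>2}-bit ramp: {steps} distinct steps")
```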

 

Higher depths would be useful, for example, when you obtain 4K HDR Blu-rays encoded in the new Rec.2020 color space at 10-bit (most common) or 12-bit (unlikely). See https://en.wikipedia.org/wiki/Rec._2020

 

So by configuring your connection to the TV as YCbCr 4:2:2 or 4:2:0 at 10-bit, you make it easier for the video player (if it's smart enough) to convert 10-bit Rec.2020 content to YCbCr 4:2:2 or 4:2:0 at 10-bit, retaining more colors than standard 8-bit full RGB would offer. I'm thinking of Media Player Classic Home Cinema with the madVR renderer, or something like that. Few video players are designed to handle 10 bits per color properly, there are few monitors with 10-bit capability, and there's not much content in these new color spaces with HDR, so not many developers can implement and test such features.

 

 

 

