
Best output color format in the Nvidia Control Panel for a 4K HDR TV?

Go to solution: solved by mariushm (full post below).

What is the best output color format in the Nvidia Control Panel (RTX 2060) for a 4K HDR display? I have my desktop connected to my TV (LG 43UM7450PLA) over HDMI 2.0. When I select YCbCr420 I can choose 8 bit and 12 bit, RGB and YCbCr444 offer only 8 bit, and YCbCr422 offers 8, 10 and 12 bit color depth. For now I've selected YCbCr420, and I believe it's the best option for my TV because of the 12 bit color depth. Am I right?


I have a 4K IPS HDR monitor and I have the same dilemma.

The options are:

RGB

YCbCr422

YCbCr444

I really don't know which one to choose.

Some help here would also be appreciated.

 



I'd think YCbCr422 12 bit would be better for HDR, but RGB will look better for SDR, which quite frankly is what you'll be using most of the time.

What's even more confusing is: are games even outputting 12 bit in the first place? There are few games with HDR support as it is, let alone ones documented to output 12 bit. :/



RGB is typically 8 bit per color, about 16.7 million colors (2^24).

YCbCr carries the same information as RGB, but instead of three color channels you have Y = brightness (luma), Cb = chroma blue and Cr = chroma red (the color information)... a formula is used to work red, green and blue back out of these three.
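If you're curious, for HD content that formula is roughly the BT.709 matrix. A minimal Python sketch, assuming full-range values in 0..1 (real HDMI signals usually use limited range, and SD content uses the older BT.601 coefficients):

```python
# Minimal sketch of the BT.709 RGB <-> YCbCr conversion (full-range,
# values in 0.0..1.0). Real video links usually use limited range
# (e.g. 16-235 for 8 bit Y), but the idea is the same.

KR, KB = 0.2126, 0.0722          # BT.709 luma coefficients
KG = 1.0 - KR - KB

def rgb_to_ycbcr(r, g, b):
    y  = KR * r + KG * g + KB * b        # brightness (luma)
    cb = (b - y) / (2.0 * (1.0 - KB))    # blue-difference chroma
    cr = (r - y) / (2.0 * (1.0 - KR))    # red-difference chroma
    return y, cb, cr

def ycbcr_to_rgb(y, cb, cr):
    r = y + 2.0 * (1.0 - KR) * cr
    b = y + 2.0 * (1.0 - KB) * cb
    g = (y - KR * r - KB * b) / KG       # solve the luma equation for green
    return r, g, b

# Round-trip check: pure red survives the conversion.
print(ycbcr_to_rgb(*rgb_to_ycbcr(1.0, 0.0, 0.0)))  # ~(1.0, 0.0, 0.0)
```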

4:4:4 means that for every pixel the monitor receives full red, green and blue (or YCbCr) information: 3 x 8 bits, 3 x 10 bits or 3 x 12 bits per pixel.

4:2:2 and 4:2:0 mean the video card makes groups of 4 pixels, sends the brightness information for each pixel untouched, but averages the color information across the group... so instead of 4 Cb and 4 Cr values, 4:2:2 sends 2 of each and 4:2:0 sends only 1 of each.
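Here's a toy sketch of what 4:2:0 does to a chroma plane; it uses a plain average for illustration, though real encoders typically use better filters:

```python
# Toy sketch of 4:2:0 chroma subsampling on one chroma plane (Cb or Cr).
# Y is kept for every pixel; chroma is averaged over 2x2 blocks, so each
# group of 4 pixels ends up sharing a single chroma value.

def subsample_420(chroma):
    """chroma: 2D list with even dimensions -> one value per 2x2 block."""
    h, w = len(chroma), len(chroma[0])
    out = []
    for y in range(0, h, 2):
        row = []
        for x in range(0, w, 2):
            block = [chroma[y][x], chroma[y][x+1],
                     chroma[y+1][x], chroma[y+1][x+1]]
            row.append(sum(block) / 4.0)   # 4 chroma samples -> 1
        out.append(row)
    return out

# Four pixels with very different chroma get flattened to one value,
# which is exactly what smears sharp color edges and tiny colored text.
print(subsample_420([[0.5, -0.4],
                     [0.5, -0.4]]))   # [[0.05]]
```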

Human eyes are much more sensitive to brightness than to variations in color, so in most situations you won't notice the averaging.

 

Blu-ray movies, YouTube content etc. are compressed using YCbCr 4:2:0.

Some professional recording equipment records in YCbCr 4:2:2 

 

So, for bit depth: 12 bit > 10 bit > 8 bit. For chroma: RGB = YCbCr 4:4:4 > YCbCr 4:2:2 > YCbCr 4:2:0.

 

YCbCr 4:2:0 is acceptable for watching movies and sometimes OK for playing games (there are issues with tiny text, and some colors like intense reds get smeared by the conversion from 4:4:4 to 4:2:0).

 

If you can do 10 bit, I'd say try it out. I wouldn't bother with 12 bit... few games are even aware of it or support it, and your TV is probably not calibrated well enough for 12 bit to make a difference.

Your TV is probably 8 bit + FRC anyway, which only approximates a true 10 bit panel.
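Roughly how FRC works, as a sketch: the panel flickers between the two nearest 8 bit levels so the time-average lands on the requested 10 bit level. Real panels use fancier spatio-temporal dither patterns; this just shows the idea:

```python
# Sketch of FRC (temporal dithering): an 8 bit + FRC panel approximates a
# 10 bit level by alternating between the two nearest 8 bit levels so that
# the time-average matches. 10 bit level 513 sits between 8 bit 128 and 129.

def frc_frames(level_10bit, n_frames=16):
    lo = level_10bit // 4                  # nearest 8 bit level below
    frac = (level_10bit % 4) / 4.0         # how far toward the next level
    acc, frames = 0.0, []
    for _ in range(n_frames):
        acc += frac
        if acc >= 1.0:                     # error-accumulation dither
            acc -= 1.0
            frames.append(lo + 1)
        else:
            frames.append(lo)
    return frames

frames = frc_frames(513)
print(frames)                          # mostly 128, occasionally 129
print(sum(frames) / len(frames) * 4)   # time-average: 513.0 in 10 bit units
```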

 

Edit:

Reading your original message again: YCbCr 4:2:0 is the worst option, and it's not worth using just for 12 bit, because virtually all video content is 8 bit or 10 bit... you'd get no quality increase by forcing 12 bit.

I'd suggest going with YCbCr 4:2:2 10 bit, or YCbCr 4:4:4 / RGB 8 bit.

 

Check the TV menus and see if there's some "PC mode" or "Raw Input" setting that may let you configure YCbCr 4:4:4 10 bit or RGB 10 bit. That would be the best. It may be an HDMI 2.0 bandwidth limitation that's hiding that format.
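To see why, here's a back-of-the-envelope check in Python. Assumed numbers: the standard 4K 60 Hz timing is 4400 x 2250 total pixels including blanking, i.e. a 594 MHz pixel clock, and HDMI 2.0's 18 Gbit/s TMDS link carries 14.4 Gbit/s of actual pixel data after 8b/10b coding:

```python
# Back-of-the-envelope check of why HDMI 2.0 can't do 4K 60 Hz RGB/4:4:4
# above 8 bit. Assumptions: standard 4K60 timing of 4400 x 2250 total
# (blanking included) at 60 Hz = 594 MHz pixel clock; HDMI 2.0 TMDS moves
# 18 Gbit/s raw, of which 8b/10b coding leaves 14.4 Gbit/s for pixel data.

PIXEL_CLOCK_HZ = 4400 * 2250 * 60      # 594 MHz
HDMI20_DATA_BPS = 18e9 * 8 / 10        # 14.4 Gbit/s usable

# Effective samples per pixel: 4:4:4 sends 3, 4:2:2 sends 2,
# 4:2:0 sends 1.5 (chroma shared across 2x2 blocks).
SAMPLES = {"RGB / 4:4:4": 3.0, "YCbCr 4:2:2": 2.0, "YCbCr 4:2:0": 1.5}

for fmt, samples in SAMPLES.items():
    for depth in (8, 10, 12):
        bps = PIXEL_CLOCK_HZ * samples * depth
        ok = "fits" if bps <= HDMI20_DATA_BPS else "too much"
        print(f"{fmt:12s} {depth:2d} bit: {bps/1e9:5.2f} Gbit/s -> {ok}")
```

This lines up with the options the Nvidia Control Panel offers over HDMI 2.0 at 4K 60 Hz: RGB and 4:4:4 top out at 8 bit, while 4:2:2 fits even at 12 bit.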


Don't use chroma subsampling for PC use; text gets fuzzy.

 

In other words, stick with YCbCr 4:4:4.

 

If you lack the bandwidth, it's better to drop from 10 bit to 8 bit so you can stick with 4:4:4 sampling.



I don't have any problem with fuzzy text on any output color option, because I already have the "PC mode" input enabled on my TV. And I'm surprised: the picture quality in Windows 10 is better than it was before, on my IPS 4K monitor (27UK650). I tried playing a game in HDR (GreedFall), force-enabling HDR in Windows before launching it, and only with the 4:2:2 10 bit option could I play in HDR. I'll try other games to see whether HDR works with 4:2:2 12 bit, but for now I'll stick with 4:2:2 10 bit. Anyway, thank you all for your help, and mariushm for the background theory. :)

 

EDIT: I restarted my PC and tried 4:2:2 12 bit again, and it works! Of course I didn't notice any difference from 10 bit, but who cares... Anyway, I'll say it again: 4K HDR looks much better on the TV. For anyone thinking about buying a 4K monitor: it's not worth it. I did, and I regret it.

Why doesn't the PS4 Pro support 4:2:2?

