NVIDIA control panel 8 bpc vs 10 bpc

Kevinjr12

So I have an LG 27GN950-B and an RTX 3080. My monitor can do 10-bit color, but by default Windows runs it at 8-bit (I have the monitor at 4K 144 Hz over DP 1.4 with DSC). I can go into the NVIDIA Control Panel -> Change Resolution, switch from "Use default color settings" to "Use NVIDIA color settings", and then set the output color depth to 10 bpc. But what does switching to 10 bpc actually do? I read that most things on Windows will just stay 8 bpc, and almost all games use 8 bpc anyway. Is this true (I have read some conflicting things lol)? Also, if I turn on HDR it switches to 10 bpc automatically (even if it was set to "Use default color settings"), so do I only benefit from 10 bpc in HDR? Should I always switch from "Use default color settings" to "Use NVIDIA color settings" and turn on 10 bpc? Any info would be greatly appreciated!


The biggest thing is the much wider color space. 8-bit is the standard 16.7 million colors with sRGB. 10-bit has over 1 billion colors within it, so it helps to be that much more precise with color accuracy.



2 hours ago, Skiiwee29 said:

The biggest thing is the much wider color space. 8-bit is the standard 16.7 million colors with sRGB. 10-bit has over 1 billion colors within it, so it helps to be that much more precise with color accuracy.

Bit depth has nothing to do with color space, as that is only limited by the monitor's gamut coverage. The min and max values (black and fully saturated colors) stay the same whether you use 8-bit or 10-bit. A higher bit depth introduces more steps in between the min and max values. So theoretically you can achieve any color space coverage you want, be it full sRGB, DCI-P3, AdobeRGB or Rec2020, with just 8-bit. But like you said, it will introduce more colors in between. With 8-bit there are 256 steps between black and fully saturated; with 10-bit there are 1024.
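To put rough numbers on the "more steps" point, here is a quick arithmetic sketch (illustrative Python, nothing more than powers of two):

```python
# Levels per channel and total color combinations at common bit depths.
# "Steps" here means levels per color channel between black and fully saturated.
for bits in (8, 10, 12):
    levels = 2 ** bits          # 256, 1024, 4096 levels per channel
    total = levels ** 3         # combinations across R, G and B
    print(f"{bits}-bit: {levels} levels/channel, {total:,} total colors")

# 8-bit : 256 levels/channel,  16,777,216 total colors    (the "16.7 million")
# 10-bit: 1024 levels/channel, 1,073,741,824 total colors (the "1 billion+")
```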

 

In practice you will only need 8-bit in most cases. The only reasons to use 10-bit are if you are a professional working with color, or if you want to use HDR, where 10-bit is part of the spec. But even in HDR you can easily get away with 8-bit if you're bandwidth limited, with next to no visual difference from 10-bit. The only thing 10-bit really does is improve gradient performance (basically how smoothly very similar colors blend into one another instead of showing visible banding steps).
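The gradient point can be visualised by quantizing the same smooth ramp at different bit depths. A minimal NumPy sketch, assuming a plain linear grayscale ramp across a 4K-wide image:

```python
import numpy as np

# Quantize a smooth 0..1 grayscale ramp to a given bit depth and count how many
# distinct shades survive. Fewer distinct shades over the same width means wider,
# more visible bands.
ramp = np.linspace(0.0, 1.0, 3840)          # one sample per pixel column of a 4K ramp

for bits in (8, 10):
    levels = 2 ** bits - 1
    quantized = np.round(ramp * levels) / levels
    shades = np.unique(quantized).size
    print(f"{bits}-bit ramp: {shades} distinct shades, ~{3840 / shades:.1f} px per band")

# 8-bit : 256 distinct shades, ~15 px per band
# 10-bit: 1024 distinct shades, ~3.8 px per band
```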

 

But in the end, if you can enable 10-bit without limiting your refresh rate and resolution (i.e. if you have the bandwidth left), there is no reason not to use it. It doesn't bring any drawbacks to the table. Anything that is 8-bit will just be scaled, like @Glenwing said, and anything that is natively 10-bit will just run in 10-bit.
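For the "bandwidth left" part, a back-of-the-envelope sketch of what 8 vs 10 bpc costs at the OP's 4K 144 Hz (approximate figures that ignore blanking intervals and DSC, purely for illustration):

```python
# Rough uncompressed video data rate vs. the DisplayPort 1.4 payload limit.
# Blanking overhead is ignored and DSC (which the 27GN950 uses at 4K 144 Hz)
# changes the picture entirely, so treat these as ballpark numbers only.
def data_rate_gbps(width, height, refresh_hz, bpc):
    return width * height * refresh_hz * bpc * 3 / 1e9   # 3 channels: R, G, B

DP14_PAYLOAD_GBPS = 25.92   # HBR3 x4 lanes: 32.4 Gbit/s raw, 25.92 Gbit/s after 8b/10b

for bpc in (8, 10):
    rate = data_rate_gbps(3840, 2160, 144, bpc)
    verdict = "fits" if rate <= DP14_PAYLOAD_GBPS else "needs DSC or lower settings"
    print(f"4K 144 Hz @ {bpc} bpc: ~{rate:.1f} Gbit/s ({verdict})")

# ~28.7 Gbit/s at 8 bpc and ~35.8 Gbit/s at 10 bpc -- both already exceed 25.92 Gbit/s,
# which is why this monitor relies on DSC at 4K 144 Hz in the first place.
```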



Ahh ok. So I should switch to 10 bpc in the NVIDIA control panel? Also do you know why the control panel defaults to 8 bpc, even though I can do 10?


On 8/6/2021 at 3:18 AM, Stahlmann said:

Bit depth has nothing to do with color space, as that is only limited by the monitor's gamut coverage. [...]

Ahh ok. So I should switch to 10 bpc in the NVIDIA control panel? Also do you know why the control panel defaults to 8 bpc, even though I can do 10? (sorry, forgot to quote you the first time)


13 hours ago, Kevinjr12 said:

Ahh ok. So I should switch to 10 bpc in the NVIDIA control panel? Also do you know why the control panel defaults to 8 bpc, even though I can do 10? (sorry, forgot to quote you the first time)

Idk what the reasoning is behind it defaulting to 8bpc. But long story short: Just switch to 10bpc if you have the option to.



  • 1 year later...

I use 10 bit with my Dell G3223Q but I'm not sure if it impacts performance.

 

I figure using 30-bit color (10 bpc per channel) gives more color gradations to allow for things like Night Light and color profiles.
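That intuition can be sanity-checked with a toy example: model a Night-Light-style warm shift as simply dimming the blue channel (a deliberate oversimplification; the real adjustment is a white-point shift) and count how many distinct blue levels survive at each output depth:

```python
import numpy as np

# Dim the blue channel to 70% of its value, then quantize to the output bit depth.
# With only 8-bit output, many formerly distinct input levels collapse together.
blue_in = np.arange(256)                                  # every possible 8-bit blue value
adjusted = blue_in * 0.70                                 # hypothetical "warm shift"

levels_8bit = np.unique(np.round(adjusted)).size          # quantized to 8-bit output
levels_10bit = np.unique(np.round(adjusted * 4)).size     # same signal, 10-bit output

print(f"distinct blue levels left at 8-bit output:  {levels_8bit}")    # 179 of 256
print(f"distinct blue levels left at 10-bit output: {levels_10bit}")   # all 256 preserved
```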


On 8/6/2021 at 2:18 PM, Stahlmann said:

[...] The only reasons to use 10-bit are if you are a professional working with color, or if you want to use HDR, where 10-bit is part of the spec. [...]

It doesn't do anything for professional photo editing either. People struggle to identify any advantage even going from 8 to 14 bits.


People don't even agree on how many colors the human eye can see.

It's surely not 16 million; some Google searches report 1 million, others up to 100 million.

I'm not sure about the 1 million figure though, as even 16-million-color digital photos often show quite a lot of colour banding.



Just switch to 10-bit if your display is capable. I think Windows defaults to 8-bit to prevent compatibility issues, but I'm not sure.

10-bit gives you smoother gradients since more colors can be produced. At least I can notice the difference, and don't worry, it won't impact your PC's performance.

Some displays even let you choose 12-bit, but only over HDMI 2.1, and most will then have G-Sync issues. Frankly speaking, I can't notice any difference between 10-bit and 12-bit. I prefer better G-Sync support, and since I can't see the difference I stay at 10-bit rather than 12-bit. Basically, I think there's no 12-bit content at the moment anyway.



16 hours ago, Andrewtst said:

[...] 10-bit gives you smoother gradients since more colors can be produced. At least I can notice the difference [...]

It should be noted that it doesn't necessarily help with banding, and can even make matters worse in some cases.

@Stahlmann mentioned his experience with the LG C2 at one point. I've tested it out myself and can personally confirm that going from 8-bit to 12-bit (or even 10-bit) resulted in significantly more visible banding than just staying at 8-bit (which resulted in 8-bit + FRC, despite the display being a native 12-bit panel).

So it kind of just depends on the display in that department as well.
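For anyone unfamiliar with the "8-bit + FRC" part: FRC (frame rate control) fakes in-between shades by rapidly alternating neighbouring 8-bit levels across frames so the time-average lands on the intended value. A toy sketch of the idea (not how any particular panel actually implements it):

```python
# Toy model of FRC / temporal dithering: a panel that can only show whole 8-bit
# levels approximates a fractional target by alternating its two neighbours.
def frc_frames(target_level: float, frames: int = 4):
    lower = int(target_level)
    pattern, error = [], 0.0
    for _ in range(frames):
        error += target_level - lower
        if error >= 1.0:                 # owe a full level, so show the brighter one
            pattern.append(lower + 1)
            error -= 1.0
        else:
            pattern.append(lower)
    return pattern

frames = frc_frames(100.25)              # a shade between 8-bit levels 100 and 101
print(frames, "-> average", sum(frames) / len(frames))   # [100, 100, 100, 101] -> 100.25
```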


4 hours ago, e22big said:

[...] going from 8-bit to 12-bit (or even 10-bit) resulted in significantly more visible banding than just staying at 8-bit (which resulted in 8-bit + FRC, despite the display being a native 12-bit panel) [...]

In general this isn't supposed to happen, and it seems only the C2 has this issue. Asus also uses the C2 panel but doesn't have this problem.



  • 1 month later...
On 8/6/2021 at 12:18 AM, Stahlmann said:

[...] So theoretically you can achieve any color space coverage you want, be it full sRGB, DCI-P3, AdobeRGB or Rec2020, with just 8-bit. [...]

Rec. 2020 requires 10-bit or 12-bit, as stated in the Wikipedia article. Sure, you can enable Rec. 2020, but if anything below 10 bpc is being used then you are not taking advantage of the full color space that Rec. 2020 offers. Rec. 2020 - Wikipedia

Rec. 2100 is for HDR.


7 hours ago, jesseinla said:

Rec. 2020 requires 10-bit or 12-bit, as stated in the Wikipedia article. [...]

ITU-R BT.2020 defines two complete image systems known as 4K UHDTV and 8K UHDTV. Color gamut is one of the parameters defined by the standard for these formats. Color depth is another one. So are resolution and frame rate. None of these are inherently related to each other.

Yes, the BT.2020 standard defines the 4K UHDTV format as having either 10 bpc or 12 bpc color depth, and color primaries as defined in Table 3 (which determine the color gamut). But saying that, because the standard specifies 10 bpc color depth, you can't use the full color gamut without it... well, the standard also specifies a 60 Hz refresh rate (or a few other allowed values), so why not say "without a 60 Hz refresh rate you can't use the full color gamut"? The color depth is just one of the parameters defined by the standard, and the color gamut is another one. That's all. They don't have any dependence on each other. If you have one without the other, it just means your system does not meet the definition of 4K UHDTV as per the BT.2020 standard.
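To illustrate that the gamut really is just the primaries, here's a small sketch that checks whether a CIE xy chromaticity falls inside the BT.2020 triangle; bit depth never appears anywhere (primaries taken from BT.2020, and the point-in-triangle test is ordinary geometry):

```python
# BT.2020's color gamut is fully defined by the chromaticities of its primaries.
BT2020 = {"R": (0.708, 0.292), "G": (0.170, 0.797), "B": (0.131, 0.046)}

def _side(p, a, b):
    # Signed-area test: which side of the line a->b the point p falls on.
    return (p[0] - b[0]) * (a[1] - b[1]) - (a[0] - b[0]) * (p[1] - b[1])

def inside_bt2020(xy):
    """True if the CIE xy chromaticity lies inside the BT.2020 gamut triangle."""
    r, g, b = BT2020["R"], BT2020["G"], BT2020["B"]
    d = (_side(xy, r, g), _side(xy, g, b), _side(xy, b, r))
    return not (any(v < 0 for v in d) and any(v > 0 for v in d))

print(inside_bt2020((0.3127, 0.3290)))   # D65 white point                        -> True
print(inside_bt2020((0.640, 0.330)))     # sRGB red primary, inside BT.2020       -> True
print(inside_bt2020((0.735, 0.265)))     # deep spectral red beyond BT.2020's red -> False
```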

