PeachGr

8-bit + 2 FRC vs 10-bit

Recommended Posts

Posted · Original PosterOP

What's the difference between these, and what is FRC, really?


Context?


20 minutes ago, PeachGr said:

What's the difference between these, and what is FRC, really?

A true 10-bit panel actually has 10 bits of color resolution per R, G, and B channel.

 

8-bit + 2 FRC has 8 bits of color resolution per channel, but rapidly alternates each subpixel between adjacent levels (temporal dithering) to emulate having 2 more bits of resolution.
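To illustrate the idea, here's a minimal Python sketch of FRC-style temporal dithering. The function name and the fixed 4-frame cycle are my own illustration, not any vendor's actual algorithm (real panels use more elaborate spatio-temporal patterns):

```python
def frc_frames(value10, num_frames=4):
    """Emulate a 10-bit level (0-1023) on an 8-bit panel via temporal
    dithering (FRC): over `num_frames` consecutive frames, alternate
    between two adjacent 8-bit levels so the time-averaged brightness
    matches the 10-bit target.
    """
    base = value10 >> 2    # nearest lower 8-bit level (0-255)
    frac = value10 & 0b11  # leftover 2 bits: how many frames show base+1
    return [min(base + 1, 255) if f < frac else base
            for f in range(num_frames)]

frames = frc_frames(513)  # -> [129, 128, 128, 128]
```

Over that 4-frame cycle the panel shows 129 once and 128 three times, so the time-averaged level is 128.25 = 513/4, which the eye perceives as the in-between shade a true 10-bit panel would display directly.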

 

If you're using Windows you'll never notice the difference, as the standard APIs are all 8 bits per channel anyway; there's only a handful of software that actually utilizes 10 bits, and even then only when combined with a capable GPU and driver.

(Example https://petapixel.com/2019/07/29/nvidia-unveils-new-studio-driver-with-support-for-10-bit-color-for-creatives/ ).

 

For the average Joe it doesn't do jack sh*t, and even if Windows display preferences say 10 bpp, there's about 0.00001% of software actually using it.

 

What I don't know is whether games/GPUs use and display it (since their effects, filtering, and scaling calculate pixels), but honestly, especially in moving content, you'd never know the difference.

 

In the unlikely event of all planets aligning so that you actually would be using true, effective, displayed 10 bpp for graphics work, theoretically the true 10-bit screen should be less straining on the eyes compared to the FRC one.

Posted · Original PosterOP
1 hour ago, Bartholomew said:

 

What I don't know is whether games/GPUs use and display it (since their effects, filtering, and scaling calculate pixels), but honestly, especially in moving content, you'd never know the difference.

Honestly I wanna see fewer layers of color (or colour) on movies or YouTube :D Will that be possible when I change to an 8+2-bit monitor? GPU: 2060 Super

6 minutes ago, PeachGr said:

Honestly I wanna see fewer layers of color (or colour) on movies or YouTube :D Will that be possible when I change to an 8+2-bit monitor? GPU: 2060 Super

I doubt it will make a difference, as I don't think that content is 30-bit (10 bits each for R, G, and B) to begin with.

 

"However", it depends on what you're coming from. If you're now on, let's say, some older 6-bit + 2 FRC TN panel and move up to a true 8-bit IPS or VA, for example, you'll most likely notice an improvement. But that's unrelated to 8+2 vs 10 bpp.

 

Note that I'm assuming here that by "layers of color" you mean banding from lack of color resolution, which coincidentally was often visible on older 6-bit panels (but I'm talking 5-to-10-year-old panels then).

 

With 8 bpp and well-mastered content (24-bit color with proper dithering) you'll never notice banding. Notice the "well-mastered content"? That excludes 95% of YouTube, pirated movies that have been recompressed to death, Netflix on low bandwidth, etc.
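A rough Python sketch of what that dithering buys you (the function and parameters are my own toy illustration, not a mastering tool's actual pipeline):

```python
import random

def quantize(levels8, bits, dither=False):
    """Reduce 8-bit values (0-255) to `bits` of precision.

    Without dither, every input in a band of 2**(8-bits) values collapses
    to the same output level, producing visible steps (banding) in smooth
    gradients. With dither, random noise is added before truncation, so
    each value lands on one of the two nearest levels with the right
    probability: the local average stays correct and the steps blur
    into fine grain instead.
    """
    shift = 8 - bits
    out = []
    for v in levels8:
        if dither:
            v = min(255, max(0, v + random.randint(0, (1 << shift) - 1)))
        out.append((v >> shift) << shift)
    return out

plain = quantize([0, 63, 64, 255], 6)  # -> [0, 60, 64, 252]: stepped
```

With `dither=True`, quantizing a run of the value 101 to 6 bits yields a mix of 100s and 104s averaging about 101, which is exactly why properly dithered 8-bit masters show no banding.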

 

Also, resolution and connection can make some difference. I'm not 100% sure about the following (but I recall reading it somewhere once): HDMI and/or DP at a lower spec (like 1.0 or 1.1 or so) reduce color resolution to at least manage 4K at 30 or 60 Hz. Going from 8/8/8 to, let's say, an imaginary 7/5/7 can introduce some banding.


No, the majority of YouTube videos are encoded in YV12 (4:2:0), so even less than 8 bits per color (the video codec makes groups of 4 pixels, stores the brightness information for each pixel, but averages the color information of those 4 pixels and stores the average instead of the color for each pixel).
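In code terms, that 4:2:0 averaging looks roughly like this (toy Python with my own naming; real codecs operate on Y'CbCr planes with more careful filtering):

```python
def subsample_420(y, u, v):
    """Toy 4:2:0 chroma subsampling: keep the full-resolution luma
    plane (Y), but average each 2x2 block of the chroma planes (U, V)
    down to a single sample."""
    def pool(plane):
        h, w = len(plane), len(plane[0])
        return [[(plane[r][c] + plane[r][c + 1] +
                  plane[r + 1][c] + plane[r + 1][c + 1]) // 4
                 for c in range(0, w, 2)]
                for r in range(0, h, 2)]
    return y, pool(u), pool(v)
```

So a 1920x1080 frame keeps all ~2M brightness samples but only ~0.5M samples each for the two color channels, which is why fine color detail suffers more than brightness detail.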

A small set of videos are encoded using a better chroma format and/or 10-bit color.

Posted · Original PosterOP
6 minutes ago, mariushm said:

No, the majority of YouTube videos are encoded in YV12 (4:2:0), so even less than 8 bits per color (the video codec makes groups of 4 pixels, stores the brightness information for each pixel, but averages the color information of those 4 pixels and stores the average instead of the color for each pixel).

A small set of videos are encoded using a better chroma format and/or 10-bit color.

So what's my hope? New games? I hope there will be better encoding, or something. That's an interesting topic.

