
Watching 10-bit content on 8-bit screen


Posted · Original Poster

Is it a good idea to play back 10-bit video on a panel that is only capable of 8-bit? I mean, will it detract from the viewing experience, or will the video look more vivid? Am I better off sticking exclusively to 8-bit content?

1 minute ago, atiab.bz said:

Is it a good idea to play back 10-bit video on a panel that is only capable of 8-bit? I mean, will it detract from the viewing experience, or will the video look more vivid? Am I better off sticking exclusively to 8-bit content?

There is nothing wrong with playing 10-bit content on an 8-bit display; the content is simply scaled back down to the 8-bit range during playback.


Hand, n. A singular instrument worn at the end of the human arm and commonly thrust into somebody’s pocket.

Posted · Original Poster
1 hour ago, WereCatf said:

There is nothing wrong with playing 10-bit content on an 8-bit display; the content is simply scaled back down to the 8-bit range during playback.

 

1 hour ago, mariushm said:

Yeah, look up dithering on Wikipedia; that's what your video player will do: https://en.wikipedia.org/wiki/Dither#Digital_photography_and_image_processing

 

So, would playing 10-bit content on an 8-bit panel result in a nicer output (analogous to SSAA/MSAA, maybe?) or would it just introduce unwanted noise?

4 minutes ago, atiab.bz said:

So, would playing 10-bit content on an 8-bit panel result in a nicer output (analogous to SSAA, maybe?) or would it just introduce unwanted noise?

For all intents and purposes, neither. You would literally have to take a big-ass magnifying glass and go pixel-peeping to see any difference. That said, from a technological standpoint, it'd result in nicer output, since you have more data to work with than if you were playing 8-bit content.*

 

*) In some instances, 10-bit content playing on an 8-bit panel will actually result in rather noticeably less banding than straight-up 8-bit content, but that doesn't necessarily apply very often.
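The banding reduction in that footnote comes from dithering. As a rough illustrative sketch (not how any particular player actually implements it), compare truncating a 10-bit level that falls between two 8-bit codes with adding a little noise before truncating:

```python
import random

random.seed(42)

def truncate(v10):
    # Drop the two extra bits: 0..1023 -> 0..255
    return v10 >> 2

def dither(v10):
    # Add noise smaller than one 8-bit step before truncating
    return min(255, (v10 + random.randrange(4)) >> 2)

# A 10-bit level that sits exactly between two 8-bit codes:
v = 514                      # 514 / 4 = 128.5
band = truncate(v)           # always 128 -> a hard band edge
avg = sum(dither(v) for _ in range(10_000)) / 10_000
print(band, round(avg, 2))   # the dithered average hovers near 128.5
```

Over an area of pixels the eye blends the dithered 128s and 129s into an in-between shade, which is exactly the shade truncation throws away.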


On 9/28/2020 at 7:29 AM, atiab.bz said:

Is it a good idea to play back 10-bit video on a panel that is only capable of 8-bit? I mean, will it detract from the viewing experience, or will the video look more vivid? Am I better off sticking exclusively to 8-bit content?

Yes, it's always a good idea: it will have less banding and sometimes better contrast in subtle areas, though the difference is most likely unnoticeable in motion. It's a really subtle difference, unlike playing UHD at 1080p, where chroma upsampling is no longer required (1080p Blu-rays carry only 540p of color information).
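For reference, that "540p of color information" comes from the 4:2:0 chroma subsampling Blu-ray uses: the two color planes are stored at half the luma resolution in both dimensions. A quick sketch of the arithmetic:

```python
# Blu-ray video is 4:2:0 subsampled: luma (brightness) is stored at
# full resolution, while each chroma (color) plane is halved in both
# width and height.
luma_w, luma_h = 1920, 1080
chroma_w, chroma_h = luma_w // 2, luma_h // 2
print(chroma_w, chroma_h)    # 960 540 -> only "540p" worth of color
```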


The video card sends 8-bit-per-color data to the monitor.

 

The video player takes 10-bit-per-color video and reduces the number of colors using dithering until it's down to 8 bits per color, then sends that to the monitor. At no point in time is the monitor seeing anything other than 8 bits per color.

 

10-bit encoded videos can have better quality because they're often compressed using higher-quality settings, or the bitrate (how much data is downloaded for each second of video) is a bit higher. So you get more quality from the start, and even if the player dithers it down to 8 bits per color, the video kept more of its quality all the way to you, compared to an equivalent 8-bit version.

 

Compressing to 10 bits per color can also help. When compressing video, a lot of decisions are made about what's more or less important to your eyes and what you notice when something moves in a scene, so that the encoder can reduce picture quality where your eyes are unlikely to focus and give more bits to the areas you look at. A lot of the math involved in those decisions accumulates rounding errors when you're limited to 8 bits. With 10 bits to work with, you have more data to compress but fewer rounding errors; the extra accuracy often makes up for it, and you get more quality in the same number of kilobytes, or better quality for just a few more kilobytes per scene.
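The rounding-error point can be shown with a toy quantizer (a gross simplification of what happens inside an encoder, not a real codec):

```python
def quantize(x, bits):
    # Map x in [0, 1] to the nearest code at the given bit depth,
    # then back, returning the representable approximation.
    levels = (1 << bits) - 1
    return round(x * levels) / levels

x = 0.123456
err8 = abs(quantize(x, 8) - x)    # error with 256 levels
err10 = abs(quantize(x, 10) - x)  # error with 1024 levels
print(err8 > err10)               # True: 10-bit steps are 4x finer
```

The same intermediate value lands closer to a representable code at 10 bits, so the errors that pile up across an encoder's many intermediate calculations stay smaller.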

 

Cheap monitors are barely able to show 8 bits per color; in fact, a lot of them aren't even able to show all 16 million of those color nuances. Nothing to worry about: play 10-bit-per-color videos as much as you want, and the video player will convert them down to 8 bits per color and play them like 8-bit video.

