
Are HDR and 10 bit the same thing?

J.b091

When I'm browsing websites and a movie's info says HEVC 10 bit, does that automatically mean it's an HDR movie?


Not necessarily, I believe it just means it uses the 10-bit colour standard.

 

It's a bit confusing, as the terminology changed from total bits per pixel to bits per channel.

 

In the old days you might have had a 32-bit colour mode on your monitor. What that really meant was 32 bits per pixel: 8 bits for each of the 3 colour channels and 8 bits for transparency.

 

10 bit, I believe, simply means 10 bits per channel, so it would be 40-bit colour in the old language.
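
For a rough sense of the arithmetic, here's a small Python sketch (nothing vendor-specific, just powers of two) comparing the shades per channel and the old-style per-pixel bit counts:

```python
# Rough arithmetic behind "8 bit" vs "10 bit" colour: shades per channel,
# total RGB colours, and the old-style bits-per-pixel count with an alpha channel.
for bits_per_channel in (8, 10):
    shades = 2 ** bits_per_channel            # 256 vs 1024 levels per colour
    rgb_colours = shades ** 3                 # combinations of R, G and B
    bits_per_pixel = bits_per_channel * 4     # R + G + B + alpha, the "old language"
    print(f"{bits_per_channel}-bit/channel: {shades} shades, "
          f"{rgb_colours:,} RGB colours, {bits_per_pixel} bits per pixel with alpha")
```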


6 minutes ago, nekogod said:

Not necessarily, I believe it just means it uses the 10-bit colour standard.

 

It's a bit confusing, as the terminology changed from total bits per pixel to bits per channel.

 

In the old days you might have had a 32-bit colour mode on your monitor. What that really meant was 32 bits per pixel: 8 bits for each of the 3 colour channels and 8 bits for transparency.

 

10 bit, I believe, simply means 10 bits per channel, so it would be 40-bit colour in the old language.

So, for example, a video can be 10 bit but not necessarily HDR?


Also, is there any way to tell whether a video file is HDR by checking its file info?


3 minutes ago, J.b091 said:

So, for example, a video can be 10 bit but not necessarily HDR?

Yes, however I believe you need at least 10-bit (40-bit) colour for true HDR.

 

I did find this Reddit thread, which has some information about it.

 


11 minutes ago, nekogod said:

Yes, however I believe you need at least 10-bit (40-bit) colour for true HDR.

 

I did find this Reddit thread, which has some information about it.

 

I have a Samsung 4K HDR TV, but my PC has a normal non-HDR screen, and I want to know which HDR videos I can get for my TV.

Thanks for the link info.


You can have 10 bit with and without HDR. 

To have good HDR, you need a monitor with a powerful backlight or an OLED panel, because the extra brightness and colours that HDR (high dynamic range) brings can't be reproduced correctly with the amount of brightness regular monitors have (regular monitors can't produce enough shades of a colour to do HDR well).

 

10-bit videos can also be produced from 8-bit-per-colour source content, because it helps preserve more quality during compression: you get fewer rounding/approximation errors when calculating how objects move from frame to frame, so motion and other things can be compressed better and with more quality.

 

Even the old H.264 supported 10 bits per colour, and even 12-bit and 14-bit (used for medical imaging such as X-rays, for example), but because the hardware decoders on video cards only supported 8-bit, it didn't gain much popularity.

HEVC (and VP9) support 10-bit as a standard profile, and current hardware decoders generally support it, because it's important for HDR, Blu-rays and other things.
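
To illustrate the rounding point with a toy example (my own numbers, not anything from an actual codec), the worst-case rounding error on a 10-bit grid is a quarter of the 8-bit one:

```python
# Toy illustration: a filtered or motion-compensated pixel value usually lands
# between two representable levels and has to be rounded to the nearest one.
# A 10-bit grid has four times as many levels as an 8-bit grid, so the
# worst-case rounding error is four times smaller.  Nothing codec-specific here.

def rounding_error(value, bits):
    """Distance from `value` (a 0.0-1.0 intensity) to the nearest representable level."""
    levels = (1 << bits) - 1                  # 255 for 8-bit, 1023 for 10-bit
    return abs(value - round(value * levels) / levels)

sample = 0.12345                              # an arbitrary in-between intensity
print("8-bit rounding error: ", rounding_error(sample, 8))    # ~0.0019
print("10-bit rounding error:", rounding_error(sample, 10))   # ~0.0003
print("worst case 8-bit:     ", 0.5 / 255)
print("worst case 10-bit:    ", 0.5 / 1023)
```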


8 minutes ago, mariushm said:

You can have 10 bit with and without HDR.

To have good HDR, you need a monitor with a powerful backlight or an OLED panel, because the extra brightness and colours that HDR (high dynamic range) brings can't be reproduced correctly with the amount of brightness regular monitors have (regular monitors can't produce enough shades of a colour to do HDR well).

10-bit videos can also be produced from 8-bit-per-colour source content, because it helps preserve more quality during compression: you get fewer rounding/approximation errors when calculating how objects move from frame to frame, so motion and other things can be compressed better and with more quality.

Even the old H.264 supported 10 bits per colour, and even 12-bit and 14-bit (used for medical imaging such as X-rays, for example), but because the hardware decoders on video cards only supported 8-bit, it didn't gain much popularity.

HEVC (and VP9) support 10-bit as a standard profile, and current hardware decoders generally support it, because it's important for HDR, Blu-rays and other things.

 

 

Then I don't understand why some regular monitors and TVs have an "HDR" label. For example, when I bought my TV it said it supports HDR and is 4K, but it's not OLED, just a regular LED TV.


Videos with HDR would look a bit washed out, or have weird colours, on your PC.

Videos that are 10-bit but not HDR would probably look OK, as the video player will dither the 10-bit down to 8-bit.

With Media Player Classic Home Cinema, you can look at the properties of a file and see if it's VP9 Profile 2 ... that's most likely HDR.
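
If you'd rather check from the command line, here's a rough sketch (assuming ffprobe from the ffmpeg package is installed; "movie.mkv" is just a placeholder name) that looks at how the video stream is tagged: HDR10 material normally uses the smpte2084 (PQ) transfer function and HLG uses arib-std-b67.

```python
# Rough sketch: ask ffprobe (part of ffmpeg, must be installed) how the first
# video stream is tagged.  "movie.mkv" is a placeholder file name.
import json
import subprocess

def looks_like_hdr(path):
    out = subprocess.run(
        ["ffprobe", "-v", "quiet", "-select_streams", "v:0",
         "-show_entries", "stream=pix_fmt,color_transfer,color_primaries",
         "-of", "json", path],
        capture_output=True, text=True, check=True).stdout
    stream = json.loads(out)["streams"][0]
    print(stream)   # e.g. pix_fmt yuv420p10le, bt2020 primaries for typical HDR files
    return stream.get("color_transfer", "") in ("smpte2084", "arib-std-b67")

print("HDR?", looks_like_hdr("movie.mkv"))
```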

 

There are a few examples here: ftp://helpedia.com/pub/multimedia/testvideos/

 

Try from there:

 

Life_of_ Pi_draft_Ultra-HD_HDR.mp4

Sony_4K_HDR_Camp.mp4
The Redwoods 4k 24fps (VP9 Profile 2 HDR).mkv
The World in HDR 4k 59.94 fps (VP9 Profile 2 HDR).mkv

 

Quote

Then I don't understand why some regular monitors and TVs have an "HDR" label. For example, when I bought my TV it said it supports HDR and is 4K, but it's not OLED, just a regular LED TV.

There are multiple versions and tiers of HDR ... a basic one, more advanced ones, "pro" levels and so on ...

You can have HDR with LCD monitors and LCD TVs that use an LED backlight, but you need more powerful (brighter) LEDs, and you need more complex circuitry to create "zones" of backlight. Basically, you divide the rectangle of the screen into squares and you need to be able to control the brightness of each square separately. The LEDs can be so bright that you can't really have all the squares lit to the maximum, so the monitor needs circuitry to keep track of how many squares are at maximum brightness, and things like that.

The most basic HDR tier is a crappy kind of HDR; you can barely call it that.

 

For proper HDR you need a lot of brightness, something like 1000 cd/m² (nits), but the minimum HDR "level" only requires about 400 of that... and the cheaper ones have fewer "zones": the viewing rectangle is divided into larger squares, so you have less granularity and all that...

 

Ah... here's the VESA DisplayHDR standard (which isn't the only HDR standard out there): https://displayhdr.org/performance-criteria/

[Image: VESA DisplayHDR performance-criteria comparison chart]
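
As a rough summary of how those tiers scale (the peak-brightness numbers come straight from the tier names; the exact tier list is an assumption here, so check the linked page for the authoritative criteria):

```python
# Very rough lookup of VESA DisplayHDR tiers by their nominal peak brightness.
# The tier name encodes the required peak luminance in cd/m^2 (nits); the full
# spec also covers black level, gamut and local dimming, see
# https://displayhdr.org/performance-criteria/ for the authoritative table.
DISPLAYHDR_MIN_PEAK_NITS = {
    "DisplayHDR 400": 400,    # the "barely HDR" entry level mentioned above
    "DisplayHDR 500": 500,
    "DisplayHDR 600": 600,
    "DisplayHDR 1000": 1000,  # roughly the "proper HDR" brightness mentioned above
    "DisplayHDR 1400": 1400,
}

def tiers_met(measured_peak_nits):
    """Which tiers a display's peak brightness would clear (brightness only)."""
    return [name for name, nits in DISPLAYHDR_MIN_PEAK_NITS.items()
            if measured_peak_nits >= nits]

print(tiers_met(450))   # -> ['DisplayHDR 400']
print(tiers_met(1050))  # -> every tier up to DisplayHDR 1000
```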

