
An Explanation of HDR Displays

LAwLz

At CES 2016 the latest must-have feature for TVs and monitors seems to be High Dynamic Range (HDR). But with all the hype comes a lot of confusion. What does HDR actually mean for monitors? Is it like HDR photography?

 

While most companies haven't gone into detail about what exactly they mean by "HDR", there are some things I believe we will see, and those beliefs are based on current specifications from organizations such as the UHD Alliance with their "UHD Premium" certification and the International Telecommunication Union (ITU) with their BT.2020 recommendation. I also used AMD's announcement of their HDR support as a guide for what to expect from monitors.

So just to be clear, HDR is not a well-defined standard. We don't know if Dell's definition of HDR is the same as LG's definition of HDR. We do, however, know about some specifications which try to define HDR, and this post is written assuming monitor manufacturers aim for them.

 

So what new and exciting things can we expect from HDR displays? We can expect:

  • Higher bit depth
  • Wider color space
  • Higher contrast ratio
  • Better encoding
Color Bit Depth

Let's start with higher bit depth. Right now, 8 bits of color depth per channel is the standard. What that means is that the video signal uses 8 bits for red, 8 bits for green and 8 bits for blue, and then mixes them together to create colors. It might be hard to wrap your mind around, but increasing the bit depth does not actually create brighter reds or bluer blues. What a higher color bit depth does is increase the granularity. Here is an example of what I mean.

This is what we could do with two bits of color depth:

[Image: a 2-bit gradient from red to yellow]

We can create 4 separate colors.

However, if we increase the bit depth from 2 bits to 3 bits, we can create 8 different colors.

[Image: a 3-bit gradient from red to yellow]

 

As you might have noticed, the extreme red and extreme yellow are exactly the same with both 2 and 3 bits of color. What has changed is the number of steps between red and yellow.

As I said before, the standard right now is that we use 8 bits for each primary color (red, green and blue). With the move to HDR this will be bumped up to 10 bits (or possibly even 12 bits). That will allow us to have much smoother gradients. So when we have something, such as a sky, which transitions smoothly from one color into another (maybe a blue sky transitions into a redder sky the closer to the sun you look), the transition will look smoother the higher the color bit depth is. Here is an exaggerated example of what the difference could look like in, for example, a picture of a sky:

[Image: exaggerated comparison of a sky gradient at low vs. high bit depth]
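To put some rough numbers on this, here is a small sketch (plain Python, just counting values) of how many levels each bit depth gives per color channel. The endpoints stay the same; only the number of in-between steps grows:

```python
# How many distinct values one color channel can hold at a given bit depth.
# More values = smaller steps between neighbouring shades = smoother gradients.
for bits in (2, 3, 8, 10, 12):
    levels = 2 ** bits
    print(f"{bits:>2} bits per channel -> {levels} levels "
          f"({levels - 1} steps between the two endpoint colors)")

# 2 bits  ->    4 levels (the red-to-yellow example above)
# 8 bits  ->  256 levels (today's standard)
# 10 bits -> 1024 levels (what HDR is moving to)
# 12 bits -> 4096 levels
```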

Color Space

The next improvement is the color space. The bit depth defines how many steps we get between two colors, but the color space defines what those two extremes of the spectrum are. Let's get back to the red and yellow blocks of color again.

This might be an example of a color space:

[Image: a color space spanning only red to orange]

and this might be another one:

[Image: a color space spanning only orange to yellow]

 

 

The top color space only goes from red to orange. It does not have any yellow in it. No matter how many bits we add to the color depth, we will never get a good yellow color. That's because the color space defines the edge colors. The same is true for the bottom picture, which goes from orange to yellow: we will never be able to get a good red color in that color space.

Today the most common color space is called sRGB. The new standard we are moving towards is the one defined in the BT.2020 specification. 4K Blu-ray movies will be mastered in the BT.2020 color space.

Here is a picture showing how much larger the BT.2020 color space is compared to sRGB:

[Image: CIE chromaticity diagram comparing the Rec.709 and Rec.2020 color spaces]

 

The small yellow triangle shows the colors in sRGB. The large black triangle shows the colors in BT.2020. Since you are probably viewing this on an sRGB display, the picture won't actually be accurate, but it makes the concept a lot easier to explain.
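If you want to put rough numbers on the two triangles, here is a small sketch using the published xy chromaticity coordinates of the Rec.709 (≈ sRGB) and Rec.2020 primaries. Comparing triangle areas in this diagram is only a crude measure of gamut size, but it gives an idea of the difference:

```python
# CIE xy chromaticity coordinates of the red, green and blue primaries.
# These are the corners of the two triangles in the diagram above.
rec709  = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]  # ~ sRGB
rec2020 = [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)]

def triangle_area(points):
    """Area of a triangle given three (x, y) corners (shoelace formula)."""
    (x1, y1), (x2, y2), (x3, y3) = points
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

# Rec.2020's triangle covers roughly 1.9x the area of Rec.709's.
print(triangle_area(rec2020) / triangle_area(rec709))  # ~1.89
```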

 

Side note: As you can see, the image with the triangles says "Rec.709" and "Rec.2020", not sRGB and BT.2020. That's because Rec.709 is more than just a color space: it defines the color space, the refresh rate, the pixel count and many other things. The sRGB standard was developed around the color space defined in Rec.709, but with some changes (such as the gamma/transfer curve used). These differences, however, are not important for understanding the concept of a color space and what we can expect from HDR.

"Rec" and "BT" are interchangeable: BT.2020 and Rec.2020 refer to the same document. The ITU numbers its broadcasting recommendations BT.####, and "Rec." is simply short for "Recommendation".

Contrast Ratio

Contrast ratio is the ratio between the brightest and darkest spot a display can produce. I don't think BT.2020 defines any specific contrast ratio, but UHD Premium does. UHD Premium has a minimum contrast requirement of:

Over 1000 nits of peak brightness and less than 0.05 nits of black.

OR

Over 540 nits of peak brightness and less than 0.0005 nits of black.
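To put those two options into a single number, contrast ratio is simply peak brightness divided by black level. Here is the quick arithmetic (a small Python sketch using the figures above):

```python
# Turning the UHD Premium minimums into contrast ratios (peak / black level):
print(1000 / 0.05)    # 20000.0    -> a 20,000:1 contrast ratio
print(540 / 0.0005)   # 1080000.0  -> a 1,080,000:1 contrast ratio

# For comparison, AnandTech's iPhone 6S numbers quoted below:
print(582 / 0.42)     # ~1385.7    -> roughly 1,400:1
```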

 

 

Those are some seriously high requirements. For comparison, in AnandTech's tests the iPhone 6S got:

582 nits of peak brightness.

0.42 nits of black.

 

The Galaxy S6 would just barely pass the second of those requirements, because its peak brightness was 593 nits and its black level was 0 nits (the pixels can be turned off completely).

It will be nearly impossible to achieve these kinds of contrast ratios without using, for example, OLED (where pixels can be completely turned off, creating perfect black levels) or incredibly good LCD panels combined with local dimming.

Encoding

The last part is the encoding, or more specifically the "electro-optical transfer function" (EOTF), which defines how the digital signal is turned back into visible light on the display. Today we are using the Rec.1886 EOTF, which is based on the response of the CRT monitors used since the early days of television. Sadly I don't know enough about this to explain it in depth, but the benefit of moving from Rec.1886 to ST 2084 (the standard needed for the UHD Premium certificate) is that ST 2084 was designed for much higher brightness (Rec.1886 only took up to 100 nits of brightness into consideration, and now we are talking about displays with 1000 nits or more), and we can also expect more detail in dark areas.
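I won't pretend to explain the reasoning behind the curve, but the formula itself is published in ST 2084, so here is a minimal sketch of the PQ EOTF. It maps a normalized signal value (0 to 1) to an absolute brightness in nits, and it shows how much of the signal range is reserved for dark tones:

```python
# Minimal sketch of the SMPTE ST 2084 "PQ" EOTF: maps a normalized signal
# value (0.0 - 1.0) to an absolute luminance in nits, up to 10,000.
# Constants are taken from the ST 2084 specification.
m1 = 2610 / 16384          # 0.1593017578125
m2 = 2523 / 4096 * 128     # 78.84375
c1 = 3424 / 4096           # 0.8359375
c2 = 2413 / 4096 * 32      # 18.8515625
c3 = 2392 / 4096 * 32      # 18.6875

def pq_eotf(signal: float) -> float:
    """Convert a PQ-encoded signal value (0-1) to luminance in nits."""
    e = signal ** (1 / m2)
    return 10000 * (max(e - c1, 0) / (c2 - c3 * e)) ** (1 / m1)

print(pq_eotf(0.5))   # ~92 nits: half the signal range is still quite dark,
                      # which leaves lots of code values for shadow detail
print(pq_eotf(1.0))   # 10000.0 nits (the maximum the curve can describe)
```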

Maybe someone else can explain this part better.

 

 

Conclusion

So there you have it. HDR for monitors is not the same as the HDR photos you will see if you image search "HDR photo". It is far more than that.

It is very exciting, but don't get too excited yet. The entire chain has to support the same standards for HDR to work. You will need a new monitor, possibly a new graphics card (AMD announced that the 300 series will get support for some parts and that the next generation will support it completely), and support in software: Windows does not support it yet but Microsoft is working on it, drivers will need to support it, and the programs you use might also need to support it. YouTube has announced support, which means that once they re-encode the original file it will retain the wider color space, the 10 bits of color depth and so on. Showing a video encoded for standard dynamic range (SDR) on an HDR display won't make much of a difference, so content will be limited as well. On top of all that, HDR hardware will most likely carry a very hefty price premium in the beginning. Still, it is a very good step towards making HDR the standard in the future.

 

 

Disclaimer: Displays are not one of my strong points, so feel free to correct any misinformation that might have slipped through, or offer suggestions on how to explain things even better.


The future of displays looks promising.



I'm not sure why HDR was only just showcased; my Samsung 4K set is HDR and I purchased it 8 months ago.

 



Thank you for this wonderful post.



@LAwLz

 

[Image: AMD RTG Tech Summit slide comparing color spaces, including the range visible to the human eye]

 

A larger, but better, image in terms of explaining why this matters, mostly because it also shows the "human" range.

 

EDIT:

In short: unless Pascal also supports it, an AMD 400 series card + HDR monitor will simply look better than anything Nvidia can offer, simply because the colors will be A LOT richer.


Thanks for the explanation; there is a lot of misinformation floating around. I am looking to buy a new TV. How long do you think it will be before compliant HDR TVs become affordable (£500-1000)?



So how long before compliant HDR becomes affordable (£500-1000)?

If you think 1000 pounds is affordable then you will probably not have to wait that long.

 

 

This is the most informative thread I've ever read on LTT. No joke. It cleared so much up in one simple post, now I can understand Linus' monitor reviews...

Thank you! It makes me happy to hear that you liked it.

This post leaves out a lot of stuff that you might hear Linus talk about in his monitor reviews though. I only covered stuff related to HDR.


  • 2 months later...
On 10/01/2016 at 1:26 PM, LAwLz said:

-snip-

So the Sony TV I have now isn't 'HDR', however it does support 12-bit colour depth and its colour accuracy is pretty good.

What exactly will I be missing if I have an HDR-compatible graphics card etc.?



4 hours ago, Lazmarr said:

So the Sony TV I have now isn't 'HDR', however it does support 12-bit colour depth and its colour accuracy is pretty good.

What exactly will I be missing if I have an HDR-compatible graphics card etc.?

Hard to tell since I don't know which TV you've got. My guess is that the things missing will be the contrast, the wider color space and the different encoding scheme.

You will benefit from the move to HDR though, since you will at the very least get a higher color depth, assuming the whole software chain is also HDR compatible (OS, driver, program, etc.).

 

In the beginning we will probably just see a few games appear with HDR enabled, and it's not certain they will use all of the same things I listed. The specs I listed above are the ones for movies and TVs.

What I am hoping is that HDR becomes the new standard for everything, kind of like how FHD is the most common thing to have today. That's many years away though.


  • 7 months later...

Yep, this post was definitely the most detailed I've seen on this forum, about anything ^^.


