
VESA Group announces new "ClearMR" certification to clarify monitor response time specs

Qub3d
33 minutes ago, Senzelian said:

The displays are run on their maximum refresh rate while testing, if that is what you were asking.

During the test backlight strobing is turned off.

 

"Limits are also placed on overshoot and undershoot", tho I'm not sure what that means. 
To be honest this entire table doesn't make all that much sense to me...

[image: ClearMR performance criteria table]

 

https://www.clearmr.org/performance-criteria/

I already see a huge flaw in this certification. Most gaming monitors run with VRR enabled, so the refresh rate will rarely be at the monitor's maximum. The guaranteed clarity coming from the certification only applies to max refresh rate, which is not how gaming monitors are generally used.

 

It's exactly what I thought it would be: a new logo on the box without any meaning for real-world usage. Again, monitors can be heavily optimized for their respective maximum refresh rate (for example 240Hz) and score a 9000-tier rating, but as you use the monitor with VRR in games that only run at 100-200 fps, the performance can be a lot worse without impacting the certification.

 

Just more misleading marketing, helping brands to make their products seem better than they are.

If someone did not use reason to reach their conclusion in the first place, you cannot use reason to convince them otherwise.


19 minutes ago, Stahlmann said:

That basically means the guaranteed clarity coming from the certification only applies to max refresh rate, which is not how gaming monitors are generally used.

True. Now it's a question of how well this new spec scales down to lower refresh rates. I think that VESA should do more testing and possibly deliver results at multiple set refresh rates such as 60Hz, 90Hz, 120Hz and so on.
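The scaling concern can be made concrete with a rough back-of-the-envelope sketch (this is not part of the ClearMR spec): on a sample-and-hold display with strobing off, which is the condition ClearMR tests under, perceived motion blur cannot drop below the frame persistence time, and that floor grows as the refresh rate falls.

```python
# Rough sketch, not part of the ClearMR spec: on a sample-and-hold display
# with backlight strobing off, perceived motion blur is bounded below by
# frame persistence, i.e. how long each frame stays on screen.

def persistence_ms(refresh_hz):
    """Full-persistence frame time in milliseconds at a given refresh rate."""
    return 1000.0 / refresh_hz

for hz in (60, 90, 120, 240):
    print(f"{hz:>3} Hz -> ~{persistence_ms(hz):.2f} ms persistence blur")
```

So even a panel with instant pixel response blurs roughly four times more at 60Hz than at 240Hz, which is why a single max-refresh-rate score says little about lower-rate behaviour.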



12 hours ago, Stahlmann said:

Is it so wrong to assume that a certification called "Vesa DisplayHDR" evaluates HDR performance? I know it doesn't, but I also know perfectly well that I'm not the majority.

I think you, and probably other people as well, have two fundamental misunderstandings of what DisplayHDR is.

 

1) DisplayHDR evaluates certain aspects of a monitor that are related to HDR. HDR is NOT just contrast. It is easy to assume it is, but it isn't. If you go and look at the various HDR standards that exist, you will see that they encompass a lot of things that are not related to contrast. In fact, most of the specifications are unrelated to contrast.

 

2) DisplayHDR is not really an "evaluation" in the same way a benchmark is. The point is not to get an accurate measurement of what the display does in the same sense that a benchmark measures it. What DisplayHDR does is set a minimum requirement in several different tests. DisplayHDR is not a sticker that says "buy this monitor because it's good". It's a sticker that says "in the aspects tested, this monitor reaches at least these levels".

Buying a monitor based on DisplayHDR certifications is like buying an apartment based on how many rooms it has, without looking at any pictures of the rooms. The number of rooms is useful for filtering out apartments you know you aren't interested in, but you still have to look at pictures of what the rooms are like before determining if the apartment is good or not. 

"This apartment has 4 rooms" is not a useless statment just because you are interested in what condition the rooms are in. 

 

 

And I'd like to add a third misunderstanding.

3) Measuring displays is not an easy thing. There are literally hundreds of ways you can measure them. It is pretty much impossible to measure a display and give a final verdict for every single scenario. We have this issue with CPU benchmarks as well.

If you test a CPU using Cinebench then that may not be an accurate representation of how it performs in Quake 3.

Does that mean "Cinebench is useless", "Cinebench is a scam" or "I guess I was wrong for assuming Cinebench evaluated CPU performance"? No it doesn't. It just means that you used the incorrect measurement to evaluate it. The measurement you were after, Quake 3 performance, was not accounted for in the test you looked up, which was Cinebench.

 

Cinebench, and DisplayHDR, will accurately reflect certain aspects of the product. Neither of them will accurately reflect ALL aspects of a product, but neither test sets out to do so. Why not? Because measuring a display is extremely difficult and it is next to impossible to include every aspect.

 

I mean, just look at the naming of the certifications: DisplayHDR 500, DisplayHDR 1000, DisplayHDR 1400. What do the numbers represent? Peak brightness in a 10% center patch test. They don't represent contrast. DisplayHDR has some contrast tests included, but that is just one aspect out of many.
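To illustrate the naming scheme: the tier number doubles as the minimum peak luminance (in cd/m²) the panel must hit in that 10% center patch test. A toy lookup, hedged in that the real certification checks many more criteria (gamut, bit depth, black level, rise time) and this only reads the brightness floor:

```python
# Illustrative only: the DisplayHDR tier number is also the minimum peak
# luminance (cd/m²) required in the 10% center patch test. The real
# certification checks many more criteria, so this is NOT a pass/fail
# test, just a reading of the naming scheme.

TIER_MIN_NITS = [400, 500, 600, 1000, 1400]

def brightness_tier(peak_nits):
    """Highest tier number whose brightness floor is met, else None."""
    met = [tier for tier in TIER_MIN_NITS if peak_nits >= tier]
    return max(met) if met else None

print(brightness_tier(650))   # -> 600
print(brightness_tier(350))   # -> None
```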


Contrast is only a small part of what is tested, and it is clearly not the primary focus. So to evaluate how "good" DisplayHDR's test is by only looking at contrast is like judging a fish by its ability to climb a tree. You are criticizing it for not doing something "properly" (which is a completely different debate) when that thing clearly isn't a high priority. 


2 hours ago, Stahlmann said:

I already see a huge flaw in this certification. Most gaming monitors run with VRR enabled, so the refresh rate will rarely be at the monitor's maximum. The guaranteed clarity coming from the certification only applies to max refresh rate, which is not how gaming monitors are generally used.

 

It's exactly what i thought it would be: A new logo on the box without any meaning for real-world useage. Again, monitors can be heavily optimized for their respecitve maximum refresh rate (for example 240Hz) and score a 9000-tier rating, but as you use the monitor with VRR in games that only run at 100-200 fps the performance can be a lot worse without impacting the certification.

 

Just more misleading marketing, helping brands to make their products seem better than they are.

What are you on about?

I get the impression that you dislike VESA and are trying to find issues.

 

If you don't think they should test the displays at the maximum refresh rate, then what should they test them at?

Also, how do you expect VESA to create a test that monitor manufacturers can't optimize for?

 

Of course the monitor will perform worse when running at 100Hz vs 240Hz. Lower refresh rate inevitably leads to higher response times. What's wrong with testing the monitor under the best conditions? Should reviewers not review processors with good coolers because they might boost more than for someone with a worse cooler?

The "flaws" you pointed out with the tests apply to pretty much every single display measurement out there, including those from sites like Blurbusters. 

 

It's the same "the measurement is not perfect in every single testing condition I can think of so therefore it's just marketing BS that shouldn't exist" argument you use for disliking DisplayHDR. 


12 hours ago, Stahlmann said:

Is it so wrong to assume that a certification called "Vesa DisplayHDR" evaluates HDR performance? I know it doesn't, but I also know perfectly well that I'm not the majority. The vast majority of customers will not do the research to find out what the certification is about. That's why certifications like DisplayHDR, G-Sync Ultimate and FreeSync Premium Pro annoy me so much. I know they're BS and I know I need to watch and read reviews to really find out what a monitor is about, but the fact is most people don't and rely on the stickers on the box. (Or worse: Amazon reviews.)

Whether the certification can tell you "you're going to have a great HDR experience" is a hard problem, because it depends heavily on the subjectivity of the person reading the certification. A certification needs certain specs to be met, and those specs cannot be subjective: "this monitor gives a great HDR experience" isn't an objective statement. Just like with headphones, we can easily measure the frequency response and say how pronounced the V-curve is, but that cannot tell you whether you'll have a "great audio experience" with them, because that depends on whether you love the V-curve of a random cheap gaming headset or would rather have the flat curve of studio monitors.

Likewise, we can measure the colour response of a TV, calibrate it to show natural colour, and measure it again to show that it really displays the exact colours it should. What that doesn't tell you is whether someone likes watching that image. Some may love watching art films with an exact sRGB image, but if you're going to watch reruns of The Bold and the Beautiful from the 90s, you probably want to bump the saturation, sharpness and contrast a bit beyond the exact calibration points.

 

The majority expects DisplayHDR to be a measure of a "great HDR experience", which no certification can deliver. Certifications can only ensure the specs are there; it's up to consumers to find out which specs they like.

 

Quote

The fact that everyone looking for advice on buying an HDR monitor on forums starts with some sort of Vesa certification requirement shows that the communication around what this certification does is non-existent, and most people take it for what's in the name.

And quite often you see techtubers whining about how the certification doesn't meet consumers' unrealistic expectations, rather than making a video explaining the certification to consumers so they could understand what the sticker actually means and wouldn't need to complain about why they didn't get an optical orgasm from a DisplayHDR 400 certified monitor.

 

Quote

I just think this certification will change nothing about how current monitor marketing works. The brands will probably end up tuning the monitor so that a best-case scenario boosts its certification tier up a few, while real-world performance differs vastly. Just like every "1ms" monitor on the market can only hit that kind of response time in an absolute best-case scenario, and real-world response times are actually 4-6 times slower. (Other than OLED, which is a true 1ms, but then again Dell has already started marketing its 1ms OLED monitors as 0.1ms.)

That will happen in any case unless there is an organization that handles the testing and certification of each and every product. Just look at something like the CE marking, which basically covers nothing because manufacturers certify it themselves, and it's down to luck whether their product gets tested and found not to meet the requirements. Or cars' fuel consumption as measured by the manufacturer: things like overinflated tires, a slight downhill and stripped-down vehicles are basically expected.

 

Also, when consumers start to complain about something, there is the topic of content. You would be really surprised how much the percentage of people enjoying certain headphones changes just from a change in the music genre used as the demo, because people really are that simple: if they don't like the song demoed, they don't like the headphones.

So should we also add to the certification that Spongebob Squarepants or Terminator 2 looks good on an "EnjoymentHDR" certified monitor, or are we adventurous and take something like Pink Floyd: The Wall as the benchmark for the certification? Just so that consumers know they are going to like the image of the certified monitor.


2 hours ago, LAwLz said:

I think you, and probably other people as well, have two fundamental misunderstandings of what DisplayHDR is.

 

1) DisplayHDR evaluates certain aspects of a monitor that are related to HDR. HDR is NOT just contrast. It is easy to assume it is, but it isn't. If you go and look at the various HDR standards that exist, you will see that they encompass a lot of things that are not related to contrast. In fact, most of the specifications are unrelated to contrast.

 

2) DisplayHDR is not really an "evaluation" in the same way a benchmark is. The point is not to get an accurate measurement of what the display does in the same sense that a benchmark measures it. What DisplayHDR does is set a minimum requirement in several different tests. DisplayHDR is not a sticker that says "buy this monitor because it's good". It's a sticker that says "in the aspects tested, this monitor reaches at least these levels".

Buying a monitor based on DisplayHDR certifications is like buying an apartment based on how many rooms it has, without looking at any pictures of the rooms. The number of rooms is useful for filtering out apartments you know you aren't interested in, but you still have to look at pictures of what the rooms are like before determining if the apartment is good or not. 

"This apartment has 4 rooms" is not a useless statment just because you are interested in what condition the rooms are in.

Where did you get the idea that the Vesa DisplayHDR certification isn't about HDR performance? That's exactly what they describe it as. They don't use the industry-standard measurement methods, which are capable of representing real-world performance, to evaluate certified displays. For their certification process they went out of their way to invent measurement methods that specifically allow brands to "cheat" and reach a high tier, even though the monitor has barely any of the hardware capabilities needed to display HDR content. If you watch the Hardware Unboxed video I posted earlier, you'll understand what I mean.

[image: screenshot from the Hardware Unboxed video]

 

And again, how did this particular monitor in the Hardware Unboxed video get through a certification process that calls for "outstanding local dimming"?

[image: VESA tier description citing "outstanding local dimming"]

 

2 hours ago, LAwLz said:

And I'd like to add a third misunderstanding.

3) Measuring displays is not an easy thing. There are literally hundreds of ways you can measure them. It is pretty much impossible to measure a display and give a final verdict for every single scenario. We have this issue with CPU benchmarks as well.

If you test a CPU using Cinebench then that may not be an accurate representation of how it performs in Quake 3.

Does that mean "Cinebench is useless", "Cinebench is a scam" or "I guess I was wrong for assuming Cinebench evaluated CPU performance"? No it doesn't. It just means that you used the incorrect measurement to evaluate it. The measurement you were after, Quake 3 performance, was not accounted for in the test you looked up, which was Cinebench.

 

Cinebench, and DisplayHDR, will accurately reflect certain aspects of the product. Neither of them will accurately reflect ALL aspects of a product, but neither test sets out to do so. Why not? Because measuring a display is extremely difficult and it is next to impossible to include every aspect.

In terms of pixel response times, it's completely different from CPUs. While you cannot say how well a CPU performs in games just by looking at a Cinebench score, you can say pretty accurately what level of performance you get in games by doing standardized response time testing. A monitor doesn't care whether the color information comes from a game or a web browser. So yes, measurement data like these tables will in fact be representative of in-game performance (provided you know how to read them and actually understand the data):

[image: pixel response time measurement table]

 

2 hours ago, LAwLz said:

I mean, just look at the naming of the certifications: DisplayHDR 500, DisplayHDR 1000, DisplayHDR 1400. What do the numbers represent? Peak brightness in a 10% center patch test. They don't represent contrast. DisplayHDR has some contrast tests included, but that is just one aspect out of many.

 

Contrast is only a small part of what is tested, and it is clearly not the primary focus. So to evaluate how "good" DisplayHDR's test is by only looking at contrast is like judging a fish by its ability to climb a tree. You are criticizing it for not doing something "properly" (which is a completely different debate) when that thing clearly isn't a high priority. 

That is the main problem. They test for features that aren't necessary for a good HDR experience, yet they leave out the most important one. Again, I know contrast isn't everything, but it's THE main thing that sets apart an HDR display from a bog standard SDR display. There are ways to objectively measure HDR performance, which you can see in a monitor review from Hardware Unboxed or display reviews from RTINGS for example.

 

2 hours ago, LAwLz said:

What are you on about?

I get the impression that you dislike VESA and are trying to find issues.

I don't dislike them. I simply don't trust them because of their actions in the past. I'm critical of half-assed nonsense, just like everyone should be.

More stickers on a box doesn't mean more better.

This has nothing to do with personal hatred.

 

2 hours ago, LAwLz said:

If you don't think they should test the displays at the maximum refresh rate, then what should they test them at?

For a realistic evaluation of a variable refresh rate gaming monitor, you need multiple measuring points to test how it performs at different refresh rates. For example, the certification on a 240Hz display should at least encompass results from 60Hz, 120Hz, 180Hz and 240Hz imo.

 

2 hours ago, LAwLz said:

Also, how do you expect VESA to create a test that monitor manufacturers can't optimize for?

 

Of course the monitor will perform worse when running at 100Hz vs 240Hz. Lower refresh rate inevitably leads to higher response times.

Lower refresh rate =/= slower response times. On most monitors this applies, but that's where really good monitors set themselves apart: their performance stays consistent, and they don't need the user to change overdrive settings depending on what kind of fps they're expecting from a game. A consistent, so-called "single overdrive mode" experience is imo among the most important features a monitor can have, and brands generally don't talk about whether it's included.

 

You say "of course it's worse at 100Hz" but the average buyer doesn't know that. In fact even most tech YouTubers don't know. So don't expect anything more from the average consumer.

 

2 hours ago, LAwLz said:

What's wrong with testing the monitor under the best conditions? Should reviewers not review processors with good coolers because they might boost more than for someone with a worse cooler?

It's standard procedure that most users have a cooler that is capable of running the CPU without thermal throttling. It's not standard procedure that gamers run their monitors locked at their maximum refresh rate. You're comparing apples to oranges.

 

2 hours ago, LAwLz said:

The "flaws" you pointed out with the tests apply to pretty much every single display measurement out there, including those from sites like Blurbusters. 

 

It's the same "the measurement is not perfect in every single testing condition I can think of so therefore it's just marketing BS that shouldn't exist" argument you use for disliking DisplayHDR. 

I expect monitor manufacturers to aim to make a good overall product, just like with everything I buy. I don't want them to optimize for one single test to get a certification that makes the product look good, because that's not how I end up with a good product as a customer. I don't like buying a one-trick pony, and neither should you, be it monitors, a washing machine or literally anything else.

 

I know cherry-picking information for marketing purposes is a thing in every industry. That doesn't mean I have to like or support it.

 

And again: I don't need or want this certification, as I'm perfectly capable of doing my own research on displays. And I'll dare say I know quite a lot about displays in general (that's been pretty much my main point of interest in the tech space for the last few years). My gripe is that this certification doesn't help the average consumer pick a display with overall good motion clarity, just as the DisplayHDR certification doesn't help consumers find a good HDR display.



31 minutes ago, Thaldor said:

Whether the certification can tell you "you're going to have a great HDR experience" is a hard problem, because it depends heavily on the subjectivity of the person reading the certification. A certification needs certain specs to be met, and those specs cannot be subjective: "this monitor gives a great HDR experience" isn't an objective statement. Just like with headphones, we can easily measure the frequency response and say how pronounced the V-curve is, but that cannot tell you whether you'll have a "great audio experience" with them, because that depends on whether you love the V-curve of a random cheap gaming headset or would rather have the flat curve of studio monitors.

Likewise, we can measure the colour response of a TV, calibrate it to show natural colour, and measure it again to show that it really displays the exact colours it should. What that doesn't tell you is whether someone likes watching that image. Some may love watching art films with an exact sRGB image, but if you're going to watch reruns of The Bold and the Beautiful from the 90s, you probably want to bump the saturation, sharpness and contrast a bit beyond the exact calibration points.

 

The majority expects DisplayHDR to be a measure of a "great HDR experience", which no certification can deliver. Certifications can only ensure the specs are there; it's up to consumers to find out which specs they like.

Like I said above, it's absolutely possible to nail down a good HDR experience using objective measurements. Color accuracy, PQ tracking, contrast (and thus local dimming effectiveness), black levels, color gamut and bit depth can all be measured. The only thing that has to be evaluated subjectively is blooming. So other than blooming, it'd be possible for VESA to revise their tiers to encompass actually useful information using industry-standard measurement methods and test patterns.
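For reference, the PQ tracking mentioned here is measured against the SMPTE ST 2084 EOTF, which defines an exact target luminance for every signal level; a reviewer compares the display's measured output to this curve. A minimal Python rendering of the reference curve, using the constants from the standard:

```python
# Reference EOTF from SMPTE ST 2084 ("PQ"), the curve that PQ-tracking
# measurements compare a display against. Constants are from the standard.

M1 = 2610 / 16384        # ≈ 0.1593
M2 = 2523 / 4096 * 128   # = 78.84375
C1 = 3424 / 4096         # = 0.8359375
C2 = 2413 / 4096 * 32    # = 18.8515625
C3 = 2392 / 4096 * 32    # = 18.6875

def pq_eotf(signal):
    """Map a normalized PQ code value (0..1) to target luminance in cd/m²."""
    p = signal ** (1 / M2)
    return 10000.0 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)

print(round(pq_eotf(1.0)))  # -> 10000, PQ's absolute peak
print(round(pq_eotf(0.5)))  # the mid code value maps to only ~92 cd/m²
```

Note how steep the curve is: half the signal range sits below roughly 100 cd/m², which is why tone mapping near the top end matters so much on displays that can't reach PQ's nominal peak.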

 

31 minutes ago, Thaldor said:

And quite often you see techtubers whining about how the certification doesn't meet consumers' unrealistic expectations, rather than making a video explaining the certification to consumers so they could understand what the sticker actually means and wouldn't need to complain about why they didn't get an optical orgasm from a DisplayHDR 400 certified monitor.

Agreed. Generally the communication about these standards is rather bad, which is why consumers are misled. Put the blame for the fail that is DisplayHDR on Vesa or on the consumers, the result is the same: it fails at what it's trying to do.

 

31 minutes ago, Thaldor said:

That will happen in any case unless there is an organization that handles the testing and certification of each and every product. Just look at something like the CE marking, which basically covers nothing because manufacturers certify it themselves, and it's down to luck whether their product gets tested and found not to meet the requirements. Or cars' fuel consumption as measured by the manufacturer: things like overinflated tires, a slight downhill and stripped-down vehicles are basically expected.

The certification is done by "Vesa authorized test centers", so they basically have full control over which monitor gets certified and which one fails.

 

31 minutes ago, Thaldor said:

Also, when consumers start to complain about something, there is the topic of content. You would be really surprised how much the percentage of people enjoying certain headphones changes just from a change in the music genre used as the demo, because people really are that simple: if they don't like the song demoed, they don't like the headphones.

So should we also add to the certification that Spongebob Squarepants or Terminator 2 looks good on an "EnjoymentHDR" certified monitor, or are we adventurous and take something like Pink Floyd: The Wall as the benchmark for the certification? Just so that consumers know they are going to like the image of the certified monitor.

That's why you can use neutral test patterns to evaluate a product. A subjective opinion based on a specific piece of content can be useful, but it shouldn't be the basis of a technical review.

 

Back on topic:

The main problem here isn't that they decided to start another certification for motion performance, but that they decided it will be measured at one specific point of a monitor's performance spectrum, and as a result it will not be applicable 90% of the time.



There are good standard certification badges out there.

80+, as brought up:

At the time it came out, most PSUs were between 50-70% efficient, and it brought up the efficiency of the entire industry.
True, it's not a measure of quality, but for 90% of use cases it was a measure of good enough, as it was hard to design an 80+ Bronze unit 10 years ago without using parts that were at least half decent, unless you were running high-end hardware. It also doesn't measure only peak efficiency; it covers 20% and 100% loads as well. It didn't touch standby, sure, but that wasn't the main issue back then.
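The mechanism described above can be sketched as a tiny lookup: a PSU must clear an efficiency floor at 20%, 50% and 100% load simultaneously. The thresholds below are the commonly published 115 V internal (non-redundant) figures; treat the exact numbers as illustrative rather than authoritative.

```python
# Sketch of the 80 Plus scheme: a PSU must meet efficiency floors at
# 20%, 50% and 100% load simultaneously. Thresholds are the commonly
# published 115 V internal figures (illustrative, not authoritative).
# TIERS is ordered from lowest to highest, which best_tier() relies on.

TIERS = [  # (name, min % efficiency at 20 / 50 / 100 % load)
    ("80 Plus",          (80, 80, 80)),
    ("80 Plus Bronze",   (82, 85, 82)),
    ("80 Plus Silver",   (85, 88, 85)),
    ("80 Plus Gold",     (87, 90, 87)),
    ("80 Plus Platinum", (90, 92, 89)),
]

def best_tier(eff_20, eff_50, eff_100):
    """Return the highest tier whose floors are all met, or None."""
    passed = None
    for name, (f20, f50, f100) in TIERS:
        if eff_20 >= f20 and eff_50 >= f50 and eff_100 >= f100:
            passed = name
    return passed

print(best_tier(86, 89, 86))  # -> 80 Plus Silver
```

The point of the design: because all three load points must pass, a manufacturer can't earn the badge by tuning for one sweet spot, which is exactly the property the post argues HDR400 lacks.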


HDR400 fails at being even a useful baseline. One can argue that many monitors only peaked at 200-250 nits before it came out, but for the GOALS of HDR, HDR400 failed spectacularly to bring up the industry, because a manufacturer can cheese that specific specification: all you needed to do was accept the signal and overvolt/change the backlight duty cycle in a 10% window to hit 350 nits. It has been a worthless certification from DAY ZERO.

This buyer-beware mentality is insane; the WHOLE point of certifications is to lessen the need for a consumer to go super deep into technical aspects. HDR400 was and is used to mislead consumers about the capability of a screen. To argue that a consumer should know better is just the most asinine argument when even tech-savvy consumers are duped by it. Certifications that fail to communicate a useful baseline of information simply shouldn't exist. HDR400 screens are not actually capable of displaying HDR content, but companies are able to slap a label on there claiming they are, and have VESA as a shield to pretend they are not being misleading.

I do not mind ClearMR as a certification at this time, as it looks more like the 80+ cert than HDR400 to me. It's not saying x is good; it's saying x is, in reality, this, not the lies monitor manufacturers were spreading before.

 

Edit: also, in terms of contrast ratios, I'm lost on some of y'all. HDR by definition is increasing dynamic range, a.k.a. contrast. You are adding more stops, so you get more shadow detail without losing the highlights, or you add more highlight detail without losing the shadows. That is literally the definition of HDR. Maximum brightness is a MEANS to this end, NOT the end. That's why OLEDs that can't hit 1000 nits are often better at HDR than a traditional LCD that does, because of the black levels. That is also why True Black HDR certifications exist now, and other certifications have existed for the past 6 years. There is no HDR without high contrast, regardless of brightness. The flood lights on a pickup truck are not HDR compliant.
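That definition can be put in numbers: dynamic range in photographic stops is log2(peak luminance / black level). With hypothetical but typical figures, a dimmer OLED comfortably out-ranges a brighter LCD:

```python
import math

# Dynamic range in photographic "stops" is log2(peak / black).
# The figures below are hypothetical but typical: the OLED's far lower
# black level gives it more range than the brighter LCD, which is the
# point about contrast vs brightness.

def stops(peak_nits, black_nits):
    return math.log2(peak_nits / black_nits)

print(f"OLED  800 nits / 0.0005 nit black: {stops(800, 0.0005):.1f} stops")
print(f"LCD  1000 nits / 0.1    nit black: {stops(1000, 0.1):.1f} stops")
```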

Also, in terms of HDR400, realize that saying it accepts an HDR signal is misleading. It does... but then it throws out a lot of the data, as HDR400 only requires an 8-bit color space while HDR signals are 10-bit. It just has a chip with a LUT to color-map it back to 8 bits...


1 hour ago, Stahlmann said:

Where did you get the idea that the Vesa DisplayHDR certification isn't about HDR performance?

Where did you get the idea that HDR is only about contrast?

It seems like your entire post is built upon the assumption that "HDR = contrast", which is not the case when it comes to monitors. Again, none of the HDR standards out there is only about contrast. All of them include other things, such as bit depth, color gamut and various other metrics, which DisplayHDR does focus on and does a good job of certifying.


23 minutes ago, LAwLz said:

Where did you get the idea that HDR is only about contrast?

I have repeatedly said mainly, not only. I'm starting to think you hear what you want to hear, not what I'm saying.

 

23 minutes ago, LAwLz said:

It seems like your entire post is built upon the assumption that "HDR = contrast", which is not the case when it comes to monitors. Again, none of the HDR standards out there is only about contrast. All of them include other things, such as bit depth, color gamut and various other metrics, which DisplayHDR does focus on and does a good job of certifying.

I mentioned other specs like PQ tracking, color accuracy, color gamut, black level etc. They're all part of it, yet contrast is still the most important factor in a good HDR experience. That's also the reason why pretty much any reputable reviewer will write off an LCD monitor that lacks local dimming (or only uses edge-lit solutions) as a non-HDR display. And Vesa doesn't even measure it for their HDR quality certification.

 

I'll gladly say it again: most of the DisplayHDR spec is built upon non-standard tests to make it easier to pass. It mainly works as a selling point for brands, not as HDR quality assurance for consumers. There are established ways to objectively measure HDR performance (as I mentioned above), but they deliberately didn't use them.

 

If you want more proof of the certification's failure, just dive deeper and check different Vesa certified HDR monitors. All the way up the ladder to the 600 ("Targets professional/enthusiast-level laptops and high-performance monitors." - Vesa) and 1000 ("Targets professional/enthusiast/content-creator PC monitors." - Vesa) tiers, there are many certified monitors that lack basic HDR hardware like full-array local dimming (FALD).

 

I hope that makes it clear why I distrust them to come up with an actually useful certification.



3 hours ago, LAwLz said:

If you don't think they should test the displays at the maximum refresh rate, then what should they test them at?

Also, how do you expect VESA to create a test that monitor manufactures can't optimize for? 

 

Of course the monitor will perform worse when running at 100Hz vs 240Hz. Lower refresh rate inevitably leads to higher response times. What's wrong with testing the monitor under the best conditions? Should reviewers not review processors with good coolers because they might boost more than for someone with a worse cooler?

"under best conditions" aren't really what they should be made for and not the same use case.

When it comes to gaming, its the overall experience, but for TV/creator content its more about the most optimal/best settings. (would want that too for gaming)

Where you have to at least see how the monitor handles 60fps - 120 - 240, kind of the low, medium and highest setting and use one setting to see how all this combined becomes on a rating in visual clarity. As games will vary and the content the display will be used for and it would suck if your experience is a lot worse just because its a bit below its optimal point. that maybe 99% wont reach or use.
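A hypothetical sketch of that idea (not part of any existing VESA spec, and the usage weights are entirely made up): fold per-refresh-rate clarity scores into one overall rating.

```python
# Hypothetical sketch (not any VESA standard): combine per-refresh-rate
# motion-clarity scores into one overall rating, weighting the refresh
# rates a VRR gaming monitor is assumed to actually spend time at.

def combined_score(scores_by_hz, weights_by_hz):
    """Weighted mean of clarity scores measured at several refresh rates.

    scores_by_hz:  {refresh_rate: clarity score measured at that rate}
    weights_by_hz: {refresh_rate: assumed relative share of real-world use}
    """
    total_weight = sum(weights_by_hz.values())
    return sum(scores_by_hz[hz] * w for hz, w in weights_by_hz.items()) / total_weight

# Example: a 240 Hz panel that is well tuned only at its maximum rate.
scores = {60: 3000, 120: 5000, 240: 9000}   # made-up per-rate scores
weights = {60: 2, 120: 5, 240: 3}           # made-up usage split (relative)
overall = combined_score(scores, weights)
print(overall)  # -> 5800.0
```

A monitor optimized only for its maximum refresh rate would score noticeably lower on such a combined rating than on a max-rate-only test, which is the point Quackers101 is making.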

Edited by Quackers101
Link to comment
Share on other sites

Link to post
Share on other sites

1 hour ago, Stahlmann said:

Where did you get the idea that the Vesa DisplayHDR certification isn't about HDR performance?

I get that you think contrast is the most important thing for HDR and that it is the thing the certification should focus on, but not everyone shares that opinion.

 

Let's look at what HDR10+ defines in its spec:

  • 10 bit color depth or higher (maximum of 16 bits)
  • Up to 8K resolution.
  • A different gamma curve compared to SDR (up to 10k nits and down to 0.0001 nits).
  • 4:2:0 chroma subsampling.
  • Rec. 2020 color space.

 

I get that you don't think high bit depth, resolution, chroma subsampling or color space are relevant to HDR because they can also be used for SDR content, but that is YOUR opinion of what the definition of HDR is. The monitor and video industry does not agree with you. They think HDR is more than just contrast. It's the full package, trying to raise the bar in more aspects than just contrast. That's what DisplayHDR does as well. It's not a certification about contrast. It's a certification that has some contrast elements in it, but takes into account a lot of different factors, some with more focus than others, such as peak brightness, which is what the various tiers are named after.


10 minutes ago, Quackers101 said:

"under best conditions" aren't really what they should be made for and not the same use case.

When it comes to gaming, its the overall experience, but for TV/creator content its more about the most optimal/best settings. (would want that too for gaming)

Where you have to at least see how the monitor handles 60fps - 120 - 240, kind of the low, medium and highest setting and use one setting to see how all this combined becomes on a rating in visual clarity. As games will vary and the content the display will be used for and it would suck if your experience is a lot worse just a bit below its optimal point. that maybe 99% wont reach or use.

But that's like saying "Top Gear should test which car is the fastest, but they are not allowed to drive above the speed limit".

The end result will just be that a lot of monitors will be the same, because you are not allowing them to stretch their legs. 

 

I would not be surprised if no monitor was able to pass even ClearMR 5000 if they were limited to 60Hz.
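For context on what the tier numbers mean: VESA describes the CMR value as the ratio of clear to blurry pixels expressed as a percentage (so 70x more clear than blurry pixels gives CMR 7000), with each tier covering a band around its nominal value. A rough sketch of that mapping follows; the exact band edges and the 9000 cap are my reading of the published table, so check clearmr.org before relying on them.

```python
# Sketch of the ClearMR metric as I understand VESA's description.
# Band edges are an assumption - verify against clearmr.org.

def cmr(clear_pixels, blurry_pixels):
    # CMR: ratio of clear to blurry pixels, expressed as a percentage.
    return 100 * clear_pixels / blurry_pixels

def clearmr_tier(cmr_value):
    if cmr_value < 2500:          # below the lowest published tier (3000)
        return None
    tier = int((cmr_value + 500) // 1000) * 1000
    return min(tier, 9000)        # 9000 was the top tier at launch

print(clearmr_tier(cmr(900, 10)))   # 90x more clear pixels -> 9000
print(clearmr_tier(cmr(500, 10)))   # 50x -> 5000
```

This also illustrates LAwLz's point: halving the effective clear-to-blurry ratio at a lower refresh rate would drop a monitor several whole tiers.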


17 minutes ago, LAwLz said:

I get that you think contrast is the most important thing for HDR and that it is the thing the certification should focus on, but not everyone shares that opinion.

 

Let's look at what HDR10+ defines in its spec:

  • 10 bit color depth or higher (maximum of 16 bits)
  • Up to 8K resolution.
  • A different gamma curve compared to SDR (up to 10k nits and down to 0.0001 nits).
  • 4:2:0 chroma subsampling.
  • Rec. 2020 color space.

 

I get that you don't think high bit depth, resolution, chroma subsampling or color space are relevant to HDR because they can also be used for SDR content, but that is YOUR opinion of what the definition of HDR is. The monitor and video industry does not agree with you. They think HDR is more than just contrast. It's the full package, trying to raise the bar in more aspects than just contrast. That's what DisplayHDR does as well. It's not a certification about contrast. It's a certification that has some contrast elements in it, but takes into account a lot of different factors, some with more focus than others, such as peak brightness, which is what the various tiers are named after.

The HDR10+ spec doesn't account for any kind of quality level of the HDR playback. It merely defines in what format HDR content is created. There is no level of quality associated with HDR10, HDR10+, Dolby Vision, HLG, etc.

 

The intent is not to redefine what HDR video is, but to assure a certain level of playback quality. That's what the DisplayHDR certification tried to achieve with its tiers. And that's where they failed.


Link to comment
Share on other sites

Link to post
Share on other sites

24 minutes ago, LAwLz said:

I get that you think contrast is the most important thing for HDR and that it is the thing the certification should focus on, but not everyone shares that opinion.

 

Let's look at what HDR10+ defines in its spec:

  • 10 bit color depth or higher (maximum of 16 bits)
  • Up to 8K resolution.
  • A different gamma curve compared to SDR (up to 10k nits and down to 0.0001 nits).
  • 4:2:0 chroma subsampling.
  • Rec. 2020 color space.

 

I get that you don't think high bit depth, resolution, chroma subsampling or color space are relevant to HDR because they can also be used for SDR content, but that is YOUR opinion of what the definition of HDR is. The monitor and video industry does not agree with you. They think HDR is more than just contrast. It's the full package, trying to raise the bar in more aspects than just contrast. That's what DisplayHDR does as well. It's not a certification about contrast. It's a certification that has some contrast elements in it, but takes into account a lot of different factors, some with more focus than others, such as peak brightness, which is what the various tiers are named after.

HDR is by definition contrast... High Dynamic Range means MORE f-stops, so you can show more highlights without crushing the shadows, and more shadows without blowing out the highlights.

All those color spaces, gamma curves, 10-bit color depth and the ABILITY of a display to show those at the same time are all PREDICATED on the ability of a screen to show contrast. HDR10 and HDR10+ are SIGNALING standards, not standards for how to display said signal. That is where the VESA DisplayHDR spec (which DOES include contrast) and the UHD Alliance certifications for displays come into play. UHD HDR certification requires a 20,000:1 contrast ratio for LCD and 1,000,000:1 for OLED.

The UHD specs are a minimum peak brightness of 1,000 nits with a black level of at most 0.05 nits (20,000:1 contrast ratio), or a minimum peak brightness of 540 nits with a black level of at most 0.0005 nits (1,080,000:1).
The VESA specs are:

image.thumb.png.d45f72927bb72024795dc33c75c42953.png

SDR content is generally mastered at a 100 nit peak with 0.1 nit blacks... AKA the same as HDR 400... HDR 400 literally has the SAME range as SDR, you can just see it with the lights on in the room. 

It is physically IMPOSSIBLE to show the wider color space HDR requires a display to show, without increasing its contrast ratio.
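The arithmetic behind those numbers is easy to sanity-check. One photographic stop is a doubling of luminance, so the range in stops is log2(white/black). A quick check of the figures quoted in this thread (the HDR 400 pair assumes a panel with the same 1000:1 native contrast as the SDR example):

```python
import math

def contrast_ratio(white_nits, black_nits):
    return white_nits / black_nits

def stops(white_nits, black_nits):
    # One photographic stop = a doubling of luminance.
    return math.log2(white_nits / black_nits)

# White/black luminance pairs quoted above in this thread:
pairs = [
    ("SDR mastering", 100, 0.1),     # 100 nit peak, 0.1 nit black
    ("HDR 400 panel", 400, 0.4),     # assumed 1000:1 native contrast
    ("UHD LCD tier", 1000, 0.05),
    ("UHD OLED tier", 540, 0.0005),
]
for name, w, b in pairs:
    print(f"{name}: {contrast_ratio(w, b):,.0f}:1, {stops(w, b):.1f} stops")
```

The first two lines come out identical (1,000:1, about 10 stops), which is exactly starsmine's point that HDR 400 spans no more range than SDR, just at a higher absolute brightness.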
 


1 minute ago, LAwLz said:

But that's like saying "Top Gear should test which car is the fastest, but they are not allowed to drive above the speed limit".

The end result will just be that a lot of monitors will be the same, because you are not allowing them to stretch their legs. 

 

I would not be surprised if no monitor was able to pass even ClearMR 5000 if they were limited to 60Hz.

Yeah sure, but that just means one has to do more: two ratings instead of one, one for top speed/peak performance and one for a more overall score.

Also, this again isn't like a car, but I agree it would be best to have it run at its top speed and visual quality, if there were a better way to deal with it.


18 minutes ago, starsmine said:

HDR is by definition contrast... High Dynamic Range means MORE f-stops, so you can show more highlights without crushing the shadows, and more shadows without blowing out the highlights.

All those color spaces, gamma curves, 10-bit color depth and the ABILITY of a display to show those at the same time are all PREDICATED on the ability of a screen to show contrast. HDR10 and HDR10+ are SIGNALING standards, not standards for how to display said signal. That is where the VESA DisplayHDR spec (which DOES include contrast) and the UHD Alliance certifications for displays come into play. UHD HDR certification requires a 20,000:1 contrast ratio for LCD and 1,000,000:1 for OLED.

The UHD specs are a minimum peak brightness of 1,000 nits with a black level of at most 0.05 nits (20,000:1 contrast ratio), or a minimum peak brightness of 540 nits with a black level of at most 0.0005 nits (1,080,000:1).
Vesa specs are

image.thumb.png.d45f72927bb72024795dc33c75c42953.png
 

The way VESA measures for their certification is hugely flawed. Again, I urge you to watch these 6 minutes to understand why:

If measured with VESA's methods, the contrast is around 18,000:1, which sounds impressive at first, but when measured with a more standard (and closer-to-reality) method it's around 4,300:1, which is just not enough for HDR.

 

So no, they don't measure static contrast. They measure a best-case white luminance and a best-case black luminance one after the other. That's dynamic contrast, which has next to no relevance in reality.

 



5 hours ago, Stahlmann said:

The HDR10+ spec doesn't account for any kind of quality level of the HDR playback. It merely defines in what format HDR content is created. There is no level of quality associated with HDR10, HDR10+, Dolby Vision, HLG, etc.

You are constantly missing the points I am trying to make.

My point is that HDR, when talking about monitors and video, is not just contrast. The "ecosystem" surrounding HDR touches on way more than just contrast.

 

5 hours ago, Stahlmann said:

The intent is not to redefine what HDR video is, but to assure a certain level of playback quality. That's what the DisplayHDR certification tried to achieve with it's tiers. And that's where they failed.

According to you and your narrow definition that says static contrast is the most important thing.

 

 

5 hours ago, starsmine said:

All those color spaces, gama curves, 10 bit color depth and the ABILITY of a display to display those at the same time are all PREDICATED on the ability of a screen to show contrast. HDR10, and HDR10+ are SIGNALING standards, not the standard on how to display said signal. That is where the vesa DIsplayHDR spec (which DOES include contrast) and UHD Alliance certifications for displays come in to play. UHD HDR certification requires 20k Contrast ratio for LCD and 1m for OLED. 

Are you trying to say that things like a wider color gamut and 10-bit color just exist to enable higher contrast? Because that's how I interpret your post, and it indicates a fundamental lack of understanding of how video signals work.

You can have a bit depth of 1 and still have amazing contrast. You can have an extremely narrow color gamut and still have amazing contrast.

 

There are several aspects of "HDR video" that are completely unrelated to contrast. Color gamut and bit depth being two of them, and both are accounted for not only in the DisplayHDR certification, but also in standards for mastering HDR content.

 

 

5 hours ago, starsmine said:

UHD specs are A minimum brightness of 1,000 nits, along with a black level of a maximum of 0.05 nits (20,000:1 contrast ratio), or a minimum brightness of 540 nits, along with a black level of a maximum of 0.0005 (1,080,000:1).

Yes, and the Ultra HD Premium certification also includes minimum requirements for resolution, color bit depth and color gamut, in addition to specifications that determine contrast, such as peak brightness and black levels. Again, HDR specifications are not just about contrast. The specification you are referring to is proof of this, because it, just like DisplayHDR, also sets minimum requirements for things completely unrelated to contrast.

 

I'll say it again just so that it is hard to miss.

HDR IS NOT JUST ABOUT CONTRAST! THESE COMPANIES AND ORGANISATIONS ARE TRYING TO CREATE ECOSYSTEMS WHICH TOUCH ON MORE THAN JUST CONTRAST! YES, I KNOW DYNAMIC RANGE ORIGINALLY REFERRED TO CONTRAST, BUT THAT IS NOT WHAT IT MEANS IN THIS CONTEXT!

 

Dynamic range, when talking about things like DisplayHDR, Ultra HD Premium, and various signalling standards like HDR10+, refers to various improvements to image quality, not just contrast. That is why whenever you look up HDR content you will see higher bit depth (which has NOTHING to do with contrast), wider color gamut (which has NOTHING to do with contrast) and other non-contrast-related specifications like resolution. Hell, in the case of the UHD spec you mentioned, they also have requirements for which audio formats need to be supported. 

 

Stop thinking about HDR as "high" "dynamic" "range" in a literal sense, and think of it more as "SDR, but better", because that is how the industry is treating it.

Until you and people like Stahlmann realize this, you will both be grumpy about how the industry is "wrong" because you think HDR should mean something different.

 

 

5 hours ago, starsmine said:

It is physically IMPOSSIBLE to show the wider color space HDR requires a display to show, without increasing its contrast ratio.

No it's not. You don't understand what you are talking about.

You can have great contrast with an extremely limited color space. Case in point: black and white images. A black and white image has about as limited a color gamut as you can get, and yet it can have infinite contrast.

You can also have terrible contrast but a really wide color gamut. This is way less common because you need fairly high end monitors for a wide color gamut, and at that point they typically have good contrast as well.

 

But contrast and color gamut are two completely different things that are unrelated to one another. One is not dependent on the other. You could create a monitor that has terrible contrast, but can display a very wide range of red and green colors. Just look up HSV for an example of this. In that representation of RGB, you can change the brightness without affecting hue at all.

Same with the other aspects of DisplayHDR. Color bit depth has nothing to do with contrast, yet it is a very important part of the DisplayHDR certification.

 

 

 

5 hours ago, Quackers101 said:

Yeah sure, but that just means one has to do more: two ratings instead of one, one for top speed/peak performance and one for a more overall score.

Also, this again isn't like a car, but I agree it would be best to have it run at its top speed and visual quality, if there were a better way to deal with it.

Got any examples of what those tests would look like? And don't you think having two or more ratings would just confuse consumers even more? All of a sudden we might end up in a situation where a really high refresh rate monitor has like 4 different ClearMR ratings because it was tested at 4 different refresh rates, while a 60Hz monitor might only have 1 rating.

And then we have the issue that the rating will inevitably be lower the lower the refresh rate is. So a 60Hz monitor might have the same score as one of the scores on the 240Hz monitor. That would just confuse people even more.

I think letting the monitors perform in a best case scenario and then judge them based on that is the most consistent, easiest and least confusing way of dealing with this. Is it perfect? No, but no test will ever be perfect. 


19 hours ago, starsmine said:

SDR content is generally mastered at a 100 nit peak with 0.1 nit blacks... AKA the same as HDR 400... HDR 400 literally has the SAME relative contrast as SDR, you can just see it with the lights on in the room. 

Our eyes aren't linear, so in this regard I could imagine that 400:0.4 has a higher perceived contrast than 100:0.1 simply because of brightness.
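One way to put rough numbers on that intuition is the PQ curve (SMPTE ST 2084) that HDR10 uses, which encodes luminance on a roughly perceptually uniform scale. The constants below are from the ST 2084 spec; using the PQ signal span as a stand-in for perceived contrast is my simplification, not a proper psychovisual model.

```python
# PQ inverse EOTF (SMPTE ST 2084) constants.
m1 = 2610 / 16384
m2 = 2523 / 4096 * 128
c1 = 3424 / 4096
c2 = 2413 / 4096 * 32
c3 = 2392 / 4096 * 32

def pq_encode(nits):
    """Map absolute luminance (0..10000 nits) to a PQ signal value in 0..1."""
    y = nits / 10000
    return ((c1 + c2 * y**m1) / (1 + c3 * y**m1)) ** m2

# Same 1000:1 ratio, different absolute levels:
span_sdr    = pq_encode(100) - pq_encode(0.1)   # ~0.45 of the PQ range
span_hdr400 = pq_encode(400) - pq_encode(0.4)   # ~0.54 of the PQ range
print(span_sdr, span_hdr400)
```

The brighter pair spans more of the perceptually spaced PQ signal, which is consistent with tikker's guess that 400:0.4 looks like more contrast than 100:0.1 despite the identical ratio.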

  

18 hours ago, Stahlmann said:

The way Vesa measures for their certification is hugely flawed. Again, i urge you to watch these 6 minutes to understand why:

If measured with Vesa's methods the contrast is around 18.000:1 which sounds impressive at first, but when measured in a more standard (and closer to reality) method it's around 4300:1, which is just not enough for HDR.

 

So no, they don't measure static contrast. They measure a best-case white luminance and a best case black luminance one after the other. That's dynamic contrast, which has next to no impact in reality.

 

The 10% test is fine I guess, but man, that "checkerboard test" is absolute nonsense. I expected better. VESA had a golden opportunity here to finally establish some form of "true HDR" reference for consumers. Shame it's tainted with questionable measurements. I am curious about one thing though. Unless I misunderstood or missed it in the video, how does it reach 1,000 nits in VESA's 10% test but not in his own 10% test? Are there some Samsung-level shenanigans going on?

14 hours ago, LAwLz said:

You can have a bit depth of 1 and still have amazing contrast. You can have an extremely narrow color gamut and still have amazing contrast.

You'd have terrible image quality though, because your 1 bit cannot capture gradients sufficiently. In a higher dynamic range you need to represent not just a more fine-grained brightness gradient in terms of light output, but also more shades of the same colour to accurately capture that gradient, so a higher bit depth naturally benefits you there.
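To put rough numbers on the gradient point: with PQ encoding (SMPTE ST 2084 constants below), one can count how many code values are available for just the 0-1 nit shadow region at different bit depths. Treating code-value count as a proxy for gradient smoothness is my simplification.

```python
# PQ inverse EOTF (SMPTE ST 2084) constants.
m1 = 2610 / 16384
m2 = 2523 / 4096 * 128
c1 = 3424 / 4096
c2 = 2413 / 4096 * 32
c3 = 2392 / 4096 * 32

def pq_encode(nits):
    """Map absolute luminance (0..10000 nits) to a PQ signal value in 0..1."""
    y = nits / 10000
    return ((c1 + c2 * y**m1) / (1 + c3 * y**m1)) ** m2

def shadow_codes(bits):
    # Signal span for 0..1 nit, times the number of quantization steps.
    return pq_encode(1.0) * (2**bits - 1)

print(f"8-bit:  ~{shadow_codes(8):.0f} code values for the 0-1 nit shadows")
print(f"10-bit: ~{shadow_codes(10):.0f} code values for the 0-1 nit shadows")
```

Roughly four times as many steps in the deep shadows at 10 bits, which is where banding is most visible.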

13 hours ago, LAwLz said:

Stop thinking about HDR as "high" "dynamic" "range" in a literal sense, and think of it more as "SDR, but better", because that is how the industry is treating it.

Until you and people like Stahlmann realize this you will both be rumpy about how the industry is "wrong" because you think HDR should mean something different.

Being "<old thing> but better" is the point of a succeeding new thing though. We don't label UHD as "FHD-but-better" and not in a literal sense having UHD resolution either. HDR is the "SDR-but-better", but it goes beyond what we now call SDR so it's not called SDR-but-better. I think you could even argue that the SDR-but-better already exists as e.g. Rec. 2020 which makes no mention of HDR and has Rec. 2100 as the HDR version as far as I know.

 

Yes the formats/standards include more than pure contrast, but if you can't create enough contrast for your specular highlight (i.e. can't create the high dynamic range), you can't render the HDR picture as intended, so I side with the notion that contrast should be an important factor for a certification about HDR performance. For me it is reinforced that it is an important factor, besides the name, by the arguably misleading way of measuring contrast by VESA here, the increase in peak brightness with the HDR formats and the age-old marketing obsession with contrast and now also peak brightness.

Crystal: CPU: i7 7700K | Motherboard: Asus ROG Strix Z270F | RAM: GSkill 16 GB@3200MHz | GPU: Nvidia GTX 1080 Ti FE | Case: Corsair Crystal 570X (black) | PSU: EVGA Supernova G2 1000W | Monitor: Asus VG248QE 24"

Laptop: Dell XPS 13 9370 | CPU: i5 10510U | RAM: 16 GB

Server: CPU: i5 4690k | RAM: 16 GB | Case: Corsair Graphite 760T White | Storage: 19 TB


On the "what happens at lower fps" question, the following may be a consideration. It is mainly about AdaptiveSync, but you could argue depending on the outcome they could expand it to other scenarios like this one too.

 

 

Gaming system: R7 7800X3D, Asus ROG Strix B650E-F Gaming Wifi, Thermalright Phantom Spirit 120 SE ARGB, Corsair Vengeance 2x 32GB 6000C30, RTX 4070, MSI MPG A850G, Fractal Design North, Samsung 990 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Productivity system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, 64GB ram (mixed), RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, random 1080p + 720p displays.
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible


1 hour ago, tikker said:

You'd have terrible image quality though, because your 1 bit cannot capture gradients sufficiently. In a higher dynamic range you need to represent not just a more fine-grained brightness gradient in terms of light output, but also more shades of the same colour to accurately capture that gradient, so a higher bit depth naturally benefits you there.

Yes, which is exactly my point.

Someone was trying to argue that the only thing that mattered in HDR is contrast, and that everything else was just a way to achieve higher contrast.

 

My argument is that it isn't, and that things like bit depth are also important, not because they increase contrast but because they increase image quality. 

 

 

 

1 hour ago, tikker said:

Being "<old thing> but better" is the point of a succeeding new thing though. We don't label UHD as "FHD-but-better" and not in a literal sense having UHD resolution either. HDR is the "SDR-but-better", but it goes beyond what we now call SDR so it's not called SDR-but-better. I think you could even argue that the SDR-but-better already exists as e.g. Rec. 2020 which makes no mention of HDR and has Rec. 2100 as the HDR version as far as I know.

That's exactly my point.

People are arguing that HDR only means higher contrast, and nothing else is included in the definition of "HDR". My argument is that the "HDR" that is being pushed by the industry refers to "SDR-but-better". Some people are trying to argue that HDR isn't "SDR-but-better" because to them, HDR is ONLY contrast. If something has high contrast, then it's HDR. 

 

When I said Rec. 2020 I was referring to the color space aspect of Rec. 2020. As far as I know, both Rec. 2020 and Rec. 2100 use the same color space and white point, so for this particular discussion they are more or less the same.

 

 

 

1 hour ago, tikker said:

Yes the formats/standards include more than pure contrast, but if you can't create enough contrast for your specular highlight (i.e. can't create the high dynamic range), you can't render the HDR picture as intended, so I side with the notion that contrast should be an important factor for a certification about HDR performance. For me it is reinforced that it is an important factor, besides the name, by the arguably misleading way of measuring contrast by VESA here, the increase in peak brightness with the HDR formats and the age-old marketing obsession with contrast and now also peak brightness.

Yes I agree. I would also like for VESA to change the way that particular measurement is done, and put more focus on contrast.

The things I object to are:

1) People who say HDR is only about contrast and nothing else.

2) The people who say the certification is worthless because it doesn't focus on contrast.

 

If it were so worthless, we would have a ton of monitors on the market with really high certifications. But we don't, because the other requirements are pretty damn high, so most monitors fail.


6 hours ago, LAwLz said:

 

That's exactly my point.

People are arguing that HDR only means higher contrast, and nothing else is included in the definition of "HDR". My argument is that the "HDR" that is being pushed by the industry refers to "SDR-but-better". Some people are trying to argue that HDR isn't "SDR-but-better" because to them, HDR is ONLY contrast. If something has high contrast, then it's HDR. 

Literally no one argued that

