
Why 200 nit calibration?

e22big

When reviewing a monitor or TV, I often see people calibrate it to 200 nits as a reference point when testing other areas of its performance (contrast, power consumption, etc.).

 

I wonder why that is, though. Why exactly 200 nits and not more or less? It doesn't sound like a standard brightness that most people use (or do they?)


Everyone has to set the brightness to the value they are most comfortable with. For me it ended up at 150 nits. This value is not too dim for daytime viewing and not too bright for night time viewing. You'll have to play around with the brightness slider until you find something comfortable. The result will change for each person and each room. I recently did a calibration for a friend and because his room is rather dark even at daytime, his comfortable brightness ended up at 80 nits.
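For what it's worth, hitting a specific number like 150 nits rather than eyeballing it just means measuring the white level while nudging the monitor's brightness control. Below is a minimal sketch of that loop, assuming a monitor that accepts DDC/CI commands via ddcutil (VCP code 10 is the standard brightness control); the luminance "reading" is a stand-in you type in from whatever meter or software you actually use.

```python
# A minimal sketch, not a finished tool: walk the monitor's brightness control
# toward a target white level. Brightness is set over DDC/CI with ddcutil
# ("setvcp 10" is the standard VCP brightness code); the luminance reading
# here is just typed in from whatever colorimeter/software you have.
import subprocess

TARGET_NITS = 150.0   # the comfortable daytime value mentioned above
TOLERANCE = 2.0       # close enough: +/- 2 nits

def set_brightness(percent: int) -> None:
    """Set the OSD brightness control (0-100) via DDC/CI."""
    subprocess.run(["ddcutil", "setvcp", "10", str(percent)], check=True)

def read_luminance() -> float:
    """Stand-in for a colorimeter reading: type in what your meter shows."""
    return float(input("Measured white luminance in nits: "))

def calibrate(lo: int = 0, hi: int = 100) -> int:
    """Binary-search the brightness setting until the measured white point
    lands within TOLERANCE of TARGET_NITS (brightness vs. nits is monotonic)."""
    while lo <= hi:
        mid = (lo + hi) // 2
        set_brightness(mid)
        measured = read_luminance()
        if abs(measured - TARGET_NITS) <= TOLERANCE:
            return mid
        if measured < TARGET_NITS:
            lo = mid + 1
        else:
            hi = mid - 1
    return max(lo, 0)

if __name__ == "__main__":
    print("Settled on brightness =", calibrate())
```

In practice a tool like DisplayCAL handles this interactively while you move the slider yourself; the sketch just shows the idea.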

 

Also, the lower you can get your brightness, the more comfortable it will be for your eyes.

 

There is no real "standard value". Some reviewers measure while calibrated to 200 nits (Hardware Unboxed), some do it at 100 nits (Rtings). It really doesn't matter.

 

When you set your monitor dimmer, it might look too dim at first. Just let your eyes adjust for a few minutes and then decide whether to go brighter or dimmer.

 

Of course, capping your brightness to such low values only applies to SDR. For HDR you should keep it at 100% and let the PQ mapping decide how bright it should be. In HDR, the content creator decides at what brightness the content should be displayed.
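To make that concrete, here is a small sketch (my own illustration, not anything from a spec document) of the SMPTE ST 2084 "PQ" EOTF that HDR signals use: each code value maps to an absolute luminance in nits, which is why the display and the content, not a user slider, are supposed to decide how bright a given pixel is.

```python
# Sketch of the SMPTE ST 2084 (PQ) EOTF: map a normalized HDR signal value
# to an absolute luminance in nits. The constants are the ones defined in
# the standard.
M1 = 2610 / 16384            # 0.1593017578125
M2 = 2523 / 4096 * 128       # 78.84375
C1 = 3424 / 4096             # 0.8359375
C2 = 2413 / 4096 * 32        # 18.8515625
C3 = 2392 / 4096 * 32        # 18.6875

def pq_eotf(signal: float) -> float:
    """Map a normalized PQ signal (0.0-1.0) to luminance in nits (cd/m^2)."""
    e = signal ** (1 / M2)
    y = max(e - C1, 0.0) / (C2 - C3 * e)
    return 10000.0 * y ** (1 / M1)

if __name__ == "__main__":
    # 10-bit code 769 is roughly the 1000-nit point on the PQ curve.
    for code in (0, 512, 769, 1023):
        print(f"10-bit code {code:4d} -> {pq_eotf(code / 1023):8.1f} nits")
```

For example, a signal of 0.5 works out to roughly 92 nits, while the maximum code value maps to the format's 10,000-nit ceiling.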



200 is way too bright for me; mine is actually around 120-160 nits. Of course it's personal preference. Color accuracy at the brightness you're comfortable with is important to some, and there are also people who like their monitors oversaturated 😆



4 hours ago, Stahlmann said:


Guess I'm in the minority then. I actually prefer my monitor at at least 300 nits, preferably 350, although I recently tried to tone it down to 200 nits to save on my electric bill.


1 minute ago, e22big said:


No, you're not in the minority. Like I said, everyone has to set it how they need or want it. There is literally no drawback or advantage to higher or lower brightness when we're talking about SDR, other than eye strain. And even that varies from person to person, with some being affected more than others.


