
Does the perceived brightness of a display vary with distance from it?

avg123

HDR capable TVs need to display a minimum amount of brightness among other things. If a phone is made HDR capable, would the brightness requirement be lower since we hold the screen much closer compared to a TV?


While there are different HDR brightness targets for phones vs. TVs, that's mainly due to the size of the screen and has nothing to do with how far away you are from it. The light will be the same brightness whether you're 1 cm or 10 m from it; all that changes is its diffusion. What determines the HDR values and standards is how much range there is between the darkest colour a screen can display and the brightest it can display.

 

You will generally find that a phone will have a lower peak brightness but also a much darker black level, whereas TVs will have a higher peak brightness (in some cases double that of a phone) but will also not have true black. It's still HDR, but there's no reason for a phone to have a 1000-nit screen, since it would drain the battery so fast.
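
To put rough numbers on that trade-off, here's a small sketch with illustrative, made-up nit values rather than measurements of any particular display:

```python
import math

# Illustrative, made-up numbers (nits / cd/m^2) -- not measurements of real displays.
displays = {
    "phone OLED": {"peak": 600.0,  "black": 0.0005},  # dimmer, but near-true black
    "TV LCD":     {"peak": 1200.0, "black": 0.05},    # brighter peak, raised blacks
}

for name, d in displays.items():
    contrast = d["peak"] / d["black"]   # static contrast ratio
    stops = math.log2(contrast)         # the same range expressed in stops
    print(f"{name:10}  contrast ~ {contrast:,.0f}:1   ~ {stops:.1f} stops")
```

The dimmer panel with the deeper blacks can still end up with far more dynamic range, which is why peak brightness alone isn't the whole HDR story.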

 

 



2 hours ago, avg123 said:

HDR capable TVs need to display a minimum amount of brightness among other things. If a phone is made HDR capable, would the brightness requirement be lower since we hold the screen much closer compared to a TV?

While it makes sense physically, since air does affect light (blocking, absorbing, scattering it), in practice it doesn't matter because the effect is far too small at the distances you'd typically be viewing a display from.

 

To put it in perspective, you wouldn't be able to see the stars if the effect were really that bad. That said, there is merit to putting telescopes in space, or at the very least at higher altitude, precisely because there's less air to interfere with whatever you're trying to pick up from space.

Link to comment
Share on other sites

Link to post
Share on other sites

1 hour ago, M.Yurizaki said:

While it makes sense physically, since air does affect light (blocking, absorbing, scattering it), in practice it doesn't matter because the effect is far too small at the distances you'd typically be viewing a display from.

*sighs*

 

Yes, yes it does, and it does so SIGNIFICANTLY; it's called the 'inverse-square law'.

 

https://en.wikipedia.org/wiki/Inverse-square_law

 

In short, if your distance from a light source is doubled, then your subject (in this case, your eyeballs) receives only one quarter of the light. This is a basic lesson in photography, film production, or just 11th-grade science class. >.< Increase the distance by 4x? You get 1/16th of the light. And so on.
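
If it helps to see the numbers, here's a quick sketch of that relation, nothing display-specific, just the 1/d² falloff for an idealised point source:

```python
# Inverse-square law for a point source: received light scales as 1/d^2,
# so doubling the distance gives 1/4 the light, 4x the distance gives 1/16, etc.
def relative_illuminance(distance, reference=1.0):
    """Light received at `distance` relative to what's received at `reference`."""
    return (reference / distance) ** 2

for factor in (1, 2, 4, 8, 24):
    print(f"{factor:>2}x the distance -> 1/{factor**2} of the light "
          f"({relative_illuminance(factor):.6f})")
```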

 

And no, it has NOTHING to do with the air causing absorption or scatter or anything like that. That's an ADDITIONAL factor. The inverse-square law assumes light is operating in a vacuum; adding any kind of atmosphere creates additional challenges for the transmission of light.

Link to comment
Share on other sites

Link to post
Share on other sites

While I appreciate the science tidbit, 11th grade science class for me was chemistry and I didn't take classes/lessons in photography or film production so ¯\_(ツ)_/¯

 

Anyway, the question is still up in the air: does the perceived brightness vary with distance? Because taking the inverse-square law at face value, there should be perceptible differences in apparent brightness from barely moving at all. As a not-so-scientific test, I looked at my monitor from up close and then from a good distance away. The brightness still looks pretty much the same. And I don't think my pupils reacting would have any effect, since I did this in a well-lit room and the monitor's brightness isn't cranked up. I mean, if I had my nose up against the monitor (give or take two inches) and then walked away about four feet, that's 24 times the distance, and according to the inverse-square law about 1/576 of the light should be reaching my eyes. But the monitor's brightness still looks the same.

 

I went around trying to figure out what would explain that. It's Fechner's law, or at least it was; these days the relationship is thought to be better described by a power function (Stevens' power law). And then there are other factors, like how focused the light is. So sure, the light actually reaching your eyes from a display varies with distance, but the perceived brightness might not change dramatically, if at all.
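
For a rough feel of how strongly a power function compresses things, here's a sketch; the exponent of about 0.33 sometimes quoted for brightness is an assumption here, and real viewing conditions would shift it:

```python
# Sketch of the power-function idea (Stevens' power law): perceived magnitude
# is roughly k * intensity^a. The exponent a = 0.33 is a commonly quoted
# textbook value for brightness and is an assumption here, not a measurement.
def perceived(intensity, a=0.33, k=1.0):
    return k * intensity ** a

base = 100.0                       # arbitrary physical intensity
for divisor in (1, 4, 16, 576):    # 576 = the 24x-distance case above
    ratio = perceived(base / divisor) / perceived(base)
    print(f"physical 1/{divisor:<4} -> perceived ~ {ratio:.2f} of the original")
```

Even a 576x drop in physical intensity only comes out to roughly an 8x drop in perceived magnitude under that exponent, so perception squashes big physical differences quite a lot.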

 

However for craps and laughs I decided to take pictures of my monitor at varying distances with the camera in manual mode so the settings were locked down. And again, if I were to take the inverse square law at face value, there should be noticeable differences in the brightness of these images (they were scaled to be roughly the same resolution):

 

[Attached image: photos of the monitor taken at the varying distances with the camera settings locked]

 

So the inverse-square law holds up if you're trying to illuminate something with a light source: the light falling on the subject really does drop off with distance. But if you're looking at the light source directly, the screen's image on your retina shrinks by the same factor as the light your eye collects, so the brightness per unit area of that image, which seems to be what we actually perceive, stays about the same.
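
As a back-of-the-envelope check of that cancellation (with my own made-up numbers and small-angle simplifications, so treat it as a sketch rather than a measurement), the 1/d² loss in light collected by the pupil and the 1/d² shrinkage of the retinal image cancel out:

```python
import math

# Back-of-the-envelope, small-angle sketch with made-up numbers: the light the
# pupil collects from a screen patch falls off as 1/d^2, but the patch's image
# on the retina shrinks as 1/d^2 too, so light per unit retinal area is
# roughly independent of viewing distance.
L = 300.0                             # assumed screen luminance, cd/m^2
patch = 0.01                          # screen patch area, m^2 (10 cm x 10 cm)
pupil = math.pi * (0.004 / 2) ** 2    # assumed 4 mm pupil, area in m^2
f_eye = 0.017                         # rough focal length of the eye, m

for d in (0.05, 0.5, 2.0, 5.0):              # viewing distances, m
    collected = L * patch * pupil / d ** 2   # light entering the eye (~1/d^2)
    image = patch * (f_eye / d) ** 2         # retinal image area (~1/d^2)
    per_area = collected / image             # the two 1/d^2 factors cancel
    print(f"d = {d:>4} m   collected ~ {collected:.2e}   per retinal area ~ {per_area:.1f}")
```

A camera with its exposure locked behaves the same way for an extended source, which would explain why the photos come out looking about equally bright.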

 

I also welcome any more explanations, preferably without the subtle insults.

