
HDR brightness to properly simulate a sun on screen

Stahlmann

Hey good people,

 

I've been thinking: how bright would an HDR display actually have to be to properly simulate a blinding sun on screen?

 

I know there are screens like the Vizio P-Series Quantum X with over 2,000 nits of peak brightness, but I've never actually seen such a bright panel in real life, so I have no personal context for how bright that actually looks to the human eye.

The brightest displays I have personally seen are my LG 27GN950, which reaches about 750 nits in HDR, and my LG C9, which can reach about 850 nits. With both of these displays I have never had a moment in HDR content where I actually felt blinded or had the reflex to close my eyes.

 

I know some movies are mastered at up to 4,000 nits. How close does that get to answering my question about the sun?

 

I'd like to hear your thoughts about this!

If someone did not use reason to reach their conclusion in the first place, you cannot use reason to convince them otherwise.


It really depends on your environment: a flash of 1,000 nits in a pitch-black room will blind you, but in a normal environment with sunlight it won't.


1 minute ago, ki8aras said:

It really depends on your environment: a flash of 1,000 nits in a pitch-black room will blind you, but in a normal environment with sunlight it won't.

Normally I watch movies in a completely dark room, and with an OLED screen that's pretty much the way to go. But the sun will blind you even when you're outside, and the screens I have, at their peak brightness, don't do that in the slightest when it's daytime outside.



1 minute ago, Stahlmann said:

Normally I watch movies in a completely dark room, and with an OLED screen that's pretty much the way to go. But the sun will blind you even when you're outside, and the screens I have, at their peak brightness, don't do that in the slightest when it's daytime outside.

Yeah, but the sun is a hot ball of plasma 150 million km from Earth, beaming enough energy at us to support human life,

while your display draws a couple hundred watts at most.


6 minutes ago, ki8aras said:

Yeah, but the sun is a hot ball of plasma 150 million km from Earth, beaming enough energy at us to support human life,

while your display draws a couple hundred watts at most.

I know, but I'm just interested in the theoretical part of it: how bright would a screen actually have to be? I know it's not a realistic expectation to have a screen that IS that bright.



You're talking about the actual solar disk?

Up to 1.6 billion nits, apparently.

 

https://en.wikipedia.org/wiki/Orders_of_magnitude_(luminance)
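For scale, a quick back-of-the-envelope comparison (assuming the ~1.6 billion cd/m² figure for the solar disk from that table; the display numbers are the peak-brightness values mentioned in this thread):

```python
import math

# Approximate luminance of the solar disk, from the Wikipedia
# orders-of-magnitude table linked above, vs. display peak-brightness
# figures mentioned in this thread.
SUN_DISK_NITS = 1.6e9

displays = {
    "LG 27GN950 (~750 nits)": 750,
    "LG C9 (~850 nits)": 850,
    "Vizio P-Series Quantum X (~2000 nits)": 2000,
    "HDR10 PQ signal ceiling (10000 nits)": 10_000,
}

for name, nits in displays.items():
    ratio = SUN_DISK_NITS / nits
    stops = math.log2(ratio)  # one "stop" = a doubling of luminance
    print(f"{name}: sun is {ratio:,.0f}x brighter ({stops:.1f} stops)")
```

Even a 10,000-nit display would still be about 17 stops short of the solar disk, which is why no current screen triggers the blink reflex the way the real sun does.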

F@H
Desktop: i9-13900K, ASUS Z790-E, 64GB DDR5-6000 CL36, RTX3080, 2TB MP600 Pro XT, 2TB SX8200Pro, 2x16TB Ironwolf RAID0, Corsair HX1200, Antec Vortex 360 AIO, Thermaltake Versa H25 TG, Samsung 4K curved 49" TV, 23" secondary, Mountain Everest Max

Mobile SFF rig: i9-9900K, Noctua NH-L9i, Asrock Z390 Phantom ITX-AC, 32GB, GTX1070, 2x1TB SX8200Pro RAID0, 2x5TB 2.5" HDD RAID0, Athena 500W Flex (Noctua fan), Custom 4.7l 3D printed case

 

Asus Zenbook UM325UA, Ryzen 7 5700u, 16GB, 1TB, OLED

 

GPD Win 2


58 minutes ago, Stahlmann said:

I know, but I'm just interested in the theoretical part of it: how bright would a screen actually have to be? I know it's not a realistic expectation to have a screen that IS that bright.

I mean, you're talking relative to Earth, which is something that can be calculated...

Quote

Earth receives around 133,200 lumens per square metre from the Sun, assuming this square metre has the Sun directly overhead. We can also divide this figure by the total power from the Sun: 133,200 / 1,366 ≈ 97.5 lumens per watt. This is called luminous efficacy, the lighting efficiency of the Sun per watt of power.

https://everythingwhat.com/how-many-lumens-is-the-sun-on-earth

 

Sure it's bright, but it's nothing we couldn't overcome with enough power.
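Sanity-checking the arithmetic in that quote (using the two figures from the quote itself, not independent measurements):

```python
# Sanity check of the quoted figures: ~133,200 lm/m^2 of sunlight with
# the Sun overhead, divided by the ~1,366 W/m^2 solar constant, gives
# the Sun's luminous efficacy.
illuminance_lux = 133_200   # lumens per square metre at the surface
solar_constant = 1_366      # watts per square metre

efficacy = illuminance_lux / solar_constant
print(f"{efficacy:.1f} lumens per watt")  # ~97.5 lm/W
```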

Main Rig:-

Ryzen 7 3800X | Asus ROG Strix X570-F Gaming | 16GB Team Group Dark Pro 3600Mhz | Corsair MP600 1TB PCIe Gen 4 | Sapphire 5700 XT Pulse | Corsair H115i Platinum | WD Black 1TB | WD Green 4TB | EVGA SuperNOVA G3 650W | Asus TUF GT501 | Samsung C27HG70 1440p 144hz HDR FreeSync 2 | Ubuntu 20.04.2 LTS |

 

Server:-

Intel NUC running Server 2019 + Synology DSM218+ with 2 x 4TB Toshiba NAS Ready HDDs (RAID0)


Well, movies are mastered at 4,000 cd/m², and some at 10,000 cd/m², but obviously people don't have such displays. At least not yet.

We'll definitely need a new display tech like MicroLED to push visual and HDR quality even further.
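Those mastering levels live on the SMPTE ST 2084 "PQ" curve used by HDR10, which encodes absolute luminance up to a hard ceiling of 10,000 cd/m². A minimal sketch of the inverse EOTF (luminance in, normalized signal out), using the constants from the standard:

```python
# SMPTE ST 2084 (PQ) inverse EOTF: absolute luminance -> encoded signal
# in [0, 1]. Constants are the exact rationals from the standard.
m1 = 2610 / 16384          # 0.1593017578125
m2 = 2523 / 4096 * 128     # 78.84375
c1 = 3424 / 4096           # 0.8359375
c2 = 2413 / 4096 * 32      # 18.8515625
c3 = 2392 / 4096 * 32      # 18.6875

def pq_encode(nits):
    """Encode an absolute luminance (cd/m^2) as a PQ signal in [0, 1]."""
    y = min(nits, 10_000) / 10_000   # PQ tops out at 10,000 nits
    num = c1 + c2 * y ** m1
    den = 1 + c3 * y ** m1
    return (num / den) ** m2

for nits in (100, 1000, 4000, 10_000):
    print(f"{nits:>6} nits -> PQ signal {pq_encode(nits):.3f}")
```

A 4,000-nit master therefore doesn't even use the top of the signal range; the curve reserves codes all the way up to 10,000 nits.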

| Ryzen 7 7800X3D | AM5 B650 Aorus Elite AX | G.Skill Trident Z5 Neo RGB DDR5 32GB 6000MHz C30 | Sapphire PULSE Radeon RX 7900 XTX | Samsung 990 PRO 1TB with heatsink | Arctic Liquid Freezer II 360 | Seasonic Focus GX-850 | Lian Li Lanccool III | Mousepad: Skypad 3.0 XL / Zowie GTF-X | Mouse: Zowie S1-C | Keyboard: Ducky One 3 TKL (Cherry MX-Speed-Silver)Beyerdynamic MMX 300 (2nd Gen) | Acer XV272U | OS: Windows 11 |


On 2/26/2021 at 4:45 PM, Doobeedoo said:

Well, movies are mastered at 4,000 cd/m², and some at 10,000 cd/m², but obviously people don't have such displays. At least not yet.

We'll definitely need a new display tech like MicroLED to push visual and HDR quality even further.

Yeah, MicroLED is likely still a few years off, and then it'll need a few more years to get cheap enough for the mainstream. I wouldn't expect any significant MicroLED TV in the consumer space within the next 8 years. For now the battle is still between LED-backlit LCD and OLED.

 

At least the new generation of OLEDs will get a lot brighter. One of the next Sony OLEDs will apparently reach up to 1,300 cd/m² peak brightness. And as I already said, the Vizio P-Series Quantum X already exceeds 2,000 cd/m² peak brightness.

 

Man, I'd really like to experience the Vizio just once in a pitch-black room to see what that kind of HDR looks like.



3 hours ago, Stahlmann said:

Yeah, MicroLED is likely still a few years off, and then it'll need a few more years to get cheap enough for the mainstream. I wouldn't expect any significant MicroLED TV in the consumer space within the next 8 years. For now the battle is still between LED-backlit LCD and OLED.

At least the new generation of OLEDs will get a lot brighter. One of the next Sony OLEDs will apparently reach up to 1,300 cd/m² peak brightness. And as I already said, the Vizio P-Series Quantum X already exceeds 2,000 cd/m² peak brightness.

Man, I'd really like to experience the Vizio just once in a pitch-black room to see what that kind of HDR looks like.

Yeah, it will take time... For monitors, though, we'll be limited to LCD for a while. I even doubt we'll see many regular-sized high-refresh-rate HDR OLED monitors. OLED tech can still improve, but its organic nature will always pose a burn-in risk.



7 minutes ago, Doobeedoo said:

Yeah, it will take time... For monitors, though, we'll be limited to LCD for a while. I even doubt we'll see many regular-sized high-refresh-rate HDR OLED monitors. OLED tech can still improve, but its organic nature will always pose a burn-in risk.

LG has announced a 32" 4K OLED monitor for 2021. Sadly it's only 60 Hz and only gets the DisplayHDR 400 True Black certification, so it doesn't get that bright either. That's likely to prevent burn-in, as its severity depends directly on the OLED's brightness. Not to mention the outlandish price tag. (I'm getting a little off-topic here 😄)

 

It'll be interesting to see how these brighter OLEDs fare in terms of burn-in risk. At least my C9 OLED doesn't show any signs of burn-in after about a year of use, including a lot of use as a PC monitor.



13 minutes ago, Stahlmann said:

LG has announced a 32" 4K OLED monitor for 2021. Sadly it's only 60 Hz and only gets the DisplayHDR 400 True Black certification, so it doesn't get that bright either. That's likely to prevent burn-in, as its severity depends directly on the OLED's brightness. Not to mention the outlandish price tag. (I'm getting a little off-topic here 😄)

It'll be interesting to see how these brighter OLEDs fare in terms of burn-in risk. At least my C9 OLED doesn't show any signs of burn-in after about a year of use, including a lot of use as a PC monitor.

Yeah, I know there are some OLED monitors, but no 1,000+ nit or 240 Hz ones like we've seen from LCD. I can only imagine the burn-in on an OLED at that brightness with static UIs on it all day, oof. On the other hand, I'm surprised we haven't seen a single very-high-refresh-rate model, given that OLED is faster than LCD.

Man, I'd be scared to use those 4K 120 Hz monitors/TVs as a monitor all day with static desktop/game elements. But it's good to hear no issues so far.



Just now, Doobeedoo said:

Yeah, I know there are some OLED monitors, but no 1,000+ nit or 240 Hz ones like we've seen from LCD. I can only imagine the burn-in on an OLED at that brightness with static UIs on it all day, oof. On the other hand, I'm surprised we haven't seen a single very-high-refresh-rate model, given that OLED is faster than LCD.

Man, I'd be scared to use those 4K 120 Hz monitors/TVs as a monitor all day with static desktop/game elements. But it's good to hear no issues so far.

Burn-in is basically not a big problem if you view mixed content. I don't use my TV exclusively as a PC monitor: sometimes I watch movies, sometimes I play games. For most PC games I use my other 4K 144 Hz monitor; the OLED is mostly used for single-player games like AC: Odyssey, and in those games you can often disable static UI elements you don't need. Burn-in is very low-risk with modern OLEDs, and it makes me sad that it's enough of a reason for many people to steer away from OLED. The picture is just next-level compared to LED-backlit LCD.



12 minutes ago, Stahlmann said:

Burn-in is basically not a big problem if you view mixed content. I don't use my TV exclusively as a PC monitor: sometimes I watch movies, sometimes I play games. For most PC games I use my other 4K 144 Hz monitor; the OLED is mostly used for single-player games like AC: Odyssey, and in those games you can often disable static UI elements you don't need. Burn-in is very low-risk with modern OLEDs, and it makes me sad that it's enough of a reason for many people to steer away from OLED. The picture is just next-level compared to LED-backlit LCD.

That may be true with mixed content and multiple displays like that. But if I only had a single display, it would see much more use, and in my case I'd be worried long-term: my PC is on for long stretches every day, and whether it's desktop work or entertainment, there's a lot of static UI. The same goes for many games I play for long sessions. For such a high-end display investment I'd expect up to a decade of use, and I don't want burned-in UIs 😄

But no doubt the image quality is next-level over any LCD.



1 hour ago, Stahlmann said:

Burn-in is basically not a big problem if you view mixed content. I don't use my TV exclusively as a PC monitor: sometimes I watch movies, sometimes I play games. For most PC games I use my other 4K 144 Hz monitor; the OLED is mostly used for single-player games like AC: Odyssey, and in those games you can often disable static UI elements you don't need. Burn-in is very low-risk with modern OLEDs, and it makes me sad that it's enough of a reason for many people to steer away from OLED. The picture is just next-level compared to LED-backlit LCD.

Chrome burnt in on mine, so being on forums and YouTube did it. Now it's games and TV content only for my OLEDs.

Here is what it looked like at 4,900 hours.

[Attached image: Chrome burn-in at 4,900 hours]

RIG#1 CPU: AMD, R 7 5800x3D| Motherboard: X570 AORUS Master | RAM: Corsair Vengeance RGB Pro 32GB DDR4 3200 | GPU: EVGA FTW3 ULTRA  RTX 3090 ti | PSU: EVGA 1000 G+ | Case: Lian Li O11 Dynamic | Cooler: EK 360mm AIO | SSD#1: Corsair MP600 1TB | SSD#2: Crucial MX500 2.5" 2TB | Monitor: ASUS ROG Swift PG42UQ

 

RIG#2 CPU: Intel i9 11900k | Motherboard: Z590 AORUS Master | RAM: Corsair Vengeance RGB Pro 32GB DDR4 3600 | GPU: EVGA FTW3 ULTRA RTX 3090 ti | PSU: EVGA 1300 G+ | Case: Lian Li O11 Dynamic EVO | Cooler: Noctua NH-D15 | SSD#1: Corsair MP600 1TB | SSD#2: Crucial MX300 2.5" 1TB | Monitor: LG 55" 4k C1 OLED TV

 

RIG#3 CPU: Intel i9 10900kf | Motherboard: Z490 AORUS Master | RAM: Corsair Vengeance RGB Pro 32GB DDR4 4000 | GPU: MSI Gaming X Trio 3090 | PSU: EVGA 1000 G+ | Case: Lian Li O11 Dynamic | Cooler: EK 360mm AIO | SSD#1: Crucial P1 1TB | SSD#2: Crucial MX500 2.5" 1TB | Monitor: LG 55" 4k B9 OLED TV

 

RIG#4 CPU: Intel i9 13900k | Motherboard: AORUS Z790 Master | RAM: Corsair Dominator RGB 32GB DDR5 6200 | GPU: Zotac Amp Extreme 4090  | PSU: EVGA 1000 G+ | Case: Streacom BC1.1S | Cooler: EK 360mm AIO | SSD: Corsair MP600 1TB  | SSD#2: Crucial MX500 2.5" 1TB | Monitor: LG 55" 4k B9 OLED TV


1 hour ago, jones177 said:

Chrome burnt in on mine, so being on forums and YouTube did it. Now it's games and TV content only for my OLEDs.

Here is what it looked like at 4,900 hours.

[Attached image: Chrome burn-in at 4,900 hours]

Yeah, but that's not really noticeable in real content, is it? You can barely see it even when you're specifically looking for it.

(Tbh I don't even see it in your picture...)



1 hour ago, Stahlmann said:

Yeah, but that's not really noticeable in real content, is it? You can barely see it even when you're specifically looking for it.

(Tbh I don't even see it in your picture...)

It barely shows up when viewed on a monitor or on the NanoCell TV I'm using now, but it's easy to spot when viewed on an OLED.

It's only noticeable in games with lots of blue sky, like Horizon Zero Dawn. In movies it usually blends in with the sky.

Using the OLED as a normal monitor was a test that failed, but it hasn't put me off using them for gaming.



On 2/26/2021 at 5:43 PM, Stahlmann said:

I know, but I'm just interested in the theoretical part of it: how bright would a screen actually have to be? I know it's not a realistic expectation to have a screen that IS that bright.

Well, the perception of solar brightness depends as much on the raw luminance of the monitor as on the distance you sit from it. A more scientific way to think about this is to look at the specialised medical lamps for people with SAD (Seasonal Affective Disorder), which simulate the effect of natural sunlight for people who don't have easy access to it. SAD lamps are rated in lux, i.e. lumens per square metre arriving at a surface, while displays are rated in nits (luminance); as a rough rule of thumb, a diffuse 10-nit source works out to about 30 lux. For a SAD lamp to be effective you're looking at around 10,000 lux, so you can work out the rest of the math.

 

https://sciencing.com/convert-footcandles-lumens-5946754.html

Alternatively, if you want to experience the effect of 1,000+ nits for yourself, the easiest source is your phone. A phone like the Samsung Galaxy S21 can do up to 1,500 nits; you can go to a store and crank the brightness to the max.
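The nits-to-lux rule of thumb above follows from treating the source as a perfectly diffuse (Lambertian) emitter, where illuminance at the surface is π times the luminance. This is a simplifying assumption, not an exact model of a real lamp:

```python
import math

def luminance_to_illuminance(nits):
    """Illuminance (lux) produced at a perfectly diffuse (Lambertian)
    surface whose luminance is `nits` cd/m^2: E = pi * L."""
    return math.pi * nits

def illuminance_to_luminance(lux):
    """Inverse: luminance of a Lambertian surface receiving `lux`."""
    return lux / math.pi

# The rule of thumb from the post: 10 nits ~ 30 lux
print(f"10 nits -> {luminance_to_illuminance(10):.1f} lux")

# A 10,000-lux SAD lamp is roughly equivalent, as a diffuse surface, to:
print(f"10,000 lux -> {illuminance_to_luminance(10_000):.0f} cd/m^2")
```

By this rough model, a therapy-grade 10,000 lux corresponds to a diffuse surface of only about 3,000 cd/m², which is already within reach of the brightest current HDR panels.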

