Are top end HDR 1400 level monitors TOO bright (Hurts?)

jmc111

I've seen reviews that show the reviewer wearing sunglasses.

I assumed that the sunglasses were a joke...but now I'm wondering, were they?

 

Example monitor specs - ASUS ProArt Display PA32UCG-K - 32-inch, 4K, HDR 1400, 1600 nits.

This monitor was tested here, so I hope someone can say for sure whether the HDR is truly uncomfortable.

 

I want to buy top-end HDR, but not if it will cause great discomfort to my eyes in a game (going from a dark area into sunlight).

Needing sunglasses goes beyond a question of "whose eyes are looking at the screen".

 

Thanks,

jmc111

 


IMO yes: if the entire panel sustains a really high brightness, like >1000 nits, it's like getting flashbanged IRL.

 

I think it works better if only one section of the screen gets really bright for emphasis, like when the sun is displayed in a corner.

Workstation:  14700nonk || Asus Z790 ProArt Creator || MSI Gaming Trio 4090 Shunt || Crucial Pro Overclocking 32GB @ 5600 || Corsair AX1600i@240V || whole-house loop.

LANRig/GuestGamingBox: 9900nonK || Gigabyte Z390 Master || ASUS TUF 3090 650W shunt || Corsair SF600 || CPU+GPU watercooled 280 rad pull only || whole-house loop.

Server Router (Untangle): 13600k @ Stock || ASRock Z690 ITX || All 10Gbe || 2x8GB 3200 || PicoPSU 150W 24pin + AX1200i on CPU|| whole-house loop

Server Compute/Storage: 10850K @ 5.1Ghz || Gigabyte Z490 Ultra || EVGA FTW3 3090 1000W || LSI 9280i-24 port || 4TB Samsung 860 Evo, 5x10TB Seagate Enterprise Raid 6, 4x8TB Seagate Archive Backup ||  whole-house loop.

Laptop: HP Elitebook 840 G8 (Intel 1185G7) + 3080Ti Thunderbolt Dock, Razer Blade Stealth 13" 2017 (Intel 8550U)


OK, I think I'll step it down a bit, try for HDR1000, and stay off the bleeding edge.

Don't want to take a chance and have regrets at that level.

 

Looks like I'm in no rush, as I don't know when I'll be buying a video card. Not this year, maybe next.

And they're talking about DisplayPort 2.0 now, so probably a next-gen video card.

I'm still running my Vega 56 and 28-inch 4K Samsung from 2014. Maybe it will die and I'll have to get what I can get. 🙂

 

Thanks,

jmc111


On 8/27/2021 at 12:44 AM, jmc111 said:

I've seen reviews that show the reviewer wearing sunglasses.

I assumed that the sunglasses were a joke...but now I'm wondering, were they?

 

Example monitor specs - ASUS ProArt Display PA32UCG-K - 32-inch, 4K, HDR 1400, 1600 nits.

This was tested here so hope someone here can say for sure about the HDR being truly uncomfortable.

 

I want to buy top end HDR but if that will cause great discomfort to my eyes in a game (dark area to sunlight) then no.

Needing sunglasses goes beyond "whose eyes are looking at the screen".

 

Thanks,

jmc111

 

When they advertise those nit specifications, it's only realistically gonna be that bright on screen for a few seconds at best. There's a reason why it's called 'peak brightness'. Sustained brightness is a lot different, and for most scenes it should be alright, but if you primarily game in a dark environment then maybe you should go for OLEDs or something.


You have to remember that most HDR content will not constantly blast your eyes with the monitor's max brightness. Only objects that are meant to be this bright (so-called highlights) will reach it, for example when you look into the sun or at bright neon lights in a city. A higher peak brightness will not increase the overall scene brightness.

 

HDR content doesn't request brightness as a percentage; it requests an absolute nit value. So content that is mastered for 1000 nits peak brightness will display at the same brightness whether your monitor is HDR1000 or HDR1400. (Assuming no tone mapping is enabled, but I don't want to get too technical.)
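A quick sketch of that behavior in Python (my own illustration, not code from any HDR spec; `displayed_nits` is a made-up helper):

```python
def displayed_nits(content_nits, panel_peak_nits):
    """Brightness a pixel is actually shown at, assuming no tone mapping.

    HDR content requests an absolute nit value; any panel that can reach
    it shows it identically. Out-of-range highlights simply clip at the
    panel's peak.
    """
    return min(content_nits, panel_peak_nits)

# Content mastered for 1000 nits looks the same on HDR1000 and HDR1400:
print(displayed_nits(1000, 1000))  # 1000
print(displayed_nits(1000, 1400))  # 1000
# Only a highlight above the panel's peak behaves differently (it clips):
print(displayed_nits(1600, 1000))  # 1000 (clipped)
print(displayed_nits(1600, 1400))  # 1400 (clipped)
```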

 

SDR is different in that regard. It always calls for a certain percentage when it comes to brightness, with 100% always being as bright as your monitor can be at its current settings.

 

Many objects and reflections out in the real world are easily several thousand nits, so even HDR monitors are still extremely far from matching real-world dynamic range. (Some HDR content is even mastered for 10,000 nits peak brightness these days, which gives you an idea of how far from the real world even HDR1400 monitors are.)

 

People who say you need sunglasses to sit in front of a 1400-nit (peak) display just don't have any context for how bright that actually is. And brightness doesn't scale linearly with how our eyes perceive it: 1000 nits does not look twice as bright as 500 nits. Sadly, many people don't know how it works and then talk about it being too bright.
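To put a rough number on that non-linearity (a sketch using the cube-root relationship between luminance and perceived lightness from the CIE L* formula; real perception also depends on adaptation and viewing environment, which this ignores):

```python
def perceived_ratio(nits_a, nits_b):
    """Approximate how many times brighter nits_a *looks* than nits_b,
    using the cube-root lightness approximation."""
    return (nits_a / nits_b) ** (1 / 3)

# 1000 nits vs 500 nits: twice the light, but only ~26% brighter to the eye.
print(round(perceived_ratio(1000, 500), 2))  # 1.26
```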

 

In the end, HDR is a technology that can make games and movies look much closer to reality. If you don't want that, you can happily use any SDR monitor, turn down the brightness, and save a lot of money. But if a game supports it and you have a good HDR monitor, the experience and realism are just on a whole other level. I can confirm that HDR adds a bit more eye strain at the start, but you can and will get used to it.

If someone did not use reason to reach their conclusion in the first place, you cannot use reason to convince them otherwise.


OK, wow, more to chew on!

 

It seems like HDMI 2.1 monitors are SLOWLY coming out now. DisplayPort 2.0 has gotta be next gen at best.

So I'll keep looking and maybe find the monitor I can't say no to by "next gen videocard" release time.

 

Thanks to all!

jmc111

 


8 hours ago, jmc111 said:

OK, wow, more to chew on!

 

It seems like the HDMI 2.1s are SLOWLY coming out now. DisplayPort 2.0 has gotta be next gen at best.

So I'll keep looking and maybe find the monitor I can't say no to by "next gen videocard" release time.

 

Thanks to all!

jmc111

 

DP 2.0 is still a few years away. Monitors that actually need that much bandwidth have to exist before the port will be adopted, and the first monitors that will NEED DP 2.0 will be 4K 240Hz ones. Everything else will run just fine on DP 1.4 with DSC. Then again, 4K 240Hz doesn't make a lot of sense, because no game will run at that resolution and refresh rate either way.
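A back-of-the-envelope link budget illustrates the point (my own numbers for illustration; blanking overhead is ignored, so real requirements are somewhat higher):

```python
def uncompressed_gbps(width, height, hz, bits_per_channel=10, channels=3):
    """Raw pixel data rate in Gb/s, ignoring blanking overhead."""
    return width * height * hz * bits_per_channel * channels / 1e9

DP14_PAYLOAD_GBPS = 25.92   # DP 1.4 HBR3 x4 lanes after 8b/10b coding
DP20_PAYLOAD_GBPS = 77.37   # DP 2.0 UHBR20 x4 lanes after 128b/132b coding

# 4K 240Hz at 10-bit RGB needs far more than DP 1.4 carries uncompressed:
print(f"{uncompressed_gbps(3840, 2160, 240):.1f} Gb/s")  # 59.7
# 4K 144Hz is also over the DP 1.4 ceiling raw, but fits once DSC compresses it:
print(f"{uncompressed_gbps(3840, 2160, 144):.1f} Gb/s")  # 35.8
```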

 

If you're on the lookout for a really good HDR monitor, the best ones currently available are the Asus PG35VQ and the Acer Predator X35. They're a few years old by now but still sport top-of-the-line specs and performance even by today's standards, combined with true HDR: 35" 3440x1440 200Hz HDR1000 monitors with 512 local dimming zones, etc. If you want something now that will carry you over the next few years, these monitors are worth a look. But they're expensive at $2500. I got mine as a deal for $1700, though.

 

I have the PG35VQ myself and its HDR is extremely impressive. And that's coming from someone who also has an OLED TV.




The preferred brightness, depending on how bright the room is, is 100-250 nits. The max full-screen sustained brightness even on an HDR panel is about 400-600 nits, but it can be tuned down; tuned HDR works well and games look nice. DP 2.0 isn't really needed, as 1.4 with DSC works well with RTX 3000, but you do wanna make sure it has HDMI 2.1.

 

The UCG does have all that already, but it's just $2000 more for an HDMI 2.1 port, and I've had multiple people tell me that they don't see a difference between the PG27UQ and PG32UQX (basically a PA32UCG-K). If money is no issue then yes, you can get the UCG now, but know that the monitor isn't a true 120Hz (there's slight ghosting in fast scenes). Still, it's as good as it gets; the panel is clearly pushed to the limit, and it's a lot of colors and pixels at 120Hz. I can only recommend it as your first HDR experience, not if you already have a similar monitor.

 

You can also look at a PG35VQ: no port issues, realistically a 144-180Hz ultrawide HDR monitor that would give you 95% of the experience at $1800, just a bit fewer colors.

5950x 1.33v 5.05 4.5 88C 195w ll R20 12k ll drp4 ll x570 dark hero ll gskill 4x8gb 3666 14-14-14-32-320-24-2T (zen trfc)  1.45v 45C 1.15v soc ll 6950xt gaming x trio 325w 60C ll samsung 970 500gb nvme os ll sandisk 4tb ssd ll 6x nf12/14 ippc fans ll tt gt10 case ll evga g2 1300w ll w10 pro ll 34GN850B ll AW3423DW

 

9900k 1.36v 5.1avx 4.9ring 85C 195w (daily) 1.02v 4.3ghz 80w 50C R20 temps score=5500 ll D15 ll Z390 taichi ult 1.60 bios ll gskill 4x8gb 14-14-14-30-280-20 ddr3666bdie 1.45v 45C 1.22sa/1.18 io  ll EVGA 30 non90 tie ftw3 1920//10000 0.85v 300w 71C ll  6x nf14 ippc 2000rpm ll 500gb nvme 970 evo ll l sandisk 4tb sata ssd +4tb exssd backup ll 2x 500gb samsung 970 evo raid 0 llCorsair graphite 780T ll EVGA P2 1200w ll w10p ll NEC PA241w ll pa32ucg-k

 

prebuilt 5800 stock ll 2x8gb ddr4 cl17 3466 ll oem 3080 0.85v 1890//10000 290w 74C ll 27gl850b ll pa272w ll w11

 


  • 1 month later...

@Stahlmann

@xg32

 

Thanks a lot for the info.

 

I've never seen HDR; I'm still on my $600 4K 60Hz Samsung from 2016,

which I bought so that I would NOT see "monitor" pixels anymore.

And I love the smooth screen!

 

EDIT... I am concerned that when I go from my 28" 4K monitor to a 32" 4K one, I might see pixels.

(20-inch viewing distance.) Sure hope not.

 

I've settled on HDMI 2.1 as a baseline acceptable level.

I'm backing off the HDR1400 spec and looking at the HDR1000 level now.

 

The UCG is crazy money for a sight-unseen purchase.

 

The much cheaper $3000 Asus monitor under that is not HDMI 2.1, which I think is just nuts.

I would probably buy it if it were HDMI 2.1.

 

Heh, it cannot cost $2000 to put HDMI 2.1 on a monitor!

 

So maybe by late next year there will be an acceptable monitor out there.

 

jmc111


