
Is HDR even worth it at 300 nits?

lka
Solved by Rauten,
26 minutes ago, Mark Kaine said:

Actually, I have a "300 nits" HDR monitor (MSI) and I like the "HDR", with one big downside: it's way too bright, especially in a dark room. So I suppose the more "nits", the worse it gets?

You don't need HDR for that, though - my main monitor is 350 nits and, if I open a white page, it gets blisteringly bright.

All hail dark mode everything!

 

HDR, the way I see it, is more about "brightness contrast", rather than "let's make the user's eyes explode and then sell them new eyeballs as DLC".

You can see this in LTT's own videos: when they review HDR displays, they pay a lot more attention to the contrast between the dark areas and the bright areas of images/scenes.

And at 300 nits, yeah, don't even bloody bother. 400 nits is the bare minimum to even try, and even then, most 400-nit displays are considered "HDR-ren't".

I have a Gigabyte G24F-EK monitor, which has 300 nits, and I can turn on HDR in Windows and games, but I've heard that to get good HDR you need something like 1000 nits. So I'm wondering: is it worth turning it on? Will the image be any better, or worse?


From what I've heard, you really need a bare minimum of 600 nits to have an HDR experience. Also, without hundreds of local dimming zones, it just isn't worth it.
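These nit thresholds roughly line up with VESA's DisplayHDR certification tiers. As a rough sketch (tier names and minimum peak-brightness numbers simplified from the public spec - the real certification also checks black level, gamut, dimming, etc.):

```python
# Simplified VESA DisplayHDR tiers: (minimum peak nits, tier name).
# The actual spec has more requirements than peak brightness alone.
DISPLAYHDR_TIERS = [
    (1000, "DisplayHDR 1000"),
    (600, "DisplayHDR 600"),
    (500, "DisplayHDR 500"),
    (400, "DisplayHDR 400"),
]

def hdr_tier(peak_nits: float) -> str:
    """Return the highest DisplayHDR tier a panel's peak brightness could meet."""
    for min_nits, name in DISPLAYHDR_TIERS:
        if peak_nits >= min_nits:
            return name
    return "no DisplayHDR tier (SDR territory)"

print(hdr_tier(300))   # a 300-nit panel doesn't qualify for any tier
print(hdr_tier(600))   # the usual floor people quote for a real HDR experience
```

By this yardstick, a 300-nit monitor doesn't clear even the lowest rung, which is why "HDR" on such panels is mostly a software tone-mapping exercise.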

 

If you like the look, you can turn it on, but I keep it off on my Gigabyte M27Q - as is recommended for all HD-aRen't monitors.


The whole point of HDR is an extended dynamic range. It's literally SDR with more stops on both the dark and light ends of the spectrum.

 

As such, you need a screen that can both get very bright and display pure black, or as close to it as reasonably possible. LED-backlit panels have an always-on backlight, so without full-array local dimming (i.e. zones that can be individually switched on or off, preferably a high number of them), you can't display true black, only shades of grey. And without brightness exceeding 600 nits, you can't display a "white" that differentiates enough from grey.
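The "stops" framing above can be put into numbers: one stop is a doubling of luminance, so the dynamic range in stops is log2(peak / black level). A quick sketch (the black-level figures below are illustrative assumptions, not measurements of any specific monitor):

```python
import math

def dynamic_range_stops(peak_nits: float, black_nits: float) -> float:
    """Dynamic range in photographic stops: each stop is a doubling of luminance."""
    return math.log2(peak_nits / black_nits)

# Edge-lit LCD without local dimming: the backlight lifts black to ~0.3 nits.
print(round(dynamic_range_stops(300, 0.3), 1))      # ~10 stops

# FALD/OLED-class display: 1000 nits peak, near-zero black (~0.0005 nits assumed).
print(round(dynamic_range_stops(1000, 0.0005), 1))  # ~20.9 stops
```

This is why black level matters as much as peak brightness: raising the peak from 300 to 1000 nits alone adds under two stops, while dropping the black floor adds the other nine.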

CPU: AMD Ryzen 9 5900X · Cooler: Artic Liquid Freezer II 280 · Motherboard: MSI MEG X570 Unify · RAM: G.skill Ripjaws V 2x16GB 3600MHz CL16 (2Rx8) · Graphics Card: ASUS GeForce RTX 3060 Ti TUF Gaming · Boot Drive: 500GB WD Black SN750 M.2 NVMe SSD · Game Drive: 2TB Crucial MX500 SATA SSD · PSU: Corsair White RM850x 850W 80+ Gold · Case: Corsair 4000D Airflow · Monitor: MSI Optix MAG342CQR 34” UWQHD 3440x1440 144Hz · Keyboard: Corsair K100 RGB Optical-Mechanical Gaming Keyboard (OPX Switch) · Mouse: Corsair Ironclaw RGB Wireless Gaming Mouse


With crappy HDR, it tends to look worse - except for Doom Eternal, which looks a decent amount better to me, and I don't have to turn it on in Windows.

AMD 7950x / Asus Strix B650E / 64GB @ 6000c30 / 2TB Samsung 980 Pro Heatsink 4.0x4 / 7.68TB Samsung PM9A3 / 3.84TB Samsung PM983 / 44TB Synology 1522+ / MSI Gaming Trio 4090 / EVGA G6 1000w /Thermaltake View71 / LG C1 48in OLED

Custom water loop EK Vector AM4, D5 pump, Coolstream 420 radiator


Actually, I have a "300 nits" HDR monitor (MSI) and I like the "HDR", with one big downside: it's way too bright, especially in a dark room. So I suppose the more "nits", the worse it gets?

The direction tells you... the direction

-Scott Manley, 2021

 





17 hours ago, Mark Kaine said:

Actually, I have a "300 nits" HDR monitor (MSI) and I like the "HDR", with one big downside: it's way too bright, especially in a dark room. So I suppose the more "nits", the worse it gets?

IDK for computers - I've only tried it on a display that could handle 300 nits. The one TV I have that supports HDR is OK (600 nits). I think for the best experience you need 1000 nits and a lot of dimming zones.


