
What does HDR performance look like on a TV or monitor with 1000 nits but no full-array local dimming?

e22big

Just kind of wondering: judging purely from the data, FALD doesn't seem to help overall contrast that much (on a VA TV it seems to raise contrast from roughly 3000:1 to 4000:1, from what I remember). In that case, is nit brightness more important when it comes to HDR? For example, if a monitor can do 1000 nits or more but only has edge-lit local dimming, or none at all, would it perform better than a display with 700 nits of brightness but full-array local dimming? Personally, I have never seen HDR on a 1000-nit display, but I've had it on a roughly 700-nit VA TV with FALD and I've never noticed any particular image quality improvement.

 

Not counting OLED of course, since it can go as black as needed.


Actually, black level is more important to HDR than brightness, though both are obviously important. Just take OLED, for example: typical OLEDs only hit about 400 nits, but they're the gold standard for HDR because of their deep, inky blacks.

 

Speaking specifically of LED TVs or IPS monitors, having more brightness without a good FALD solution actually works against you. To reproduce black on these displays, you literally have to turn the backlight off. Otherwise, you just get grey. The brighter it is, the farther that grey is going to be from true black.
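
To put rough numbers on that, here's a minimal sketch of the idea, assuming the simple model black level = backlight brightness ÷ native contrast; the panel contrast figures are typical ballpark values picked for illustration, not measurements of any specific display:

```python
# Rough illustration: without local dimming, the black level rises in
# proportion to the backlight. Contrast figures are ballpark assumptions.

def black_level(backlight_nits, native_contrast):
    """Darkest level an LCD can show while its backlight sits at backlight_nits."""
    return backlight_nits / native_contrast

# 1000-nit IPS monitor, ~1000:1 native contrast, no local dimming
print(black_level(1000, 1000))        # 1.0 nit -> visibly grey in a dark room
# 1000-nit VA panel, ~3000:1 native contrast, no local dimming
print(black_level(1000, 3000))        # ~0.33 nit
# 700-nit VA TV with FALD, zone backlight dimmed to 10% in a dark area
print(black_level(700 * 0.10, 3000))  # ~0.02 nit -> much closer to true black
```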



1 minute ago, Chris Pratt said:

Actually, black level is more important to HDR than brightness, though both are obviously important. Just take OLED, for example: typical OLEDs only hit about 400 nits, but they're the gold standard for HDR because of their deep, inky blacks.

 

Speaking specifically of LED TVs or IPS monitors, having more brightness without a good FALD solution actually works against you. To reproduce black on these displays, you literally have to turn the backlight off. Otherwise, you just get grey. The brighter it is, the farther that grey is going to be from true black.

Hmm, I don't know. HDR works by sending a signal that says which areas need to be brightened and which dimmed in order to make the bright spots stand out. OLED can display that information accurately, of course, but I don't think its deep blacks have anything to do with reproducing the average HDR signal (the deep, inky black areas will be there anyway in both SDR and HDR, and both will look better in the dark on an OLED).

 

I've tried forcing a high-contrast VA TV to show just a black screen at peak brightness and it still looks black. The only time maxing out the brightness seems to affect black level quality is when I use an IPS monitor, but black on IPS appears grey even at lower nits.


17 minutes ago, e22big said:

Hmm, I don't know. HDR works by sending a signal that says which areas need to be brightened and which dimmed in order to make the bright spots stand out. OLED can display that information accurately, of course, but I don't think its deep blacks have anything to do with reproducing the average HDR signal (the deep, inky black areas will be there anyway in both SDR and HDR, and both will look better in the dark on an OLED).

 

I've tried forcing a high-contrast VA TV to show just a black screen at peak brightness and it still looks black. The only time maxing out the brightness seems to affect black level quality is when I use an IPS monitor, but black on IPS appears grey even at lower nits.

No, HDR works by sending a higher-bit-depth signal with an extended range. It's independent of display technology. The job is then on the display to represent that signal as best it can. That's easier to do with true blacks than with higher brightness, because the signal literally has a curve applied to it that puts more of its information at lower brightness levels.

 

This video does a good job of explaining (cued up to where it talks about the curve that's applied, and why that matters):
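
For a rough feel of what that curve does with the numbers, here's a small sketch of the PQ (SMPTE ST 2084) transfer function that HDR10 signals use. The constants come from the standard; the example values are just to show how much of the signal range is spent on low luminance:

```python
# PQ (SMPTE ST 2084) EOTF: maps a normalised 0..1 signal value to absolute
# luminance in nits. Constants are taken from the ST 2084 specification.
m1, m2 = 2610 / 16384, 2523 / 4096 * 128
c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_eotf(signal):
    """Luminance in cd/m^2 (nits) for a PQ-encoded signal value in [0, 1]."""
    p = signal ** (1 / m2)
    return 10000 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)

# The bottom half of the entire code range only covers 0 to ~92 nits...
print(round(pq_eotf(0.50), 1))  # ~92.2 nits
# ...while the last quarter is reserved for ~980 nits up to 10000 nits.
print(round(pq_eotf(0.75)))     # ~983 nits
print(round(pq_eotf(1.00)))     # 10000 nits
```

In other words, half of the available precision is spent on the darkest ~92 nits, which is why a display that can actually render those low levels cleanly benefits more than one that can simply go brighter.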

 

 



33 minutes ago, e22big said:

Just kind of wondering: judging purely from the data, FALD doesn't seem to help overall contrast that much (on a VA TV it seems to raise contrast from roughly 3000:1 to 4000:1, from what I remember). In that case, is nit brightness more important when it comes to HDR? For example, if a monitor can do 1000 nits or more but only has edge-lit local dimming, or none at all, would it perform better than a display with 700 nits of brightness but full-array local dimming? Personally, I have never seen HDR on a 1000-nit display, but I've had it on a roughly 700-nit VA TV with FALD and I've never noticed any particular image quality improvement.

 

Not counting OLED of course, since it can go as black as needed.

Depends on the image being shown.

An OLED, for example, even with lower peak brightness, will produce a far superior image when displaying a starfield or other space imagery.

An LED display, even one with FALD, will lose some of the stars in the image that aren't bright enough to activate the backlight in that area.

If the LED display does not have FALD then the starfield will display everything, but it won't have a good black level and will result in an obvious grey in dark-room environments.
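
Here's a very simplified sketch of that trade-off for a single backlight zone. The zone behaviour and all the numbers are made up for illustration; real TVs use far more sophisticated local-dimming algorithms:

```python
# Toy model of one FALD zone showing a dim star against black.
# NATIVE_CONTRAST and the nit values are illustrative assumptions only.

NATIVE_CONTRAST = 3000  # assumed VA-style panel contrast

def zone_output(star_nits, backlight_nits):
    """Return (brightness the star actually reaches, black floor around it)."""
    star = min(star_nits, backlight_nits)           # the LCD can't exceed its backlight
    black_floor = backlight_nits / NATIVE_CONTRAST  # light leaking through 'black' pixels
    return star, round(black_floor, 3)

print(zone_output(star_nits=5, backlight_nits=0))    # (0, 0.0)   zone off: star crushed to black
print(zone_output(star_nits=5, backlight_nits=150))  # (5, 0.05)  zone on: star shown, black raised
print(zone_output(star_nits=5, backlight_nits=700))  # (5, 0.233) no dimming: obvious grey floor
```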

 

For 'normal' content, however, especially bright scenes, peak brightness will be more important, specifically the 5% and 10% window peak brightness results, as those account for highlights.

That said, most 1000-nit displays will be using FALD, since high brightness requires direct backlighting; edge-lit backlights generally can't achieve very high brightness levels, especially on large displays.

 

As for noticing image improvements with HDR, that depends on the source and on the TV's luminance tone mapping rolling off brightness instead of clipping it. If you're losing detail in very bright scenes, it means the signal is being clipped rather than tone mapped properly.
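
As a minimal sketch of the difference between clipping and a highlight roll-off: the knee point, the 4000-nit mastering assumption, and the linear roll-off shape below are arbitrary illustration choices, not how any particular TV actually tone maps:

```python
# Clipping vs. a simple highlight roll-off when mapping mastered luminance (nits)
# onto a display that peaks at 700 nits. All numbers here are arbitrary assumptions.

DISPLAY_PEAK = 700.0
KNEE = 500.0            # below this, luminance passes through unchanged
MASTERING_PEAK = 4000.0

def clip(nits):
    return min(nits, DISPLAY_PEAK)

def roll_off(nits):
    if nits <= KNEE:
        return nits
    # Compress everything between the knee and the mastering peak
    # into the display's remaining headroom.
    excess = (nits - KNEE) / (MASTERING_PEAK - KNEE)
    return KNEE + (DISPLAY_PEAK - KNEE) * min(excess, 1.0)

for nits in (400, 800, 2000, 4000):
    print(nits, clip(nits), round(roll_off(nits), 1))
# 800, 2000 and 4000 nits all clip to 700 (highlight detail lost), while the
# roll-off keeps them distinct at ~517, ~586 and 700 nits respectively.
```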



1 hour ago, SolarNova said:

Depends on the image being shown.

An OLED, for example, even with lower peak brightness, will produce a far superior image when displaying a starfield or other space imagery.

An LED display, even one with FALD, will lose some of the stars in the image that aren't bright enough to activate the backlight in that area.

If the LED display does not have FALD then the starfield will display everything, but it won't have a good black level and will result in an obvious grey in dark-room environments.

 

For 'normal' content, however, especially bright scenes, peak brightness will be more important, specifically the 5% and 10% window peak brightness results, as those account for highlights.

That said, most 1000-nit displays will be using FALD, since high brightness requires direct backlighting; edge-lit backlights generally can't achieve very high brightness levels, especially on large displays.

 

As for noticing image improvements with HDR, that depends on the source and on the TV's luminance tone mapping rolling off brightness instead of clipping it. If you're losing detail in very bright scenes, it means the signal is being clipped rather than tone mapped properly.

I think most HDR1000 displays have edge-lit dimming, though (Predator X27, Samsung G9, ROG PG43UQ, etc.); very few models outside of a full-on TV actually come with full-array local dimming. But I get your point.


23 minutes ago, e22big said:

I think most HDR1000 displays have edge-lit dimming, though (Predator X27, Samsung G9, ROG PG43UQ, etc.); very few models outside of a full-on TV actually come with full-array local dimming. But I get your point.

I think that's just a side effect of the focus for HDR lying more on movies than on games, and of HDR typically being tied to 4K, or at least higher-than-FHD resolutions. If the Steam hardware survey is representative, the majority of people are still running 1920x1080 on XX60-level cards. For the money proper HDR monitors cost, people would rather buy a 50"-class HDR TV than a 27"-class monitor for gaming. HDR monitors are just an extremely niche and expensive market currently.



8 minutes ago, tikker said:

I think that's just a side effect of the focus for HDR lying more on movies than on games, and of HDR typically being tied to 4K, or at least higher-than-FHD resolutions. If the Steam hardware survey is representative, the majority of people are still running 1920x1080 on XX60-level cards. For the money proper HDR monitors cost, people would rather buy a 50"-class HDR TV than a 27"-class monitor for gaming. HDR monitors are just an extremely niche and expensive market currently.

But of course, a monitor with proper HDR capability is insanely expensive and doesn't yield much advantage in competitive gameplay.

 

The majority of Steam survey participants wouldn't be in the market for an HDR monitor in the first place, I think, and it's kind of hard to interpret that data since there are many overlapping market groups represented within the overall concept of PC gamers. Just look at that VAG video Linus hosted yesterday: people who mainly play AAA open-world games or RPGs are very different from those who mainly play competitive FPS, yet the latter overwhelmingly dominate the market.


1 hour ago, e22big said:

But of course, a monitor with proper HDR capability is insanely expensive and doesn't yield much advantage in competitive gameplay.

Yeah. I intended to say that that's probably the reason they don't want to really invest in FALD there yet, but perhaps I misunderstood your comment in that regard.



7 hours ago, e22big said:

Just kind of wondering: judging purely from the data, FALD doesn't seem to help overall contrast that much (on a VA TV it seems to raise contrast from roughly 3000:1 to 4000:1, from what I remember). In that case, is nit brightness more important when it comes to HDR? For example, if a monitor can do 1000 nits or more but only has edge-lit local dimming, or none at all, would it perform better than a display with 700 nits of brightness but full-array local dimming? Personally, I have never seen HDR on a 1000-nit display, but I've had it on a roughly 700-nit VA TV with FALD and I've never noticed any particular image quality improvement.

 

Not counting OLED of course, since it can go as black as needed.

HDR monitors with certain certifications generally give higher peak brightness and brighter colors in general; whether it looks better still depends on the content and the display. FALD and OLED have better contrast/true blacks, so the bright scenes in HDR pop even more. Without FALD or OLED tech, IPS glow in dark scenes can obviously ruin a scene.

 

HDR for the most part isn't ready yet; you only need an OLED TV or a monitor with good color coverage for everything to look nice.

 

But here are some of the best HDR implementations I've seen in games: Sekiro, Hitman, Metro Exodus, Anthem (I know), Cyberpunk 2077, MHW, and Special K's HDR filters. I found the HDR in BFV and Division 2 to be slightly off, which is likely a personal preference.

 

I don't use the full 1000 nits; the brightness is turned down and peak is around 500, but it still looks better than HDR400/600 displays.



9 hours ago, xg32 said:

HDR monitors with certain certifications generally give higher peak brightness and brighter colors in general; whether it looks better still depends on the content and the display. FALD and OLED have better contrast/true blacks, so the bright scenes in HDR pop even more. Without FALD or OLED tech, IPS glow in dark scenes can obviously ruin a scene.

 

HDR for the most part isn't ready yet; you only need an OLED TV or a monitor with good color coverage for everything to look nice.

 

But here are some of the best HDR implementations I've seen in games: Sekiro, Hitman, Metro Exodus, Anthem (I know), Cyberpunk 2077, MHW, and Special K's HDR filters. I found the HDR in BFV and Division 2 to be slightly off, which is likely a personal preference.

 

I don't use the full 1000 nits; the brightness is turned down and peak is around 500, but it still looks better than HDR400/600 displays.

Cyberpunk has the worst HDR implementation ever imo -_-

 

The best-looking HDR I've seen is Star Wars Jedi: Fallen Order (but I'm not that into this kind of game).

