
PSA: (Sort of) Fix for HDR on the Samsung Odyssey G7

So for a while, I was struggling to get my Odyssey G7 to look good with HDR enabled. Games looked extremely desaturated and washed out. Almost a year later I finally discovered what was going on the whole time. Here's an explanation:

 

In SDR, the Odyssey G7 defaults to 50 50 50 for RGB color balance. That's the correct setting there, and setting it to 100 100 100 looks very oversaturated. In HDR, however, you have to set the colors to 100 100 100. After that, when you switch back and forth between SDR and HDR, the color balance automatically adjusts to 50 50 50 or 100 100 100 depending on which mode you're in.

 

Some games now look mind-blowing (such as Horizon Zero Dawn), while others with mediocre implementations actually look acceptable instead of a gray mess.

 

Not sure if this is fixed on newer monitors or firmware, but I thought I'd leave this here in case anyone else was struggling like me.


Sort of? HDR typically has brighter and more vivid colors because of the higher dynamic range, so you're basically compensating for the subpar HDR implementation by oversaturating. That does give more of an HDR-like pop, but it's still not HDR. The G7 simply doesn't get bright enough for HDR, and while the VA panel helps with black levels thanks to its higher native contrast, it's still not true black because of the always-on LED backlight.

 

HDR requires 600+ nits and either full-array local dimming (FALD) or a display technology where each pixel can be switched off individually, like OLED or MicroLED. Anything less isn't going to cut it, and things will look desaturated because the monitor is only really capable of displaying the middle part of the HDR range.

 

Unfortunately, manufacturers love marketing their monitors as "HDR Ready" or "HDR10", but all that really means is that they support an HDMI/DP standard that can carry an HDR signal, not that they're actually capable of displaying HDR content.

 



1 hour ago, Chris Pratt said:

The G7 simply doesn't get bright enough for HDR [...] things will look desaturated because the monitor is only really capable of displaying the middle part of the HDR range.

 

In Assassin's Creed games, there's a slider to set your monitor's brightness in nits. Using the 50 50 50 color balance, setting it to 600 nits (the G7's rated brightness) made bright points like the sun look inverted, like black holes, so I had to use something like 1000 nits, which of course isn't my monitor's actual brightness. But after changing the colors to 100 100 100, the 600 nits setting matches perfectly, so is this really oversaturating?

 

AC Odyssey in particular still looks pretty grayed out, unfortunately, but the brightness settings only match up when using 100 100 100.


On 9/5/2021 at 5:39 AM, rmgdnz said:

Using the 50 50 50 color balance, 600 nits (the brightness of the G7) made bright points like the sun look inverted [...] after changing the colors to 100 100 100, the 600 nits setting matches perfectly, so is this really over-saturating?

That's because the HDR tone-mapping implementation in the G7 expects a 1000-nit signal, and changing your game to send a lower one will mess that up.

 

But even if you fight the washed-out colors with manual oversaturation, you still have another big problem that makes HDR look bad: raising the monitor's brightness also raises its black level.
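A rough sketch of why (assuming a native contrast of about 2500:1 for the G7's VA panel, a spec-sheet figure rather than something measured in this thread): with an always-on backlight and no local dimming, the black level is simply the white level divided by the native contrast, so pushing the panel brighter for HDR lifts the blacks along with it.

```python
# Black level on a panel with a global, always-on backlight:
# black = white / native_contrast (no local dimming to help).
NATIVE_CONTRAST = 2500  # assumed rated contrast for the G7's VA panel

def black_level(white_nits: float, contrast: float = NATIVE_CONTRAST) -> float:
    """Approximate black level in nits for a given white level."""
    return white_nits / contrast

# Typical SDR brightness vs. the panel pushed for HDR highlights.
for white in (150, 350, 600):
    print(f"{white:4d} nits white -> {black_level(white):.2f} nits black")
```

At 600 nits the blacks sit around 0.24 nits, far from the true black an OLED or a well-dimmed FALD zone can reach.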

 

Tbh, the best option for HDR on the G7 is to not use it at all, and the same goes for most HDR400, HDR600, and even some HDR1000 monitors. That's the sad reality.

 

You really need a PROPER HDR implementation to make HDR look better than its SDR counterpart. Without a proper wide gamut (>90% DCI-P3) and plenty of local dimming zones (at least 384 for a 27" monitor), you won't end up anywhere close to how HDR should look, even if your monitor's peak brightness is sufficient. HDR is not just brightness, after all.

 

If you want real HDR, you'll either have to wait several more years until it (maybe) becomes more affordable, or spend upwards of $1500 on a real HDR monitor. That said, real HDR monitors have been around for a few years now, and if anything they've become more expensive.



3 hours ago, Stahlmann said:

HDR is not just brightness after all.

If anything, black level is actually more important, because the HDR standards apply a PQ curve that shifts more of the signal data into the dark areas. If you can't properly display true black, you're losing a ton of detail.
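A quick sketch of how lopsided that allocation is, using the SMPTE ST 2084 PQ EOTF (the constants below are the standard ones; the script itself is just my illustration, not something from this thread):

```python
# SMPTE ST 2084 (PQ) EOTF: maps a normalized signal value in [0, 1]
# to absolute luminance in cd/m^2 (nits), peaking at 10,000 nits.
M1 = 2610 / 16384        # 0.1593017578125
M2 = 2523 / 4096 * 128   # 78.84375
C1 = 3424 / 4096         # 0.8359375
C2 = 2413 / 4096 * 32    # 18.8515625
C3 = 2392 / 4096 * 32    # 18.6875

def pq_eotf(signal: float) -> float:
    """Luminance in nits for a normalized PQ signal value."""
    p = signal ** (1 / M2)
    return 10000 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)

# The bottom half of the signal range only reaches ~90 nits (SDR territory);
# the top half has to cover everything from there up to 10,000 nits.
for frac in (0.25, 0.50, 0.75, 1.00):
    print(f"{frac:.0%} of PQ range -> {pq_eotf(frac):8.1f} nits")
```

So a display that can't resolve the dark end cleanly throws away a disproportionate share of the encoded detail.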



Thanks for the info! Yeah, tbh, certain games occasionally look better to my eyes with HDR and the oversaturation, but leaving HDR off gives a consistently good experience in all games. I'll modify the title of this post to specify it's not a REAL fix.


  • 2 months later...

Just wanna say, the reason SDR content looks bad in HDR mode is that there's no standard way to transform SDR into HDR. Your perception is also going to be skewed, because the SDR standard specifies a maximum luminance of 100 cd/m², while most modern displays inflate the luminance range well beyond that. HDR, by contrast, standardizes the luminance assigned to RGB values. Windows also does its own SDR luminance inflation through the "SDR content brightness" slider when in HDR mode. If you set that slider to 0, you'll see SDR content at the standard level, a luminance range from 0 to 100 cd/m².
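To make that concrete, here's a sketch (my own illustration, not from this thread) of where an SDR white level lands on the PQ signal scale once it's composited into an HDR stream. The 100 cd/m² figure is the SDR reference mentioned above; 300 cd/m² is an arbitrary example of an inflated white point:

```python
# Inverse of the SMPTE ST 2084 (PQ) EOTF: absolute luminance in nits
# -> normalized HDR signal value in [0, 1].
M1 = 2610 / 16384        # 0.1593017578125
M2 = 2523 / 4096 * 128   # 78.84375
C1 = 3424 / 4096         # 0.8359375
C2 = 2413 / 4096 * 32    # 18.8515625
C3 = 2392 / 4096 * 32    # 18.6875

def pq_inverse_eotf(nits: float) -> float:
    """Normalized PQ signal value for a luminance in nits (0..10,000)."""
    y = (nits / 10000) ** M1
    return ((C1 + C2 * y) / (1 + C3 * y)) ** M2

# Reference SDR white vs. an inflated white point when SDR is shown in HDR mode.
for sdr_white in (100, 300):
    print(f"SDR white at {sdr_white} nits -> PQ signal {pq_inverse_eotf(sdr_white):.2f}")
```

Reference SDR white sits at roughly half of the PQ signal range, which is why SDR content composited at the standard level can look dim next to native HDR highlights.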


  • 6 months later...

It's the Samsung and the GPU/WDDM driver not communicating well on this one.

 

You can confirm this by enabling Windows HDR and the display's HDR mode, then simply lowering the display brightness in the GPU driver settings.

 

For some reason, the black levels over DP only behave correctly on this monitor when adjusted from the GPU driver settings.

 

Neither Windows nor the display's own settings (HDR or SDR) bring the luminance down for the black levels when adjusted from there.

 

Works perfectly for me on an AMD GPU with just -12 on the brightness setting alone.

 

Also, red 43, green 46, and blue 50 for my monitor's color settings. (Otherwise red in Elden Ring is so oversaturated that it loses detail in embers and flames.)


  • 1 year later...
On 9/4/2021 at 9:03 PM, rmgdnz said:

On SDR, the Odyssey G7 defaults to 50 50 50 for RGB color balance. [...] However, on HDR, you have to set the colors to 100 100 100.

You just CHANGED my HDR WORLD. I haven't used HDR on my monitor for over two years now because it looked terrible. THANK YOU FOR THIS POST

