
HDR or 12 Bit Monitor?


I'm currently running the BenQ EX2780Q in HDR on Win10 (over DisplayPort), primarily for gaming and digital illustration.

Like many others, I've run into problems caused by Windows, so I began to wonder whether switching to a monitor with a higher color depth would improve my image quality the way HDR does, while also sparing me Windows' poor implementation of HDR.

 

I know that HDR displays encode the image differently than a generic monitor does, but monitors for artists and professionals with 32 bits of color depth exist for a reason.

 

How would my image quality and experience differ from what I have now?

Does a monitor with 12 Bits (no HDR) do the same as an HDR monitor?

 

Thanks in advance

 

 

 

BenQ EX2780Q specs

Size			27"
Screen Area		23.49 x 13.22" / 596.736 x 335.664 mm
Panel Type		IPS-Type LCD
Resolution		2560 x 1440
Aspect Ratio		16:9
Pixels Per Inch		109 ppi
Maximum Brightness	350 cd/m2
Contrast Ratio		1000:1
DCR			20,000,000:1
Refresh Rate		144 Hz
FreeSync
Bit Depth		10-Bit (1.07 Billion Colors)
VESA-Certified		DisplayHDR 400
Color Gamut		95% DCI-P3
Response Time		5 ms (GtG)
Viewing Angle (HxV)	178 x 178°

 

The Rig

 

GTX 1080	(shut up, I get to 144 fps in some games...)
i7-7700
32GB DDR4 RAM
MSI PcMate h270
Windows 10 Pro 64bit

 


As a creative, you want to purchase a monitor that closely matches your target medium.

 

If what you're creating won't be used outside of the standard colour space, there's no point in getting a fancy HDR monitor. 12-bit is more than enough for most purposes; you'll really only need an HDR monitor if you want to view or create HDR content.

 

Create something in an HDR colour space for print and it's going to come out looking like trash. Create something in HDR and then view it in SDR and it will also look like trash.

 

Imagine you're working on an ad banner that looks absolutely amazing on your monitor, but when the target audience sees it in sRGB, the vibrancy is gone.

 

Having said all of that, an HDR monitor may open up more options in terms of target colour space.

 

If colour accuracy is important to you, either way, you'll want to calibrate, even if your monitor claims factory calibration. 

 

Finally, HDR400 isn't really HDR at all, and the 12-bit mode is a type of dither, which is probably going to be sufficient if you don't already know better. (As in, if you actually needed HDR, you'd already know why, and you'd know that HDR400 is fairly useless for content creation.)

 

I have an "HDR400" monitor and I do switch it on for gaming, but it's more of a shortcut to maxing out the screen brightness. It improves contrast and vibrancy slightly for content consumption, and some HDR videos look slightly nicer, but it's a very minor improvement. The panel is edge-lit, so its interpretation of HDR will be full-screen brightness only.

 

 

Case - Phanteks Evolv X | PSU - EVGA 650w Gold Rated | Mobo - ASUS Strix x570-f | CPU - AMD r9 3900x | RAM - 32GB Corsair Dominator Platinum 3200mhz @ 3600mhz | GPU - EVGA nVidia 2080s 8GB  | OS Drive - Sabrent 256GB Rocket NVMe PCI Gen 4 | Game Drive - WD 1tb NVMe Gen 3  |  Storage - 7TB formatted
Cooled by a crap load of Noctua fans and Corsair H150i RGB Pro XT


57 minutes ago, BilderEsser69 said:

I know that HDR displays encode the image differently than a generic monitor does, but monitors for artists and professionals with 32 bits of color depth exist for a reason.

 

How would my image quality and experience differ from what I have now?

Does a monitor with 12 Bits (no HDR) do the same as an HDR monitor?

Are you talking about 32-bit color depth per channel or 32 bits overall? Normally when people say 32-bit color depth they mean 8 bits per channel (RGBA), so 16.7M colors, i.e. SDR. Programs like Photoshop can use 32 bits per color channel, but I've never heard of a monitor that has a 32-bit panel.
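To make the terminology concrete, here's a minimal sketch (my own illustration; the channel packing order is just one common convention, not tied to any particular API):

```python
# "32-bit color" in the SDR sense: four 8-bit channels (R, G, B, alpha)
# packed into one 32-bit word. Color precision is still only 8 bits per channel.
def pack_rgba8(r: int, g: int, b: int, a: int) -> int:
    """Pack four 8-bit channel values into a single 32-bit integer."""
    assert all(0 <= c <= 255 for c in (r, g, b, a))
    return (a << 24) | (r << 16) | (g << 8) | b

print(f"{2 ** (3 * 8):,} colors")  # 16,777,216 -> the familiar "16.7 million" (alpha isn't a color channel)
```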

 

HDR monitors typically either use 10 bits per channel, or 8-bit with FRC to emulate a 10-bit color depth. Monitors that talk about 12 bit usually refer to a 12-bit LUT (look-up table) and not an actual 12 bits of color depth per subpixel.
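A rough sketch of that distinction (my own simplification, not any vendor's actual firmware): the LUT only sets how finely the monitor can do its internal calibration math, while the panel bit depth still caps how many distinct shades it can physically output.

```python
lut_bits, panel_bits = 12, 10

lut_steps    = 2 ** lut_bits     # 4096 internal steps per channel for calibration math
panel_levels = 2 ** panel_bits   # 1024 physical levels per subpixel

# A correction curve (here: a simple gamma 2.2 curve) can be stored at full LUT precision...
calibration_lut = [round((i / (lut_steps - 1)) ** 2.2 * (lut_steps - 1))
                   for i in range(lut_steps)]

# ...but every corrected value is quantized back down to what the panel can actually show.
def panel_level(lut_value: int) -> int:
    return round(lut_value / (lut_steps - 1) * (panel_levels - 1))
```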

Remember to either quote or @mention others, so they are notified of your reply


What's the specific monitor you're looking at? A 12-bit LUT isn't the same as 12-bit color channels.

 

I'd suggest just ignoring HDR overall and viewing it as a bonus feature.

 

As for upgrades in color coverage, you're going to have to decide on a budget, a resolution (1080p, 1440p, or 4K), and a refresh rate.

 

For 1080p, without knowing your budget, I recommend the AOC 24G2.

For 1440p, I recommend the LG 27GL83A-B, or the MSI MAG274QRF-QD if you have a higher budget. Or just turn off HDR on your current monitor and see how it looks.

For 1440p 60 Hz wide gamut, there's this one on eBay with dead pixels (https://www.ebay.com/itm/294272885483?epid=175065029&hash=item448407e2eb:g:6gMAAOSwuNRg1fSY): 14-bit LUT, 10-bit panel. It probably has the best colors outside of the 2021 mini-LEDs (I've yet to see them side by side).

For the best balance of colors, refresh rate, and HDR, my pick is the PG35VQ:

https://www.bhphotovideo.com/c/product/1511615-REG/asus_pg35vq_35_rog_swift_ultra_wide.html

 

4K is a whole other can of worms, but I'd start with the 27GN950-B, the PA329C, or a used X27/PG27UQ (I've seen them for as low as 750).

                                  

5950x 1.33v 5.05 4.5 88C 195w ll R20 12k ll drp4 ll x570 dark hero ll gskill 4x8gb 3666 14-14-14-32-320-24-2T (zen trfc)  1.45v 45C 1.15v soc ll 6950xt gaming x trio 325w 60C ll samsung 970 500gb nvme os ll sandisk 4tb ssd ll 6x nf12/14 ippc fans ll tt gt10 case ll evga g2 1300w ll w10 pro ll 34GN850B ll AW3423DW

 

9900k 1.36v 5.1avx 4.9ring 85C 195w (daily) 1.02v 4.3ghz 80w 50C R20 temps score=5500 ll D15 ll Z390 taichi ult 1.60 bios ll gskill 4x8gb 14-14-14-30-280-20 ddr3666bdie 1.45v 45C 1.22sa/1.18 io  ll EVGA 30 non90 tie ftw3 1920//10000 0.85v 300w 71C ll  6x nf14 ippc 2000rpm ll 500gb nvme 970 evo ll l sandisk 4tb sata ssd +4tb exssd backup ll 2x 500gb samsung 970 evo raid 0 llCorsair graphite 780T ll EVGA P2 1200w ll w10p ll NEC PA241w ll pa32ucg-k

 

prebuilt 5800 stock ll 2x8gb ddr4 cl17 3466 ll oem 3080 0.85v 1890//10000 290w 74C ll 27gl850b ll pa272w ll w11

 


1 hour ago, cacoe said:

As a creative, you want to purchase a monitor that closely matches your target medium.

 

If what you're creating won't be used outside of the standard colour space, there's no point in getting a fancy HDR monitor. 12-bit is more than enough for most purposes; you'll really only need an HDR monitor if you want to view or create HDR content.

 

Create something in an HDR colour space for print and it's going to come out looking like trash. Create something in HDR and then view it in SDR and it will also look like trash.

 

Imagine you're working on an ad banner that looks absolutely amazing on your monitor, but when the target audience sees it in sRGB, the vibrancy is gone.

 

Having said all of that, an HDR monitor may open up more options in terms of target colour space.

 

If colour accuracy is important to you, either way, you'll want to calibrate, even if your monitor claims factory calibration. 

 

Finally, HDR400 isn't really HDR at all, and the 12-bit mode is a type of dither, which is probably going to be sufficient if you don't already know better. (As in, if you actually needed HDR, you'd already know why, and you'd know that HDR400 is fairly useless for content creation.)

 

I have an "HDR400" monitor and I do switch it on for gaming, but it's more of a shortcut to maxing out the screen brightness. It improves contrast and vibrancy slightly for content consumption, and some HDR videos look slightly nicer, but it's a very minor improvement. The panel is edge-lit, so its interpretation of HDR will be full-screen brightness only.

 

 

I wanted to use HDR primarily for gaming, since I use a pen display for my illustrations. I should've specified that, lol.

I completely agree with your point of view on HDR.

Thanks for your input; it's always nice to talk to someone who uses hardware professionally.


1 hour ago, Eigenvektor said:

Are you talking about 32-bit color depth per channel or 32 bits overall? Normally when people say 32-bit color depth they mean 8 bits per channel (RGBA), so 16.7M colors, i.e. SDR. Programs like Photoshop can use 32 bits per color channel, but I've never heard of a monitor that has a 32-bit panel.

 

HDR monitors typically either use 10 bits per channel, or 8-bit with FRC to emulate a 10-bit color depth. Monitors that talk about 12 bit usually refer to a 12-bit LUT (look-up table) and not an actual 12 bits of color depth per subpixel.

Seems like I messed something up there... I somehow remembered the Asus ProArt PA27AC can do something with 32-bit AdobeRGB or something, not sure though...

I meant 10 (or 12) bits per channel.

 

With what you said about HDR displays emulating 10-bit colors etc., my question is: instead of using an HDR display, can I get the same color range/depth with a monitor without HDR but with 10 bits per channel?

If I run a game on a 10-bit monitor without having HDR enabled in-game, do I still see the same image as I would with an HDR display? Surely something about the shaders or color profile changes if I enable HDR, right?

 

I'm trying to get the advantage of HDR without actually using HDR; does that make sense?


43 minutes ago, xg32 said:

What's the specific monitor you're looking at? A 12-bit LUT isn't the same as 12-bit color channels.

 

I'd suggest just ignoring HDR overall and viewing it as a bonus feature.

 

As for upgrades in color coverage, you're going to have to decide on a budget, a resolution (1080p, 1440p, or 4K), and a refresh rate.

 

For 1080p, without knowing your budget, I recommend the AOC 24G2.

For 1440p, I recommend the LG 27GL83A-B, or the MSI MAG274QRF-QD if you have a higher budget. Or just turn off HDR on your current monitor and see how it looks.

For 1440p 60 Hz wide gamut, there's this one on eBay with dead pixels (https://www.ebay.com/itm/294272885483?epid=175065029&hash=item448407e2eb:g:6gMAAOSwuNRg1fSY): 14-bit LUT, 10-bit panel. It probably has the best colors outside of the 2021 mini-LEDs (I've yet to see them side by side).

For the best balance of colors, refresh rate, and HDR, my pick is the PG35VQ:

https://www.bhphotovideo.com/c/product/1511615-REG/asus_pg35vq_35_rog_swift_ultra_wide.html

 

4K is a whole other can of worms, but I'd start with the 27GN950-B, the PA329C, or a used X27/PG27UQ (I've seen them for as low as 750).

                                  

I will definitely have a look at your suggestions, thank you very much!


3 minutes ago, BilderEsser69 said:

Seems like I messed something up there... I somehow remembered the Asus ProArt PA27AC can do something with 32-bit AdobeRGB or something, not sure though...

I meant 10 (or 12) bits per channel.

Probably that it covers a certain percentage of the AdobeRGB color space.

 

3 minutes ago, BilderEsser69 said:

With what you said about HDR displays emulating 10-bit colors etc., my question is: instead of using an HDR display, can I get the same color range/depth with a monitor without HDR but with 10 bits per channel?

HDR means the monitor has a high dynamic range (i.e. a large difference between the darkest and lightest colors it can display). Something like HDR1000 means it can produce at least 1000 nits of brightness. A 10-bit color depth per subpixel means the display can show 1.07 billion different colors instead of the more typical 16.7 million.
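Just to put numbers on the color depth part (plain arithmetic, nothing monitor-specific):

```python
def distinct_colors(bits_per_channel: int) -> int:
    """Total displayable colors with three (R, G, B) subpixels per pixel."""
    return (2 ** bits_per_channel) ** 3

print(f"{distinct_colors(8):,}")   # 16,777,216     -> "16.7 million"
print(f"{distinct_colors(10):,}")  # 1,073,741,824  -> "1.07 billion"
print(f"{distinct_colors(12):,}")  # 68,719,476,736 -> "68.7 billion"
```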

 

HDR monitors typically have a 10-bit color depth, but the two technologies are not the same and can't emulate one another. Also, some monitors that claim a 10-bit color depth actually have an 8-bit panel and use a technique called frame rate control (FRC) to emulate 10-bit color depth by quickly switching between adjacent colors from frame to frame.
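The FRC idea as a minimal sketch (heavily simplified on my part; real FRC uses spatio-temporal dither patterns rather than this naive alternation):

```python
def frc_frames(target_10bit: int, frames: int = 4) -> list[int]:
    """Approximate one 10-bit level on an 8-bit panel by alternating between the
    two nearest 8-bit levels across several frames; the eye averages them out."""
    low = target_10bit // 4   # nearest 8-bit level below (10-bit has 4x the steps)
    high = min(low + 1, 255)
    frac = target_10bit % 4   # how many of the 4 frames show the higher level
    return [high if i < frac else low for i in range(frames)]

print(frc_frames(514))  # 10-bit level 514 ≈ 128.5 in 8-bit terms -> [129, 129, 128, 128]
```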

 

3 minutes ago, BilderEsser69 said:

If I run a game on a 10-bit monitor without having HDR enabled in-game, do I still see the same image as I would with an HDR display? Surely something about the shaders or color profile changes if I enable HDR, right?

 

I'm trying to get the advantage of HDR without actually using HDR; does that make sense?

Not really, no. As I said above, HDR means you get more dynamic range: the difference in brightness between the darkest and brightest pixels is higher than normal. This is not the same as (only) having more color depth.
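Very rough numbers to separate the two ideas (a linear approximation on my part; real HDR signals use the nonlinear PQ curve, so actual step sizes differ):

```python
# Color depth = how many steps the brightness range is sliced into.
# Dynamic range = how wide that brightness range is in the first place.
def step_size_nits(peak_nits: float, bits: int) -> float:
    """Average brightness per code value, assuming a naive linear encoding."""
    return peak_nits / (2 ** bits - 1)

print(step_size_nits(350, 8))    # ~1.37 nits/step: typical SDR monitor, 8-bit
print(step_size_nits(350, 10))   # ~0.34 nits/step: same range, just finer steps (more color depth)
print(step_size_nits(1000, 10))  # ~0.98 nits/step: HDR widens the range itself (more dynamic range)
```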

Remember to either quote or @mention others, so they are notified of your reply


50 minutes ago, BilderEsser69 said:

Seems like I messed something up there... I somehow remembered the Asus ProArt PA27AC can do something with 32-bit AdobeRGB or something, not sure though...

I meant 10 (or 12) bits per channel.

 

With what you said about HDR displays emulating 10-bit colors etc., my question is: instead of using an HDR display, can I get the same color range/depth with a monitor without HDR but with 10 bits per channel?

If I run a game on a 10-bit monitor without having HDR enabled in-game, do I still see the same image as I would with an HDR display? Surely something about the shaders or color profile changes if I enable HDR, right?

 

I'm trying to get the advantage of HDR without actually using HDR; does that make sense?

https://www.displayspecifications.com/en/model/6bcd1009

It's an outdated 8-bit panel with a 14-bit LUT, and the AdobeRGB coverage is only 72% (100% sRGB); most gaming monitors nowadays, once calibrated, will be better.

 

What you're looking for is a wide color gamut and accuracy: a 10-bit panel (or 8-bit + FRC) with good color coverage (100%+ sRGB, 96%+ AdobeRGB and 90%+ DCI-P3, or with newer monitors 75%+ Rec. 2020), depending on which color space you work with; it's usually AdobeRGB or P3.

 

A panel can be good without HDR (OLED, panels with lower brightness, or HDR simply not included as a feature on older professional monitors), and there are plenty of bad monitors with HDR. @Eigenvektor did a better job than I would at explaining what HDR is.

 

If we eliminate all the HDR400 monitors, then the cheapest 1440p gaming monitor with HDR600+ and good color coverage is the Dell AW2721D: https://www.ebay.com/itm/224466260919?epid=7044754154&hash=item34433b53b7:g:d~AAAOSwqFlg8baN. 825 USD isn't bad for those specs.

 

If you don't use/need the HDR, then the MSI and LG monitors I mentioned before look almost as good in terms of colors.

 

I have an X27 (HDR1000) monitor and I leave HDR on with lower brightness (set in the Nvidia control panel); the actual peak brightness at my settings is only 500-600 nits. It's still a lot brighter and looks a lot better than my 34GN850 (HDR400) if both have HDR on. The X27 looks very scuffed when HDR is off, while the 34GN850 looks better with HDR off. The X27 (HDR on) only looks slightly better than the 34GN850 (HDR off) in terms of image quality.

 

So if the HDR is bad, leave it off; if you really do want HDR, make sure it's good. The best HDR experience for gaming/Netflix is still LG OLED TVs, but they are 48 inches and best used in a dark room, not something I'd do for long-term use, at least as a single-monitor solution, and I don't have the space.

 

If you still can't decide, set a budget, then rank these in order of importance: resolution, color coverage, HDR, refresh rate, size. For me it's color coverage > refresh rate > HDR, and that narrowed it down a lot.

5950x 1.33v 5.05 4.5 88C 195w ll R20 12k ll drp4 ll x570 dark hero ll gskill 4x8gb 3666 14-14-14-32-320-24-2T (zen trfc)  1.45v 45C 1.15v soc ll 6950xt gaming x trio 325w 60C ll samsung 970 500gb nvme os ll sandisk 4tb ssd ll 6x nf12/14 ippc fans ll tt gt10 case ll evga g2 1300w ll w10 pro ll 34GN850B ll AW3423DW

 

9900k 1.36v 5.1avx 4.9ring 85C 195w (daily) 1.02v 4.3ghz 80w 50C R20 temps score=5500 ll D15 ll Z390 taichi ult 1.60 bios ll gskill 4x8gb 14-14-14-30-280-20 ddr3666bdie 1.45v 45C 1.22sa/1.18 io  ll EVGA 30 non90 tie ftw3 1920//10000 0.85v 300w 71C ll  6x nf14 ippc 2000rpm ll 500gb nvme 970 evo ll l sandisk 4tb sata ssd +4tb exssd backup ll 2x 500gb samsung 970 evo raid 0 llCorsair graphite 780T ll EVGA P2 1200w ll w10p ll NEC PA241w ll pa32ucg-k

 

prebuilt 5800 stock ll 2x8gb ddr4 cl17 3466 ll oem 3080 0.85v 1890//10000 290w 74C ll 27gl850b ll pa272w ll w11

 


