Misleading monitor marketing

Stahlmann

! There is a TL/DR at the bottom for anyone who doesn't want to read through it all !

 

With the announcement of the latest 34" QD-OLED gaming monitors from Samsung and Dell, it bit me again. So here's a rant, hoping to open up some discussion:

 

Response times:

99% of all monitors, even non-gaming ones, are marketed with "1ms GtG" or even "0.5ms GtG" response times. To this day there is no LCD monitor that is actually that fast. The best TN/IPS/VA monitors top out at around 4ms average response times - examples being the Samsung G7 (VA), the MSI MAG251RX (IPS) and the HP Omen X27 (TN). Most mid- to high-end IPS monitors sit around 4-6ms, yet they all claim 1ms or lower. Sometimes these 1ms response times are actually achieved, but only in a few transitions, and in overdrive modes so ridiculously infested with overshoot (inverse ghosting) that those modes become nearly unusable.
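To put rough numbers on the trick: the advertised figure is usually the single fastest transition, while reviewers report the average over a whole grey-to-grey transition matrix. A minimal Python sketch with made-up measurements (none of these are real data for any monitor):

```python
# Hypothetical 3x3 grid of measured grey-to-grey transition times (ms).
# Rows = starting grey level, columns = target grey level.
transitions_ms = [
    [0.0, 5.2, 6.1],
    [4.8, 0.0, 3.9],
    [7.3, 4.4, 0.0],
]

def average_response(times):
    # Average over the off-diagonal transitions, the way reviewers
    # usually report a single "average GtG" figure.
    samples = [t for i, row in enumerate(times)
                 for j, t in enumerate(row) if i != j]
    return sum(samples) / len(samples)

def best_marketed_number(times):
    # What the box tends to quote: the single fastest transition.
    return min(t for i, row in enumerate(times)
                 for j, t in enumerate(row) if i != j)

print(f"average GtG: {average_response(transitions_ms):.1f} ms")
print(f"fastest single transition: {best_marketed_number(transitions_ms)} ms")
```

With these invented numbers the honest average lands around 5.3ms while the "marketing" number is 3.9ms - the same gap the spec sheets exploit.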

 

Tim from Hardware Unboxed (the best monitor review source imo) actually reviewed his first true 1ms display, which was the LG C1 OLED TV. As OLED doesn't need overdrive or other tuning to reach these response times, I'd expect pretty much all current OLED TVs to perform the same - and that while response times aren't even part of their marketing. Instead of taking that and advertising it honestly, Samsung and Dell announced their first QD-OLED monitors with "0.1ms" response times. Again we're away from realistic values, back to basically lying for the sake of having the lowest number among competitors.

 

HDR:

Now that's still my biggest point of concern, as this is where most buyers aren't knowledgeable enough to make an informed buying decision. Even monitors certified by VESA with their "VESA DisplayHDR XXX" badges are mostly straight-up non-HDR monitors. This includes all HDR400, most HDR600 and even some HDR1000 monitors. For an LCD monitor to have any HDR capability whatsoever, FALD (Full-Array Local Dimming) is needed. Otherwise, enabling HDR mostly just sets the backlight to its brightest value - also raising the black level - so darker parts of the screen lose detail for the sake of overall picture brightness. That is not what HDR is meant to be. In TVs this technology is used far more often and far more effectively; even mid-range TV models often have FALD backlights. And when it is implemented in LCD monitors, brands charge ridiculous prices for it.

In the end, the VESA DisplayHDR certification should be completely reworked, as it doesn't guarantee any kind of real HDR experience. DisplayHDR 1400 is so far the only tier that has ONLY allowed real HDR displays to pass. Add the fact that "G-Sync Ultimate" has been reworked into another useless certification that doesn't mean anything: before the rework it was a safe bet for true HDR displays, nowadays it basically only means there is a G-Sync module.
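A crude way to see why "HDR" without local dimming hurts black levels: with a single global backlight, the black level scales directly with peak brightness, while a FALD zone can dim itself independently. A toy Python model - all numbers are invented for illustration, not measurements of any real panel:

```python
# Typical native contrast ratio of an IPS-class LCD panel (assumed).
NATIVE_CONTRAST = 1000

def black_level(peak_nits, zone_dimming_factor=1.0):
    # Without local dimming the black level scales with the backlight:
    # black = peak / native contrast. A FALD zone showing dark content
    # can dim its own backlight, dividing the black level further.
    return peak_nits / NATIVE_CONTRAST / zone_dimming_factor

# Edge-lit "HDR400": backlight cranked to 400 nits, no dimming.
print(black_level(400))        # blacks glow at 0.4 nits
# FALD HDR1000 panel: a dark zone dimmed 100x.
print(black_level(1000, 100))  # blacks sit at 0.01 nits
```

The edge-lit display is brighter overall but its "black" is 40x brighter than the FALD panel's, which is exactly the crushed-shadow look the post describes.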

 

G-Sync superiority:

G-Sync is a dead standard imo. Nowadays FreeSync monitors also support its most important features, namely variable overdrive, which was the main selling point for G-Sync in the past. Implementing a native G-Sync module even introduces connectivity limitations, which is why there isn't a single G-Sync HDMI 2.1 monitor yet. The current module only allows up to DP 1.4 and HDMI 2.0 connections, limiting versatility on high-end displays. The Asus PG32UQX is one example of this.

 

Another problem is that the G-Sync module almost always adds at least one fan to the monitor, because its heat output can't be handled passively. Fans are the most common point of failure on these monitors, and even while they still work they always put out a quiet (depending on the model) but annoying high-pitched whine. For people with a quiet computer, these fans will almost always be the loudest and most annoying part of the setup. Even my Asus PG35VQ, where most reviewers said the fan wasn't audible, is easily and by far the loudest part of my setup. And I can confirm it's not my particular unit, because I had 4 of them in total (for RMA reasons) and they were all the same in terms of fan noise. My former LG 27GN950-B had similar fan noise, which further strengthens the point.

 

Then some brands (for example LG) started to market "G-Sync compatible" monitors as just "G-Sync", making people with AMD GPUs think they won't work for them, even though they are FreeSync monitors. Most of LG's UltraGear monitors have this problem.

 

Add the fact that 99% of all FreeSync monitors work without issues with G-Sync, and I don't really see any point in including G-Sync modules, driving up the monitor's price, introducing fans and cutting out features in the process.

 

Color gamut coverage / color accuracy:

So many people shopping for monitors think that a wider color gamut means "better" or more accurate colors. This is simply not the case. The MSI MAG274QRF-QD is a great example. It's one of the best monitors out there in terms of color gamut coverage, sitting around 83% Rec. 2020, but its color accuracy is horrible out of the box: normal desktop applications target the sRGB color space, which the monitor displays oversaturated to an extreme level. Only a later firmware update adding an sRGB picture mode made it usable for sRGB work. I get that many people like the oversaturated look of a wide-gamut monitor, which is why almost all of them ship with their gamut unclamped. But the same people also want a color-accurate monitor... So far Asus is the only brand I know of that ships (at least some of) their wide-gamut monitors in an sRGB mode, so colors are correct out of the box.
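A rough sketch of why the unclamped gamut oversaturates: an sRGB red pixel gets drawn at the panel's native (e.g. DCI-P3) red primary instead of the sRGB one. Using the standard CIE xy chromaticity coordinates of those primaries, the distance from the D65 white point gives a crude proxy for saturation (purely illustrative, not a proper colorimetric model):

```python
# Standard CIE xy chromaticities of the red primaries and white point.
SRGB_RED = (0.640, 0.330)
P3_RED   = (0.680, 0.320)
D65      = (0.3127, 0.3290)

def distance_from_white(xy):
    # Euclidean distance in the xy plane from D65 - a crude proxy
    # for how saturated a primary appears.
    return ((xy[0] - D65[0]) ** 2 + (xy[1] - D65[1]) ** 2) ** 0.5

# On an unclamped wide-gamut panel, a pure sRGB red is rendered at the
# panel's native P3 red primary instead of the intended sRGB one:
boost = distance_from_white(P3_RED) / distance_from_white(SRGB_RED)
print(f"red is rendered ~{(boost - 1) * 100:.0f}% further from white")
```

Even by this crude measure red lands noticeably further from white than intended, and on a monitor covering most of Rec. 2020 the effect is stronger still.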

 

The next problem is that in almost every case where a monitor has an sRGB mode that clamps the gamut, enabling it locks the user out of most settings - in most cases even the brightness control. And if the white point is too warm or too cold, you get a correct gamut clamp but still inaccurate colors and gamma, with nothing you can do about it.

 

Connectivity:

Most of you recently heard that virtually any HDMI port can be labeled 2.1 going forward. That means consumers will - again - have to dig way deeper than they should to confirm which HDMI 2.1 features are actually implemented in a monitor. I really hope DisplayPort and its 2.0 iteration won't fall into the same pit.
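In practice that means treating "HDMI 2.1" as a label that guarantees nothing and checking each headline feature individually. A sketch of that checklist in Python - the feature values below are hypothetical, describing an imaginary "2.1" monitor with a 2.0-bandwidth port:

```python
# Hypothetical spec-sheet audit for one imaginary "HDMI 2.1" monitor.
hdmi_features = {
    "48Gbps FRL bandwidth": False,  # many relabeled "2.1" ports are TMDS-only
    "4K @ 120Hz": False,            # impossible without the bandwidth above
    "VRR": True,
    "ALLM": True,
}

def is_full_hdmi21(features):
    # Only call it a real HDMI 2.1 port if every feature is present.
    return all(features.values())

print(is_full_hdmi21(hdmi_features))  # this "2.1" port fails the check
```

The point of the sketch is the burden it represents: the buyer, not the label, ends up doing the verification.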

 

TL/DR:

There are a few other points about monitor marketing that make me really mad, as they're obviously used to mislead consumers, making it harder for people to find the monitors they want or need. But the above are my main concerns. In the end, most specs are useless for judging a monitor's quality or capabilities, making reviews ESSENTIAL to even know what you're buying. It really shouldn't be like that. For me these things clearly fall under false and misleading marketing and shouldn't even be legal. Any brand marketing its products this way loses its credibility in my book - but sadly, it's all of them. And I bet most of these brands have lawyers making sure their marketing stays legal, or at least in some gray area.

 

Please feel free to also share your opinion, thoughts or concerns. And thank you to anyone who took the time reading through all of this.

If someone did not use reason to reach their conclusion in the first place, you cannot use reason to convince them otherwise.


Of the rants you listed, I find HDR advertising the most annoying of all. IMO HDR400 looks worse than SDR. I haven't seen HDR600, but I'm guessing that without FALD it would look just as bad.

 

Around mid-2020, when I wanted to upgrade from a 1080p TN monitor, I didn't know much about monitors, but I saw many monitors advertising HDR, which made me think "cool, monitors can do HDR!" Upon researching, I found out nearly all monitors with HDR are bogus.

 

Then there's response time, which I also initially thought nothing was wrong with. When researching the difference between IPS and VA, I found out VA has slower response times, so I was skeptical how a VA monitor could really have a 1ms response time. Again, bogus advertising.

 

I ended up trusting Hardware Unboxed videos, RTINGS reviews and a couple of other monitor reviews that actually measure a monitor's performance, and ignored all the bogus advertising on manufacturers' websites.



  • 4 months later...

Sorry if necro but it seemed to still be relevant. 

So I'm on AMD hardware, contemplating the 34" OLED from Dell. Surely it seems there should be no problem with VRR, but is it really that easy?
I also read that G-Sync Ultimate and FreeSync Premium Pro, these "new" certifications, seem to differ from the older ones in that they promise extra HDR functionality, like lowered response times with HDR enabled. And I have gotten the idea - maybe I am wrong - that the HDR400 cert is actually usable and a pretty good implementation on the Dell monitor?

Could you maybe say something about that? 

Other than that, great write-up! I worked in electronics sales and sold TVs a decade ago. Oh my, trying to explain the frequency of an "800Hz" TV vs an "400Hz" one, when the first was actually 100Hz and the second 200Hz... Even my senior coworkers didn't understand it.
They deliberately make it illogical to comprehend so that they can fool us more easily.


36 minutes ago, MrPapis said:

Sorry if necro but it seemed to still be relevant. 

So I'm on AMD hardware, contemplating the 34" OLED from Dell. Surely it seems there should be no problem with VRR, but is it really that easy?
I also read that G-Sync Ultimate and FreeSync Premium Pro, these "new" certifications, seem to differ from the older ones in that they promise extra HDR functionality, like lowered response times with HDR enabled. And I have gotten the idea - maybe I am wrong - that the HDR400 cert is actually usable and a pretty good implementation on the Dell monitor?

G-Sync Ultimate and FreeSync Premium Pro having any impact on HDR gaming is pure BS. Even non-certified monitors have no problem using VRR and HDR at the same time, and HDR typically doesn't add significant input lag. (The only exception would be TVs that don't have a "game mode" for HDR.)

 

The reason the "HDR400 True Black" certification is actually usable is that the monitor can still do over 1000 nits in HDR. Plus, as the name suggests, it delivers true blacks, which are always a great help in HDR. Pretty much any OLED - even with only 700 nits or so of peak brightness - can likely display a better HDR image than most HDR1000 or HDR1400 monitors with 1000+ nits of peak brightness, simply because of OLED's superior dimming and lack of blooming.



3 hours ago, Stahlmann said:

G-Sync Ultimate and FreeSync Premium Pro having any impact on HDR gaming is pure BS. Even non-certified monitors have no problem using VRR and HDR at the same time, and HDR typically doesn't add significant input lag. (The only exception would be TVs that don't have a "game mode" for HDR.)

 

The reason the "HDR400 True Black" certification is actually usable is that the monitor can still do over 1000 nits in HDR. Plus, as the name suggests, it delivers true blacks, which are always a great help in HDR. Pretty much any OLED - even with only 700 nits or so of peak brightness - can likely display a better HDR image than most HDR1000 or HDR1400 monitors with 1000+ nits of peak brightness, simply because of OLED's superior dimming and lack of blooming.

That's unfortunate... for my wallet...
Thank you for the quick response!

Oh well, I've been living the VA life exclusively for a long time; might be time to see what all the fuss is about!
 


44 minutes ago, MrPapis said:

That's unfortunate... for my wallet...
Thank you for the quick response!

Oh well, I've been living the VA life exclusively for a long time; might be time to see what all the fuss is about!
 

It has a few problems that unfortunately ruled it out for me, but it's still one of the best monitors around. As long as you can control your ambient light and fan noise isn't too much of a concern go for it.



  • 5 weeks later...
On 5/17/2022 at 5:43 PM, Stahlmann said:

G-Sync Ultimate and FreeSync Premium Pro having any impact on HDR gaming is pure BS. Even non-certified monitors have no problem using VRR and HDR at the same time, and HDR typically doesn't add significant input lag. (The only exception would be TVs that don't have a "game mode" for HDR.)

 

The reason the "HDR400 True Black" certification is actually usable is that the monitor can still do over 1000 nits in HDR. Plus, as the name suggests, it delivers true blacks, which are always a great help in HDR. Pretty much any OLED - even with only 700 nits or so of peak brightness - can likely display a better HDR image than most HDR1000 or HDR1400 monitors with 1000+ nits of peak brightness, simply because of OLED's superior dimming and lack of blooming.

'HDR typically doesn't add significant input lag.'
Untrue! Any monitor with a FALD backlight suffers greatly with response times once HDR is enabled; we saw this with the Acer Predator X35.
This typically increases lag.


2 hours ago, tierwelder said:

'HDR typically doesn't add significant input lag.'
Untrue! Any monitor with a FALD backlight suffers greatly with response times once HDR is enabled; we saw this with the Acer Predator X35.
This typically increases lag.

On most monitors it doesn't. And when it does, "greatly" is a gross exaggeration. Hardware Unboxed did all their testing of the PG35VQ with the FALD enabled - even the SDR testing - and the PG35VQ's input lag was still "in line with other gaming monitors tested" (Hardware Unboxed). Only the X35 has an increase of about 4ms, which is in no way noticeable unless you're hardcore into competitive games - and those gamers typically don't even use HDR or FALD. I have personally had 4 HDR-capable displays now (LG 27GN950-B, Samsung G7, LG C9, Asus PG35VQ), and on every model there is no noticeable difference in input lag with HDR on or off. The 3 monitors all have dimming zones that become active once HDR is enabled; the TV is OLED, so no additional dimming is required.



I own a few HDR displays; here is my summary, mainly on HDR performance. I also agree that HDR doesn't add significant input lag, and I don't see any input lag when turning on FALD either.

 

[attached image: summary table of HDR performance across the poster's displays]



4 hours ago, Stahlmann said:

On most monitors it doesn't. And when it does, "greatly" is a gross exaggeration. Hardware Unboxed did all their testing of the PG35VQ with the FALD enabled - even the SDR testing - and the PG35VQ's input lag was still "in line with other gaming monitors tested" (Hardware Unboxed). Only the X35 has an increase of about 4ms, which is in no way noticeable unless you're hardcore into competitive games - and those gamers typically don't even use HDR or FALD. I have personally had 4 HDR-capable displays now (LG 27GN950-B, Samsung G7, LG C9, Asus PG35VQ), and on every model there is no noticeable difference in input lag with HDR on or off. The 3 monitors all have dimming zones that become active once HDR is enabled; the TV is OLED, so no additional dimming is required.

No no no sir, worse display response times can genuinely affect input lag as well, and 4 milliseconds is a massive difference for a display.
Also, the G7 isn't HDR-capable; that monitor has like 10 zones.


29 minutes ago, tierwelder said:

No no no sir, worse display response times can genuinely affect input lag as well, and 4 milliseconds is a massive difference for a display.

The 4ms I mentioned is not pixel response time, but processing lag. So it will not affect motion clarity, only input lag.

 

29 minutes ago, tierwelder said:

Also, the G7 isn't HDR-capable; that monitor has like 10 zones.

Doesn't matter - enabling HDR doesn't increase input lag, which was my point. I never said it was any good at HDR.


