
Misleading monitor marketing

Stahlmann

! There is a TL/DR at the bottom for anyone who doesn't want to read through it all !

 

With the announcement of the latest 34" QD-OLED gaming monitors from Samsung and Dell, it bit me again. So here's a rant, hoping to open up some discussion:

 

Response times:

99% of all monitors, even non-gaming ones, are marketed with "1ms GtG" or even "0.5ms GtG" response times. To this day there is no LCD monitor that is actually that fast. The best TN/IPS/VA monitors top out at around 4ms average response times. Examples of such models would be the Samsung G7 (VA), the MSI MAG251RX (IPS) and the HP Omen X27 (TN). And most mid- to high-end IPS monitors sit around 4-6ms, yet they all advertise 1ms or lower response times. Sometimes these 1ms response times are actually achieved, but only in a few transitions, and in overdrive modes so ridiculously infested with overshoot (inverse ghosting) that those modes become almost unusable.
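For anyone wondering where reviewers' numbers come from, here's a rough sketch (in Python, with a made-up luminance trace, not real measurements) of the usual GtG method: response time is the time between 10% and 90% of the luminance swing, and overshoot is how far the panel blows past the target. It also shows how an overdriven transition can technically hit "1ms" while overshooting badly.

```python
# Sketch of how a GtG response-time figure is typically derived from a
# photodiode luminance trace (hypothetical sample data, not a real panel).
# Standard practice measures the time between 10% and 90% of the swing;
# overshoot is how far luminance peaks past the target level.

def gtg_response(samples, t_step_ms, start, target):
    """samples: luminance readings taken every t_step_ms milliseconds."""
    swing = target - start
    lo = start + 0.10 * swing   # 10% threshold
    hi = start + 0.90 * swing   # 90% threshold
    t_lo = t_hi = None
    for i, lum in enumerate(samples):
        if t_lo is None and lum >= lo:
            t_lo = i * t_step_ms
        if t_hi is None and lum >= hi:
            t_hi = i * t_step_ms
    response_ms = t_hi - t_lo
    overshoot_pct = max(0.0, (max(samples) - target) / swing * 100)
    return response_ms, overshoot_pct

# Hypothetical trace: black (0) to grey (100), sampled every 0.5 ms.
# Aggressive overdrive pushes luminance to 130 before settling back.
trace = [0, 5, 20, 60, 95, 120, 130, 115, 105, 100, 100]
rt, ov = gtg_response(trace, 0.5, start=0, target=100)
print(f"GtG: {rt} ms, overshoot: {ov:.0f}%")  # GtG: 1.0 ms, overshoot: 30%
```

So the spec sheet gets its "1ms", while the picture shows 30% inverse ghosting - which is exactly the trick being pulled.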

 

Tim from Hardware Unboxed (the best monitor review source imo) actually reviewed his first true 1ms display, which was the LG C1 OLED TV. As OLED doesn't need overdrive or other tuning to reach these response times, I'd expect pretty much all current OLED TVs to perform the same - and that while response times aren't even part of their marketing. Instead of taking that and marketing it honestly, Samsung and Dell announced their first QD-OLED monitors with "0.1ms" response times. Again, we're away from realistic values and back to basically lying for the sake of having the lowest number among competitors.

 

HDR:

Now that's still the biggest point of concern for me, as this is where most buyers are not knowledgeable enough to make an informed buying decision. Even monitors certified by VESA with their "VESA DisplayHDR XXX" certification are mostly straight-up non-HDR monitors. This includes all HDR400, most HDR600 and even some HDR1000 monitors. For a backlit LCD monitor to have any HDR capability whatsoever, FALD (Full-Array Local Dimming) is needed. Otherwise, enabling HDR mostly just sets the backlight to its brightest value - also raising the black level - sacrificing detail in the darker parts of the screen for the sake of overall picture brightness. That is not what HDR is meant to be. In TVs this technology is used a lot more, and a lot more effectively - even mid-range TV models often have FALD backlights. But when FALD is implemented in LCD monitors, brands charge ridiculous prices for it. In the end, the VESA DisplayHDR certification should be completely reworked, as it doesn't guarantee any kind of real HDR experience. The DisplayHDR 1400 certification is so far the only tier that has ONLY allowed real HDR displays to pass. Add to that the fact that "G-Sync Ultimate" has been reworked into another useless certification that doesn't mean anything. Before the rework it was a safe bet for true HDR displays; nowadays it basically only means there is a G-Sync module.
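To put rough numbers on the black-level problem, here's a tiny sketch (Python, with illustrative values I picked, not measurements) of why a single global backlight can't do HDR: on a panel with fixed native contrast, raising peak brightness raises the black level by exactly the same factor.

```python
# Rough arithmetic (illustrative numbers, not measurements) showing why
# cranking a single global backlight is not HDR: the black level rises
# in proportion with peak brightness, so contrast never improves.

def lcd_black_level(peak_nits, native_contrast):
    # Without local dimming, the black level is simply peak luminance
    # divided by the panel's fixed native contrast ratio.
    return peak_nits / native_contrast

native_contrast = 1000  # ballpark figure for a typical IPS panel

sdr_black = lcd_black_level(300, native_contrast)  # 0.3 nits
hdr_black = lcd_black_level(600, native_contrast)  # 0.6 nits

print(f"SDR black level: {sdr_black} nits")
print(f"'HDR' black level: {hdr_black} nits")  # doubled: shadows wash out
```

With FALD, zones showing dark content dim independently, so highlights get brighter while blacks stay black - that separation is the whole point of HDR.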

 

G-Sync superiority:

G-Sync is a dead standard imo. Nowadays FreeSync monitors also support its most important feature, namely variable overdrive, which was the main selling point for G-Sync in the past. Implementing a native G-Sync module in a monitor even introduces connectivity limitations, which is the reason why there is not a single G-Sync HDMI 2.1 monitor yet. The current module only allows for up to DP 1.4 and HDMI 2.0 connections, limiting versatility on high-end displays. The Asus PG32UQX would be one example of this.

 

Another problem is that the G-Sync module almost always adds at least one fan to the monitor, because its heat output cannot be dissipated passively. Fans are the most common point of failure on these monitors, and even while they still work, they always put out a quiet (depending on the model) but annoying high-pitched whine. For people with a quiet computer, these fans will almost always be the loudest and most annoying part of the setup. Even my Asus PG35VQ, where most reviewers said the fan wasn't audible, is easily and by far the loudest part of my setup. And I can confirm it's not my particular unit, because I've had 4 of them in total (for RMA reasons) and they were all the same in terms of fan noise. My former LG 27GN950-B had similar fan noise, which further strengthens the point.

 

Then some brands (for example LG) started to market "G-Sync compatible" monitors as just "G-Sync", making people with AMD GPUs think the monitors won't work for them, even though they are FreeSync monitors. Most of LG's UltraGear monitors have this problem.

 

Add the fact that 99% of all FreeSync monitors work without issues with G-Sync, and I don't really see any point in including G-Sync modules, driving up the monitor's price, introducing fans and cutting out features in the process.

 

Color gamut coverage / color accuracy:

So many people shopping for monitors think that a wider color gamut means "better" or more accurate colors. This is simply not the case. The MSI MAG274QRF-QD is a great example. It's one of the best monitors out there in terms of color gamut coverage, sitting around 83% Rec. 2020, but its color accuracy is horrible out of the box: all normal desktop content targets the sRGB color space, which the panel's native gamut oversaturates to an extreme level. Only a later firmware update adding an sRGB picture mode made it usable for sRGB work. I get that many people like the oversaturated look of a wide-gamut monitor, which is why almost all of them ship with their gamut unclamped. But the same people also want a color-accurate monitor... So far, Asus is the only brand I know of that ships (at least some of) their wide-gamut monitors in an sRGB mode, so colors are correct out of the box.
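For reference, this is roughly what an sRGB clamp does under the hood. A conceptual Python sketch - the matrix is the commonly cited approximate linear-light sRGB-to-Display-P3 conversion, and real firmware works on the panel's measured gamut, not a textbook one:

```python
# Sketch of what an sRGB clamp mode does conceptually: the incoming sRGB
# signal is re-mapped into the panel's wider native gamut so colors land
# where sRGB intends, instead of being stretched out to the panel's
# primaries. Matrix values are the commonly cited approximate linear-light
# sRGB -> Display P3 (D65) conversion; real panel gamuts vary.

SRGB_TO_P3 = [
    [0.8225, 0.1774, 0.0000],
    [0.0332, 0.9669, 0.0000],
    [0.0171, 0.0724, 0.9108],
]

def srgb_clamp(rgb_srgb):
    """Map a linear sRGB triple to native-panel (P3) drive values."""
    return [round(sum(m * c for m, c in zip(row, rgb_srgb)), 4)
            for row in SRGB_TO_P3]

# Full sRGB red should only drive a P3-class panel about 82% of the way
# toward its native red; an unclamped monitor drives it to 100% instead,
# which is exactly the oversaturation people see on the desktop.
red = srgb_clamp([1.0, 0.0, 0.0])
print(red)
```

When a monitor skips this step, every sRGB value gets stretched to the wider gamut - reds and greens in particular end up far more saturated than the content author intended.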

 

The next problem is that in almost every case where a monitor has an sRGB mode that clamps the gamut, enabling it locks the user out of most settings - in most cases even the brightness control. And if the white point is too warm or too cold, you get a correct gamut clamp but still inaccurate colors and gamma, with no way to fix them.

 

Connectivity:

Most of you have recently heard that virtually every HDMI port will be labeled "2.1" going forward. That means consumers will - again - have to dig way deeper than they should to confirm which HDMI 2.1 features are actually implemented in a monitor. I really hope DisplayPort and its 2.0 iteration won't fall into the same pit.

 

TL/DR:

There are a few other points about monitor marketing that make me really mad, as they're obviously used to mislead consumers, making it harder for people to find the monitors they want or need. But the above are my main points of concern. In the end, most spec sheets are useless for determining a monitor's quality or capabilities, making reviews ESSENTIAL to even know what you're buying. It really shouldn't be like that. For me these things clearly fall under false and misleading marketing and shouldn't even be legal. Any brand marketing its products this way loses its credibility in my book, but sadly it's all of them. And I bet most of these brands have lawyers making sure their marketing stays legal, or at least in some gray area.

 

Please feel free to share your own opinions, thoughts or concerns. And thank you to anyone who took the time to read through all of this.

About monitor marketing BS

 

Current Specs:

CPU: AMD Ryzen 5 5600X - Motherboard: ASUS ROG Strix B550-E - GPU: PNY RTX 3080 XLR8 Epic-X - RAM: 4x8GB (32GB) G.Skill TridentZ RGB 3600MHz CL16 - PSU: Corsair RMx (2018) 850W - Storage: 500 GB Corsair MP600 (Boot) + 2 TB Sabrent Rocket Q (Storage) - Cooling: EK, HW Labs & Alphacool custom loop - Case: Lian-Li PC O11 Dynamic - Fans: 6x Noctua NF-A12x25 chromax - AMP/DAC: FiiO K5 Pro - OS: Windows 11 preview - Monitor: ASUS ROG Swift PG35VQ - Mouse: Logitech G Pro + Powerplay - Keyboard: Logitech G915 TKL - Headphones: Beyerdynamic Amiron Home - Microphone: Antlion ModMic

 

Temperatures @steady state: Furmark + CinebenchR23 running for 1 hour. Fans @850RPM. Pump @1600RPM.

Water: 37°C

CPU: 73°C

GPU: 54°C


You and me both, brother. Tech these days has the marketing bombast of P.T. Barnum written all over it.


Of the rants you listed, I find HDR advertising the most annoying of all. IMO HDR400 looks worse than SDR. I haven't seen HDR600, but I'm guessing that without FALD it would look just as bad.

 

Around mid-2020, when I wanted to upgrade from a 1080p TN monitor, I didn't know much about monitors, but I saw many monitors that advertised HDR, which made me think "cool, monitors can do HDR!" But upon researching, I found out nearly all monitors with HDR are bogus.

 

Then there's response time, which I also initially thought nothing was wrong with. When researching the difference between IPS and VA, I found out VA has slower response times, so I was skeptical how a VA monitor could really have a 1ms response time. Again, bogus advertising.

 

I ended up trusting Hardware Unboxed videos, RTINGS reviews and a couple of other monitor reviews that actually measure a monitor's performance, and skipped all the bogus advertising shown on the manufacturers' websites.

CPU: RYZEN 9 5950X | GPU: SAPPHIRE NITRO+ AMD RADEON 6800XT | MB: MSI MAG X570 TOMAHAWK | RAM: G.SKILL TRIDENT Z NEO 32GB (2x16GB) DDR4-3600 | COOLING: NOCTUA NH-D15, CORSAIR ML120 & ML140 | CASE: IN-WIN 707 | 5.25" BAY: LG WH16NS60 INTERNAL BLU-RAY OPTICAL DRIVE | KEYBOARD: CORSAIR K95 PLATINUM XT BROWN SWITCH | MOUSE: LOGITECH G903 HERO | CONTROLLER: PDP AFTERGLOW WIRED CONTROLLER | PSU: SEASONIC PRIME PLATINUM 1000W | UPS: POWERSHIELD COMMANDER TOWER 1100VA | DISPLAYS: LG 34GN850, 2x DELL S2719DGF (portrait) | TV: LG C1 48" | AMPLIFIER: MARANTZ SR7013, YAMAHA AS-501 | SPEAKERS: DALI ZENSOR 5 & DALI ZENSOR VOKAL & JAMO A310 SATELLITE & 2x SVS-SB2000 | HEADPHONE DAC+AMP: TOPPING L30+E30 | HEADPHONE: SENNHEISER HD6XX | MICROPHONE: AUDIO-TECHNICA AT9934USB | BLU-RAY PLAYER: PANASONIC UB820



Sorry if this is a necro, but it seemed to still be relevant.

So I'm on AMD hardware, contemplating the 34" OLED from Dell. Surely it seems there should be no problem with VRR, but is it really that easy?
I also read that G-Sync Ultimate and FreeSync Premium Pro, these "new" certifications, seem to differ from the older ones in that they promise extra HDR functionality, like lowered response times with HDR enabled. And I have gotten the idea - maybe I'm wrong - that the HDR400 cert is actually usable and a pretty good implementation on the Dell monitor?

Could you maybe say something about that? 

Other than that, great write-up! I've worked in electronics sales and sold TVs a decade ago - oh my, trying to explain an "800Hz" TV vs a "400Hz" one, when the first was actually 100Hz and the second 200Hz... Even my senior coworkers didn't understand this.
They deliberately make it illogical to comprehend so that they can fool us more easily.


36 minutes ago, MrPapis said:

Sorry if this is a necro, but it seemed to still be relevant.

So I'm on AMD hardware, contemplating the 34" OLED from Dell. Surely it seems there should be no problem with VRR, but is it really that easy?
I also read that G-Sync Ultimate and FreeSync Premium Pro, these "new" certifications, seem to differ from the older ones in that they promise extra HDR functionality, like lowered response times with HDR enabled. And I have gotten the idea - maybe I'm wrong - that the HDR400 cert is actually usable and a pretty good implementation on the Dell monitor?

G-Sync Ultimate and FreeSync Premium Pro having any impact on HDR gaming is pure BS. Even non-certified monitors have no problem using VRR and HDR at the same time, and HDR typically doesn't add significant input lag. (The only exception would be TVs that don't have a "game mode" for HDR.)

 

The reason why the "HDR400 True Black" certification is actually usable is because the monitor can still do over 1000 nits in HDR. Plus, as the name suggests, true blacks, which are always a great help in HDR. Pretty much any OLED - even with only 700 nits or so of peak brightness - can likely display a better HDR image than most HDR1000 or HDR1400 monitors with 1000+ nits of peak brightness, simply because of OLED's per-pixel dimming and lack of blooming.



3 hours ago, Stahlmann said:

G-Sync Ultimate and FreeSync Premium Pro having any impact on HDR gaming is pure BS. Even non-certified monitors have no problem using VRR and HDR at the same time, and HDR typically doesn't add significant input lag. (The only exception would be TVs that don't have a "game mode" for HDR.)

 

The reason why the "HDR400 True Black" certification is actually usable is because the monitor can still do over 1000 nits in HDR. Plus, as the name suggests, true blacks, which are always a great help in HDR. Pretty much any OLED - even with only 700 nits or so of peak brightness - can likely display a better HDR image than most HDR1000 or HDR1400 monitors with 1000+ nits of peak brightness, simply because of OLED's per-pixel dimming and lack of blooming.

That's unfortunate... for my wallet.
Thank you for the quick response!

Oh well, I've been living the VA life exclusively for a long time - might be time to see what all the fuss is about!


44 minutes ago, MrPapis said:

That's unfortunate... for my wallet.
Thank you for the quick response!

Oh well, I've been living the VA life exclusively for a long time - might be time to see what all the fuss is about!

It has a few problems that unfortunately ruled it out for me, but it's still one of the best monitors around. As long as you can control your ambient light and fan noise isn't too much of a concern, go for it.


