
HDR comparison - PG35VQ vs AW3423DW

It is not worth it.

 

1. It's not a true HDR 1000 monitor, which requires at least 850-1150 nits in a 50%-100% window. It's effectively an HDR 400 monitor (even less, due to ABL) with the gimmick of hitting 1000 nits in a 1-2% window. You will see frequent ABL dimming in HDR 1000 mode.

 

2. On top of its other issues, the monitor flickers in a way that's not good for the eyes.

 

 

=============Part 1: flickering================

 

To record the flickering, the camera shutter speed is set to 1/1000s or faster.

 

In the tests below, the monitor flickers. The flickering is not consciously perceptible, but it is far more likely to cause eye strain than traditional DC dimming or high-frequency PWM dimming.

 

Due to OLED's physical properties, changing the drive current alone affects both brightness and color accuracy. OLED manufacturers therefore combine PWM with analog modulation, aka "emulated DC dimming", to control brightness while keeping color accurate. But OLED still flickers because the hybrid implementation is imperfect.

 

In this case, the AW3423DW tries to use emulated DC dimming but ends up with a worse result: the flicker frequency is the same as the monitor's refresh rate, which is low.
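If you want to verify the flicker frequency yourself, a light-sensor trace plus an FFT is enough. Here is a minimal sketch; the 10 kHz photodiode capture and the synthetic 175 Hz, 5% modulation trace are assumptions standing in for real measured data:

```python
import numpy as np

# Hypothetical luminance trace: a 175 Hz flicker (the AW3423DW's max refresh
# rate) sampled at an assumed 10 kHz, mimicking per-refresh brightness dips.
fs = 10_000                                   # sensor sample rate in Hz
refresh = 175                                 # flicker frequency to recover
t = np.arange(0, 1.0, 1 / fs)
luminance = 1.0 + 0.05 * np.sin(2 * np.pi * refresh * t)  # 5% modulation

# The dominant non-DC frequency in the spectrum is the flicker frequency.
spectrum = np.abs(np.fft.rfft(luminance - luminance.mean()))
freqs = np.fft.rfftfreq(len(luminance), 1 / fs)
flicker_hz = freqs[np.argmax(spectrum)]
print(flicker_hz)  # → 175.0
```

With one second of capture the FFT bins are 1 Hz apart, so a refresh-locked flicker lands exactly on its own bin.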

 

To make things worse, the lack of a polarizing layer means it needs to be used in dim ambient light, and because of ABL its brightness fluctuates. In this particular video, every parry triggers ABL, though the camera doesn't show it clearly. Eye strain can set in very quickly in scenes where brightness fluctuates, even if the overall brightness is under 400 nits.

 

The combination of these sits in a grey area: whether it causes eye damage in long-term use is an open question. In general, the flicker is not healthy for the eyes, especially in a dim environment.

 

The package, the manual, and the Dell website describe "flicker-free" only as one of Dell's product features, with no indication of a TÜV flicker-free certification.

 

There is a TÜV certification on Certipedia stating this model was certified flicker-free, but the description specifically mentions a flat panel, so it may refer to an early model.

 

The marketing trick is that Dell can still trademark its product features as ComfortView, which includes only the low blue light TÜV certification.

 

I don't recommend this monitor for long, intensive daily use, e.g. a gamer driving a single monitor in a basement for 3 years.

 

If you have multiple monitors and tend to replace them every year, this monitor should probably be fine.

 

 

 

=============Part 2: HDR================

 

Now let's talk about HDR:

 

The HDR 1000 videos are from the Spears & Munsil UHD HDR Benchmark.

 

An accurate HDR comparison in SDR pictures needs at least two exposure settings. This is how to compare HDR in SDR.

 

The exposures are ISO 100 at 1/125s and ISO 100 at 1/25s.

 

The middle monitor shows the reference luminance level. Where the curves flatten at the top, that level is 1000 nits.

 

In the 1/125s pictures, details are preserved on both monitors. In the 1/25s pictures, details are blown out, but relative brightness is more pronounced in SDR. The difference in relative brightness shows at both settings: the true HDR 1000 monitor appears overexposed at 1/25s, while the AW3423DW appears dim at 1/125s.

From the comparison you can see that a true HDR 1000 monitor delivers 2x-3x more luminance, i.e. more contrast to the eyes, than the AW3423DW in some high-APL 800 nit scenes, without losing any details or causing distracting blooming. A true HDR monitor therefore delivers more realistic images. In HDR (and in SDR), fast pixel response aside, the AW3423DW is not at the same level as a true HDR 1000 monitor because of ABL; it only looks the same when displaying a small bright window on a large black background.

Also, in these high-APL HDR scenes, blooming is not noticeable because the central object emits 1000 nits, making edge blooming invisible to the eyes and even to the camera. In average HDR content, blooming is not noticeable either, unless you actively search for it.

 

I have to warn you: most people don't know HDR very well. They also don't have both monitors to compare, only low-brightness OLEDs. I suggest you don't listen to people who have never used a true HDR monitor, only a mere 200 nit full-field OLED TV.

 

OLED is always a mid-tier monitor, and it is going to stay mid-tier compared to FALD LCD.

 

Its brightness struggles, and so does its contrast. I mean what I say: OLED loses to FALD LCD on contrast when the brightness is not enough. The infinite (x/0) contrast of OLED is not true contrast, because it comes at the cost of brightness. The higher the brightness goes without raising the black level, the more contrast the monitor displays, and FALD LCD wins in this regard. It's also why the premium/flagship products out there are always FALD LCDs. Most people don't understand this.
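To make the brightness/contrast point concrete, here is a tiny sketch of the usual peak/black contrast calculation. The nit values are illustrative assumptions, not measurements of either monitor:

```python
def contrast_ratio(peak_nits: float, black_nits: float) -> float:
    """Simple peak/black contrast; a true-zero black gives 'infinite' contrast."""
    return float("inf") if black_nits == 0 else peak_nits / black_nits

# Illustrative numbers only:
oled_after_abl = contrast_ratio(250, 0)    # OLED held at 250 nits by ABL: inf, but dim
fald_lcd = contrast_ratio(1000, 0.01)      # FALD zone at 1000 nits over a 0.01 nit black
print(oled_after_abl, fald_lcd)            # → inf 100000.0
```

The point of the sketch: the ratio alone hides the absolute luminance swing, which is what the argument above is about.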


In some cases, a true HDR monitor running SDR at 400 nits can look even better than the AW3423DW in HDR, once ABL pushes the OLED below 400 nits.

 

True HDR 1000 vs AW3423DW HDR

Spoiler

[gallery: 70+ side-by-side comparison photos]

 



YCbCr SDR 400 vs AW3423DW HDR

Spoiler


[gallery: side-by-side comparison photos]

 

=============CONCLUSION================

 

The OLED/QD-OLED is struggling. The brightness is not enough. And the PWM fatigue is more severe because the OLED tries to use emulated DC dimming but ends up with a worse result: the flicker frequency is the same as the monitor refresh rate, which is low. The flicker is not healthy for the eyes, especially in a dim environment. My eyes become rather irritated when looking at this monitor. It's not as comfortable as other true HDR 1000 monitors, even though those monitors are much brighter.

 

Traditional DC dimming or high-frequency PWM doesn't have this problem and is safe for long-term use. And HDR monitors are heading toward 10,000 nits for image quality.

 

This comparison of the latest QD-OLED against a 4-year-old 512-zone FALD true HDR 1000 monitor was made to prove the point.

 

To achieve high-level HDR performance, OLED has to solve flickering and brightness one way or another. That won't happen very soon.

 

So I suggest you don't buy it. Wait. This monitor won't hold up even for a short period of time, considering better FALD LCDs are becoming much cheaper.


As an owner of both an Asus PG35VQ and an LG C9, I can say a few things about this particular comparison. I have to say these pictures DO NOT represent the PG35VQ's HDR performance at all. The blooming is practically invisible in these pictures, but in real life it's very apparent in lots of these scenes. I also ran through the standard HDR demos on YouTube when I first got it.

 

The Asus might outperform it in raw brightness, but everything else is worse. The biggest problem for the Asus is blooming, especially while viewing HDR. Since it only has a native contrast ratio of about 2000:1 and "only" 512 local dimming zones, it's limited in that regard. If you have a black loading screen, for example, with a white loading icon in the bottom right, a significant portion of the screen has raised blacks.

 

On my OLED TV, on the other hand, you don't get any blooming at all. And even though it's significantly dimmer, 700 nits is still more than enough to create an amazing HDR experience when combined with OLED's pixel-level dimming and therefore lack of blooming. And since the AW3423DW has a peak brightness of over 1000 nits, combined with an even wider color volume, I can only imagine that the HDR experience will be even better than the C9.

 

And people forget this all the time: it's very rare to see large bright areas in HDR (most of them are snowy scenes). Peak brightness is used to make highlights stand out, which is why peak brightness in windows of 10% or less matters most. To this day I haven't noticed the ABL on my TV once, at least in content like games or movies. Of course it's present when you run your desktop at full brightness, but the AW3423DW can do 250 nits fullscreen, which is more than enough. If you're not in an extremely bright environment (where the AW3423DW would be the wrong choice either way), you probably won't even go all the way to 250 nits. It's just uncomfortably bright in a dark environment, at least for desktop use, where large portions of the screen can be white.

 

If a movie in a snowy environment is graded so that the APL (average picture level, or average scene brightness) is over 1000 nits, then the graders didn't understand HDR to begin with. Done right, the snow should be around 100 nits in direct sunlight, with only reflections like "twinkling" going above 1000 nits. That way the snow looks much more realistic than just a bright white blotch.
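To put numbers on that grading rule: HDR10 content is mastered on the PQ curve (SMPTE ST 2084), which maps absolute nits to a 0-1 signal. A small sketch, using the 100 nit snow and 1000 nit twinkle from the example above:

```python
def pq_encode(nits: float) -> float:
    """SMPTE ST 2084 (PQ) inverse EOTF: absolute luminance -> 0..1 signal."""
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    y = (nits / 10_000) ** m1
    return ((c1 + c2 * y) / (1 + c3 * y)) ** m2

# Sunlit snow graded at ~100 nits vs a specular "twinkle" at 1000 nits:
snow, twinkle = pq_encode(100), pq_encode(1000)
print(round(snow, 3), round(twinkle, 3))
```

Note how roughly half the signal range sits at or below 100 nits; the top of the curve is reserved for highlights, which is exactly why grading whole snowy scenes above 1000 nits wastes it.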

 

And lastly: being able to do >1100 nit fullscreen white flashes is not necessarily a good thing. It's bright enough to actually make you squint. A 250 nit fullscreen flash is more than enough to have the effect of a bright flash without being uncomfortable to the user.

 

TL;DR: It's not even a comparison, really. The AW3423DW is just a much, much better HDR display.

And that's coming from a PG35VQ owner, so if anything, I should be biased against the Dell. But I do have to say, the PG35VQ is an amazing monitor with great HDR capabilities. In fact, I'd still count it as one of the best HDR monitors available. It's simply not "the best"; that goes to the AW3423DW.

 

The main problem with this comparison is that OP, the creator of these pictures, tries to compare HDR displays on an SDR display, which is simply impossible. It's the same as judging how audiophile headphones sound through your $20 desktop speakers. With these comparisons you have to trust the judgement of the content creator, as your own will be highly inaccurate.

About monitor marketing BS

 

CPU: AMD Ryzen 5 5600X - Motherboard: ASUS ROG Strix B550-E - GPU: PNY RTX 3080 XLR8 Epic-X - RAM: 4x8GB (32GB) G.Skill TridentZ RGB 3600MHz CL16 - PSU: Corsair RMx (2018) 850W - Storage: 500 GB Corsair MP600 (Boot) + 2 TB Sabrent Rocket Q (Storage) - Cooling: EK, HW Labs & Alphacool custom loop - Case: Lian-Li PC O11 Dynamic - Fans: 6x Noctua NF-A12x25 chromax - AMP/DAC: FiiO K5 Pro - OS: Windows 10 Pro - Monitor: LG C2 OLED 42" - Mouse: Logitech G Pro - Keyboard: Logitech G915 TKL - Headphones: Beyerdynamic Amiron Home - Microphone: Antlion ModMic


9 hours ago, Stahlmann said:

As an owner of both an Asus PG35VQ and an LG C9 

Your reply is biased at best. I suggest you take your camera and play the benchmark disc on both of your displays for a while.

 

Maybe you don't own a PG anymore and cannot compare the difference.

 

A 1000 nit full-field PG will crush the sub-200 nit full-field C9 very easily in the benchmark video, since most scenes average around 800 nits of output. And you don't notice blooming because the central object is emitting 1000 nits of luminance, making edge blooming unnoticeable to the eyes; even the camera cannot capture it.


1 minute ago, MonitorFlicker said:

Your reply is biased at the best. I suggest you take your camera, have the benchmark disk play on your both displays for a while.

Not necessary. I calibrated both displays professionally, so I know full well what either of them is capable of. Plus I have thousands of hours of use time with both, which only further backs up my experience.

 

1 minute ago, MonitorFlicker said:

Maybe you don't own a PG anymore, cannot compare the difference.

I have the PG35VQ sitting right in front of me. It's the display I'm using right now. And the C9 OLED is directly behind me, used almost every evening to watch movies.

 

1 minute ago, MonitorFlicker said:

1000nits full-field PG will crash sub 200nits full-field C9 very easily in the benchmark video.

Of course it does, and I said so in my post above. Thing is, you rarely see full-field white images in HDR content. And if you do, you don't want a monitor blasting 1000 nits full-field at you. If you've ever owned a display with these capabilities, you know that this is very uncomfortable for your eyes. A 150-200 nit fullscreen white flash has the same effect just because of the big portion of the screen being used. It's equally capable of delivering a "flashbang effect" when transitioning from a dark scene to fullscreen white.

 

1 minute ago, MonitorFlicker said:

And you don't notice blooming because central object is emitting 1000 nits luminance, making edge blooming unnoticeable to the eyes, even the camera cannot capture it.

No, I notice blooming because I sit in front of it during these test scenes and see it.

 

 

AGAIN, you cannot look at pictures on your SDR monitor and try to compare two HDR monitors:

3 hours ago, Stahlmann said:

The main problem with this comparison is that OP and the creator of these pictures tries to compare HDR displays on an SDR display, which is simply impossible.



I mean, one is an LCD, and while much better than regular non-HDR monitors, it's still an LCD, so not really proper HDR the way a per-pixel-lit display is. Haloing is a big no for me.

Ryzen 7 3800X | X570 Aorus Elite | G.Skill 16GB 3200MHz C16 | Radeon RX 5700 XT | Samsung 850 PRO 256GB |Mousepad: Skypad 3.0 XL | Mouse: Zowie S1-C |Keyboard: Corsair K63 MX red | OS: Windows 11


1 hour ago, Stahlmann said:

I have the PG35VQ sitting right in front of me. It's the display i'm using right now. And the C9 OLED is directly behind me and used almost every evening to watch movies.

Nice. Then play that benchmark and tell me which one looks better; much better, in fact. That disc is cheap.
 

You also talk about how to compare monitors. Not everybody has a 4,000 nit reference HDR monitor and a proper camera setup to record actual HDR comparison videos. There is no way for a general audience to know how HDR looks when they are using an SDR monitor.
 

I know people will talk about numbers and specs, because that is all they have.

 

What I did is exactly how HDR photography works: combining photos with different exposures to generate an HDR result.
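For anyone curious, the multi-exposure idea can be sketched in a few lines. This is a toy example with made-up pixel values and a simple well-exposedness weight; it is not my actual processing pipeline:

```python
import numpy as np

# Two hypothetical exposures of the same three pixels (values 0..1):
# "fast" (1/125s) keeps highlight detail, "slow" (1/25s) lifts the shadows.
fast = np.array([0.02, 0.20, 0.70])   # shadows crushed, highlights intact
slow = np.array([0.10, 0.85, 1.00])   # shadows visible, highlight clipped

def weight(img):
    """Well-exposedness: peaks at mid-grey, falls to 0 at black/white clip."""
    return 1.0 - np.abs(img - 0.5) * 2

# Blend each pixel by how trustworthy each exposure is there: the core idea
# behind merging bracketed exposures into one HDR-like image.
w_fast, w_slow = weight(fast), weight(slow)
fused = (w_fast * fast + w_slow * slow) / (w_fast + w_slow + 1e-9)
print(fused.round(3))
```

The clipped highlight pixel gets zero weight from the slow exposure, so the fused result keeps the fast exposure's detail there.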

So tell me how good your C9 actually looks compared to the PG35VQ in the benchmark.


13 minutes ago, Doobeedoo said:

I mean one is an LCD and while much beteer than regular non HDR monitors it's still and LCD so not really proper HDR in a way as per pixel lit display.  Haloing is a big no for me. 

You don't notice the haloing effect when the overall luminance is high. Edge haloing will be unnoticeable to the eyes, and to the camera as well. You can tell from the pictures that there is no haloing effect. This is also one of the reasons Dolby reference HDR monitors are all dual-layer LCDs; they replaced the OLED ones.

The other reasons OLED cannot keep up with HDR:

 

1. The brightness cannot be sustained.

 

2. PWM causes eye fatigue. You can see that in my videos of the AW3423DW flickering, while the other two DC dimming HDR 1000 monitors don't have the issue, even while displaying higher, more comfortable brightness at the same time.


I'm not sure about the AW3423DW, as that monitor is no longer on my wanted list; it has too many cons.

 

But I am pretty sure LCD still cannot compete with OLED in HDR movie playback.

 

I own an LG C9 and yes, it doesn't reach 1000 nits, but it still outperforms FALD LCD (mini-LED is also LCD) for movie playback. Brightness is not the only thing that makes HDR look nice.

 

I also own the Samsung Odyssey Neo G9, a 2048-zone FALD monitor that shows 1015 nits in the Windows display settings. It is not certified, but it is confirmed HDR 1000 capable; I will ignore the "Quantum HDR 2000" marketing gimmick. It also looks nice in HDR with FALD on for movies and gaming, but it is still nowhere near OLED's HDR output for movies, or even for gaming. I don't use my C9 for gaming, though, as 55" is way too big for gaming for me, although I have tried it before.

 

The haloing effect is not noticeable in gaming, agreed, but it is still very noticeable on the Windows desktop. I've checked some FALD monitor reviews on YouTube, including this PG35VQ, which is worse than the Samsung Odyssey Neo G9 with FALD on. FALD is still not a good tech for desktop use, and this is an issue because no monitor can automatically switch FALD on and off when Windows is set to HDR. OLED doesn't have this issue, but OLED has another issue: ABL.

 

As for the flickering you mentioned, I have not seen this issue at all on my OLED TV, and of course not on my FALD monitor either. I'm not sure why the AW3423DW would have it, but sometimes flickering recorded by a camera doesn't affect human eyes, because human eyes cannot see it.

 

In short: there is still no perfect PC monitor that is great for everything.

PC: AMD Ryzen 9 5900X, Gigabyte Geforce RTX 3080 Vision OC 10G, X570 AORUS Elite WIFI Motherboard, HyperX FURY 32GB DDR4-3200 RGB RAM, Creative Sound Blaster AE-9 Sound Card, Samsung 970 Evo Plus M.2 SATA 500GB, ADATA XPG SX8200 Pro M.2 SATA 2TB, Asus HyperX Fury RGB SSD 960GB, Seagate Barracuda 7200RPM 3.5 HDD 2TB, Cooler Master MASTERLIQUID ML240R ARGB, Cooler Master MASTERFAN MF120R ARGB, Cooler Master ELV8 Graphics Card Holder ARGB, Asus ROG Strix 1000G PGU, Lian Li LANCOOL II MESH RGB Case, Windows 11 Pro (22H2).


Laptop: Asus VivoBook 15 OLED: Intel® Core™ i3-1125G4, Intel UHD, 8 GB RAM, Micron NVMe 512 GB, Windows 11 Home (21H2), Illegear Z5 SKYLAKE: Intel Core i7-6700HQ, Nvidia Geforce GTX 970M, 16 GB RAM, ADATA SU800 M.2 SATA 512GB, Windows 11 Pro (22H2).

 

Monitor: Samsung Odyssey G8 Neo 32" 3840x2160 240hz mini-LED VA HDR, Samsung Odyssey G9 Neo 49" 5120x1440 240hz mini-LED VA HDR, LG UltraGear Gaming Monitor 34" 34GN850 3440x1440 144hz (160hz OC) NanoIPS HDR, LG Ultrawide Gaming Monitor 34" 34UC79G 2560x1080 144hz IPS SDR, LG 24MK600 24" 1920x1080 75hz Freesync IPS SDR, BenQ EW2440ZH 24" 1920x1080 75hz VA SDR.


Input Device: Logitech G913 Lightspeed Wireless RGB Mechanical Gaming Keyboard, Logitech G903 Lightspeed HERO Wireless Gaming Mouse, Logitech Pro X, Logitech MX Keys, Logitech MX Master 3, XBOX Wireless Controller Covert Forces Edition, Corsair K70 RAPIDFIRE Mechanical Gaming Keyboard, Corsair Dark Core RGB Pro SE Wireless Gaming Mouse, Logitech MK850 Wireless Keyboard & Mouse Combos.


TV Entertainment: LG 55" C9 OLED HDR Smart UHD TV with AI ThinQ®, 65" Samsung AU7000 4K UHD Smart TV, Nvidia Shield TV Pro (2019 edition), Apple TV 4K (2017 & 2021 Edition), Chromecast with Google TV, Sony UBP-X700 UltraHD Blu-ray, Panasonic DMP-UB400 UltraHD Blu-ray, LG SK9Y 5.1.2 channel Dolby Atmos, Hi-Res Audio SoundBar.

 

Mobile & Smart Watch: Samsung Galaxy S22 Ultra (Burgundy), Samsung Galaxy Watch4 (Green), Huawei Watch GT (Saddle Brown).

 

Others Gadgets: Logitech G560 2.1 USB & Bluetooth Speaker, Logitech Z625 2.1 THX Speaker, Edifier M1370BT 2.1 Bluetooth Speaker, Sony MDR-Z1R, Sony WH-1000XM5, Sony WH-1000XM4, Apple AirPods Pro, Samsung Galaxy Buds2, Asus SBW-06D2X-U Blu-ray RW Drive, 70 TB Ext. HDD, j5create JVCU100 USB HD Webcam with 360° rotation, ZTE UONU F620, Maxis Fibre WiFi 6 Router, Fantech MPR800 Soft Cloth RGB Gaming Mousepad, Fantech Headset Headphone Stand AC3001S RGB Lighting Base Tower, Infiniteracer RGB Gaming Chair


11 minutes ago, MonitorFlicker said:

So you tell me how good your C9 actually looks compared to PG35VQ in the benchmark.

In my personal opinion: much better. Blooming is too big an issue; the PG35VQ could get to 2000 nits and I'd still prefer the OLED, even if it's only 600 nits.

 

2 minutes ago, MonitorFlicker said:

You don't notice the haloing effect when the over all luminance is high.

But HDR often is not overall bright. That's why blooming/haloing IS an issue. If you can tell me with a straight face that you don't see blooming on your PG35VQ, then you either don't know what blooming is or you're wearing sunglasses.

 

2 minutes ago, MonitorFlicker said:

The edge haloing effect will be unnoticeable to the eyes, to the camera as well. You can tell in the pictures there is no haloing effect. This is also one of the reason Dolby reference HDR monitors are all dual layers LCDs. They replaced the OLED.

Sure, they replaced OLED for reference monitors, but these dual-layer LCDs use two layers to reproduce OLED-like blacks and help suppress blooming.

 

2 minutes ago, MonitorFlicker said:

Other reasons why OLED cannot keep with with HDR is:

 

1.the brightness cannot be sustained.

If you want to go that route, you can also say LCD isn't good enough for HDR because no LCD can hit 10,000 nits.

That's why tone mapping exists: To map the HDR content's brightness and color values to your TV's capabilities.

 

2 minutes ago, MonitorFlicker said:

2. PWM causes eye fatigue. You can see that in my videos about AW3423DW flickering. While other two DC dimming HDR 1000 monitors doesn't have the issue while displaying higher, even more comfortable brightness at the same time. 

OLEDs don't have noticeable flickering. It's nothing like PWM flickering in past monitors. They don't fully turn off between frames (which is what flicker would be); they have a slight dip in brightness with every refresh. It's not an issue for anyone I personally know, though it may be an issue for those who are susceptible to headaches.

 

This is what PWM flicker looks like: (brightness drops to 0 in between frames)


 

This is what an OLED (AW3423DW) looks like: (only slight variations)

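The difference between the two charts can be put into a single number with a modulation-depth metric. The waveforms below are illustrative stand-ins, not measurements of either display:

```python
import numpy as np

t = np.linspace(0, 1, 1000)                    # one refresh period

# Classic PWM: backlight fully off for part of every cycle (60% duty here).
pwm = np.where(t < 0.6, 1.0, 0.0)

# OLED-style per-refresh dip: brightness briefly sags ~5%, never hits zero.
oled_dip = 1.0 - 0.05 * (t > 0.9)

def percent_flicker(wave):
    """Modulation depth as used in flicker metrics: (max-min)/(max+min) * 100."""
    return 100 * (wave.max() - wave.min()) / (wave.max() + wave.min())

print(percent_flicker(pwm), round(percent_flicker(oled_dip), 1))  # → 100.0 2.6
```

Full on/off PWM scores 100% modulation; a 5% brightness dip scores under 3%, which is the "slight variations" point the charts were making.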



1 minute ago, MonitorFlicker said:

You don't notice the haloing effect when the overall luminance is high. The edge haloing effect will be unnoticeable to the eyes, and to the camera as well. You can tell in the pictures there is no haloing effect. This is also one of the reasons Dolby reference HDR monitors are all dual-layer LCDs. They replaced the OLED.

Other reasons why OLED cannot keep with with HDR is:

 

1.the brightness cannot be sustained.

 

2. PWM causes eye fatigue. You can see that in my videos about AW3423DW flickering. While other two DC dimming HDR 1000 monitors doesn't have the issue while displaying higher, even more comfortable brightness at the same time. 

I can definitely notice haloing when using the monitor, and not just in certain edge cases with overall high-luminance scenes. There are nowhere near enough zones, not even close, for it not to be an issue.

These gaming mini-LED monitors are obscenely expensive already, and dual-layer LCDs, the pro displays aside, are in their own category. Also, OLED is still used alongside them.

No mini-LED monitor is worth its price, especially now that QD-OLED is coming. We just need to see more models and the tech improve a bit more. It's already much cheaper than mini-LED, though.

The high peak brightness may not be there, but it can sustain brightness decently enough; again, this is just the first monitor of its kind. I won't be using the display outside or in a super bright room. Aside from that, it's better in every other way.

The flicker can differ on camera versus in person, depending on whether PWM in the kHz range is used, and this one is not the same as the OLED TVs or some monitors I've seen. It also may not be an issue for everyone in real use. Same with strobing: some like it, some don't.



On 6/14/2022 at 9:20 PM, Stahlmann said:

In my personal opinion: much better. Blooming is too big of an issue, so the PG35VQ could get to 2000 nits and I'd still prefer the OLED, even if it's only 600 nits.

 

But HDR content often isn't bright overall. That's why blooming/haloing IS an issue. If you can tell me with a straight face that you don't see blooming on your PG35VQ, then you either don't know what blooming is or you're wearing sunglasses.

 

Sure, they replaced OLED for reference monitors, but these dual-layer LCDs use the second layer to reproduce OLED-like blacks and help suppress blooming.

 

If you want to go that route, you can also say LCD isn't good enough for HDR because no LCD can hit 10,000 nits.

That's why tone mapping exists: to map the HDR content's brightness and color values to your TV's capabilities.
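Tone mapping in its simplest form can be sketched like this. This is the classic Reinhard-style curve for illustration only, not the mapping any particular TV actually implements:

```python
# Minimal sketch of display tone mapping: compress scene luminance (nits)
# into what the panel can show. Classic Reinhard-style roll-off, NOT the
# proprietary algorithm of any specific TV.

def tone_map(luma_in, display_peak):
    """Map an HDR luminance value (nits) into [0, display_peak) nits."""
    x = luma_in / display_peak           # normalize to the display's peak
    return display_peak * x / (1.0 + x)  # roll off: never exceeds the peak

# A 4000-nit highlight on a 1000-nit display gets rolled off, not clipped:
print(tone_map(4000, 1000))  # 800.0 nits
print(tone_map(100, 1000))   # ~90.9 nits: midtones are mostly preserved
```

The point of the roll-off is that out-of-range highlights keep some gradation instead of clipping flat at the panel's peak.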

 

OLEDs don't have noticeable flickering; it's nothing like the PWM flickering in past monitors. They don't fully turn off between frames (which is what flicker would be); they have a slight dip in brightness with every refresh. It's not an issue for anyone I personally know, though it may be for those who are susceptible to headaches.

 

This is what PWM flicker looks like: (brightness drops to 0 between frames)

 

 

This is what an OLED (AW3423DW) looks like: (only slight variations)
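The difference between the two waveforms can be put into a number: "percent flicker" (modulation depth) is the standard metric used in flicker guidelines such as IEEE 1789. The min/max luminance values below are illustrative, not measurements of these monitors:

```python
# Percent flicker (modulation depth) distinguishes the two cases described
# above: PWM that drops to zero vs. a small refresh-rate brightness dip.
# The luminance values are illustrative, not measurements.

def percent_flicker(l_max, l_min):
    """(Lmax - Lmin) / (Lmax + Lmin) * 100 -- standard modulation metric."""
    return (l_max - l_min) / (l_max + l_min) * 100

print(percent_flicker(250, 0))    # 100.0 -> classic PWM, full on/off
print(percent_flicker(250, 225))  # ~5.3  -> small dip with every refresh
```

A full on/off PWM cycle is 100% modulation regardless of frequency, while a small per-refresh dip stays in the single digits, which is the distinction being argued here.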

 

Don't even try to fool me. I do HDR work. You are just biased. If you dared to show the video playing on both monitors side by side, 99% of people would say the PG35VQ is way more impressive than your C9 or AW3423DW.

 

And the bloom is not noticeable in the videos; I have said this again and again. The pictures and the camera show the same.

Instead, you show numbers and crap measured by limited devices that are not representative. PWM OLED causes fatigue; go search how many people report fatigue on OLED devices. The AW monitor sits on my right side, and my right eye is red at the moment even though most of the time I don't look directly at the monitor. I don't know if safety organizations such as OSHA have a report on OLED, but I have read a report on PWM OLED causing fatigue. It is true.

 

Monitors are going to reach the 10,000-nit standard. Do you even realize we are still in the early stages of HDR, considering how few true HDR devices there are?
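The 10,000-nit figure isn't arbitrary: it's the ceiling of the SMPTE ST 2084 "PQ" transfer function that HDR10 and Dolby Vision are built on. A sketch of the PQ EOTF, using the constants from the spec:

```python
# SMPTE ST 2084 (PQ) EOTF: maps a normalized code value E' in [0, 1]
# to absolute luminance in nits. Its design ceiling is 10,000 nits,
# which is where the "10,000-nit standard" figure comes from.

M1 = 2610 / 16384       # ~0.1593
M2 = 2523 / 4096 * 128  # 78.84375
C1 = 3424 / 4096        # 0.8359375
C2 = 2413 / 4096 * 32   # 18.8515625
C3 = 2392 / 4096 * 32   # 18.6875

def pq_eotf(e):
    """PQ signal (0..1) -> absolute luminance in nits."""
    p = e ** (1 / M2)
    return 10000 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)

print(pq_eotf(1.0))  # 10000.0 nits: full code value
print(pq_eotf(0.0))  # 0.0 nits
```

No consumer display reaches that ceiling today; content mastered toward it gets tone-mapped down to whatever the panel can do.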

Show me your actual picture test results or HDR camera footage; I use a BVM-3000 reference monitor.


29 minutes ago, MonitorFlicker said:

 

Don't even try to fool me. I do HDR work. You are just biased. If you dared to show the video playing on both monitors side by side, 99% of people would say the PG35VQ is way more impressive than your C9 or AW3423DW.

And that's their opinion. As I have said multiple times, a dimmer picture with fewer artifacts looks better to me than a brighter picture with blooming.

 

29 minutes ago, MonitorFlicker said:

And the bloom is not noticeable in the videos; I have said this again and again. The pictures and the camera show the same.

Yet somehow I frequently notice it in movies AND games, be it SDR or HDR. It doesn't really annoy me, but I still notice it, and it takes away from the picture quality. It is the main reason why I prefer the less bright picture of my OLED TV.

 

29 minutes ago, MonitorFlicker said:

Instead, you show numbers and crap that are not representative. PWM OLED causes fatigue.

I argue from my own experience and personal opinion. I've been using the C9 OLED for 2.5 years now, and I've had the PG35VQ for almost a year.

 

I personally don't notice any increase in eye fatigue, and obviously the majority of the market doesn't either. Otherwise it would be a much bigger argument against OLED, which it isn't, because it isn't a problem for most people.

 

29 minutes ago, MonitorFlicker said:

Show me your actual picture test results or HDR camera footage; I use a BVM-3000 reference monitor.

I obviously can't, since I don't have a reference monitor or a camera capable of capturing what I see with my own eyes. I can only argue from what I have tested myself. And since it's clear that the AW3423DW delivers a better HDR experience than practically any OLED TV currently on the market (other than the S95B and soon the A95K), I think it's fair to assume it's likely better than the C9 I have and prefer over the PG35VQ.

About monitor marketing BS

 

CPU: AMD Ryzen 5 5600X - Motherboard: ASUS ROG Strix B550-E - GPU: PNY RTX 3080 XLR8 Epic-X - RAM: 4x8GB (32GB) G.Skill TridentZ RGB 3600MHz CL16 - PSU: Corsair RMx (2018) 850W - Storage: 500 GB Corsair MP600 (Boot) + 2 TB Sabrent Rocket Q (Storage) - Cooling: EK, HW Labs & Alphacool custom loop - Case: Lian-Li PC O11 Dynamic - Fans: 6x Noctua NF-A12x25 chromax - AMP/DAC: FiiO K5 Pro - OS: Windows 10 Pro - Monitor: LG C2 OLED 42" - Mouse: Logitech G Pro - Keyboard: Logitech G915 TKL - Headphones: Beyerdynamic Amiron Home - Microphone: Antlion ModMic


1 hour ago, Andrewtst said:

I'm not sure about the AW3423DW, as that monitor is no longer on my wanted list; there are many cons to it.

 

But I am pretty sure LCD still cannot compete with OLED in HDR movie playback.

 

I own an LG C9, and yes, it doesn't reach 1000 nits, but it still outperforms FALD LCD (mini-LED is also LCD) for movie playback. Brightness is not the only thing that makes HDR look nice.

 

I also own the Samsung Odyssey Neo G9, a 2048-zone FALD monitor that shows 1015 nits in the Windows display settings. It is not certified but is confirmed HDR 1000 capable; I will ignore the marketing gimmick of Quantum HDR 2000. Yes, it also looks nice in HDR with FALD on for movies and gaming, but it's still nowhere near the OLED's HDR output quality for movies, and not for gaming either. I don't use my C9 for gaming, as 55" is way too big for gaming for me, though I have tried it before.

 

The haloing effect is not noticeable in gaming, I agree, but it's still very noticeable when using Windows. I have checked some FALD monitor reviews on YouTube, including this PG35VQ, which is worse than the Samsung Odyssey Neo G9 when FALD is on. FALD is still not good tech for use on the Windows desktop; the issue is that no monitor can automatically switch FALD on and off when the Windows environment is set to HDR. OLED doesn't have this issue, but it has another one: ABL.

 

As for the flickering you mentioned, I haven't seen this issue at all on my OLED TV, and of course not on my FALD monitor either. I'm not sure why the AW3423DW would have it, but sometimes flickering recorded by a camera has no impact on human eyes, as they cannot see it.

 

In short: there is still no perfect PC monitor that is great for everything.

If you had compared true HDR 1000 monitors, you wouldn't say the same.

 

I have used both the PG35VQ and the Neo G9. The PG35VQ in fact delivers higher sustained brightness than the Neo G9. The FALD algorithm is worse on the Neo G9, which is also why bloom is more noticeable on it.

The reviewers out there don't do HDR work, so they can miss things all the time.

Yes, bloom is noticeable in certain low-light scenes, but overall in dark movies the PG35VQ looks the same as the AW3423DW, except for ABL. I just watched Batman, a rather dim movie, on both monitors side by side this week. They look identical.

 

And speaking of comfort, I can look at the PG35VQ all day at the highest brightness settings without any eye fatigue. But I just cannot look at the AW3423DW for over 30 minutes. The AW monitor sits on my right side, and my right eye is red at the moment even though most of the time I don't look directly at it. I don't know if safety organizations such as OSHA have a report on OLED, but I have read a report on PWM OLED causing fatigue. It is true.


3 minutes ago, MonitorFlicker said:

I don't know if safety organizations such as OSHA have a report on OLED, but I have read a report on PWM OLED causing fatigue. It is true.

Again: OLEDs don't use PWM. If they did, the "flicker" would become more noticeable at lower brightness settings, as PWM effectively inserts more dark time between bright periods to lower the brightness.
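The mechanism described here, lower brightness meaning proportionally more off-time, can be sketched as a duty-cycle calculation (the numbers are illustrative, not measurements of any particular panel):

```python
# With classic PWM dimming, perceived (time-averaged) brightness is the
# panel's peak luminance times the duty cycle: more off-time = dimmer,
# and also a more pronounced flicker. Numbers are illustrative.

def average_luminance(peak_nits, duty_cycle):
    """Time-averaged brightness of a PWM-dimmed emitter."""
    assert 0.0 <= duty_cycle <= 1.0
    return peak_nits * duty_cycle

# Quartering the brightness means the pixel is dark 75% of every PWM period:
print(average_luminance(400, 1.0))   # 400.0 -> no off-time at full brightness
print(average_luminance(400, 0.25))  # 100.0 -> dark 75% of each period
```

This is why, on a true-PWM display, the flicker gets objectively worse as you turn the brightness down, which is the test being proposed here.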

 

You're welcome to show me a halfway-current OLED TV that uses PWM dimming; I'll wait.

 

I've never heard anyone else complain about their OLED TV flickering, unless they accidentally enabled BFI.



I own both an X27 and a C1. In rooms where I don't need the 1000-nit peak brightness, the blooming on FALD does affect the picture quality.

 

I've also seen the PG35VQ in person, and plan to buy the AW3423DW either way, since one of my X27s is dead.

 

I've not seen any flickering on the OLED; if it did exist, other reviewers would have picked up on it by now. What does give me eye fatigue, though, is HDR 1000 on full blast (I'll never forget getting flashed in Division 2).

 

As an end user (not a tester), I'd pick OLED over FALD for HDR content, and there's no way I'd pick a PG35VQ over the AW3423DW (now that it's out).

 

If the PG35VQ really were better than the AW3423DW, the latter would be a pointless product, which I highly doubt.

5950x 1.33v 5.05 4.5 88C 195w ll R20 12k ll drp4 ll x570 dark hero ll gskill 4x8gb 3666 14-14-14-32-320-24-2T (zen trfc)  1.45v 45C 1.15v soc ll 6950xt gaming x trio 325w 60C ll samsung 970 500gb nvme os ll sandisk 4tb ssd ll 6x nf12/14 ippc fans ll tt gt10 case ll evga g2 1300w ll w10 pro ll 34GN850B ll AW3423DW

 

9900k 1.36v 5.1avx 4.9ring 85C 195w (daily) 1.02v 4.3ghz 80w 50C R20 temps score=5500 ll D15 ll Z390 taichi ult 1.60 bios ll gskill 4x8gb 14-14-14-30-280-20 ddr3666bdie 1.45v 45C 1.22sa/1.18 io  ll EVGA 30 non90 tie ftw3 1920//10000 0.85v 300w 71C ll  6x nf14 ippc 2000rpm ll 500gb nvme 970 evo ll l sandisk 4tb sata ssd +4tb exssd backup ll 2x 500gb samsung 970 evo raid 0 llCorsair graphite 780T ll EVGA P2 1200w ll w10p ll NEC PA241w ll pa32ucg-k

 

prebuilt 5800 stock ll 2x8gb ddr4 cl17 3466 ll oem 3080 0.85v 1890//10000 290w 74C ll 27gl850b ll pa272w ll w11

 


1 hour ago, Doobeedoo said:

I can definitely notice haloing when using the monitor, and I'm not talking about edge cases like scenes with overall high luminance. There are nowhere near enough zones for it not to be an issue.

These gaming MiniLED monitors are already obscenely expensive, and dual-layer LCDs, pro displays aside, are in their own category. Also, OLED is still used alongside them.

No MiniLED monitor is worth its price, especially now that QD-OLED is coming. We just need to see more models and the tech to improve a bit more. It's already much cheaper than MiniLED, though.

The high peak brightness may not be there, but it can be sustained decently enough; again, this is just the first monitor of its kind. I won't be using the display outside or in a super bright room. Aside from that, it's better in every other way.

The flicker can differ on camera versus in person, depending on whether PWM is used in the kHz range, and this one isn't the same as the OLED TVs or other monitors I've seen. It also may not be an issue for everyone in real use. Same as strobing: some like it, some don't.

The miniLED, microLED FALD is getting cheaper because the manufacturers out there are shifting focus from OLED to local dimming for actual HDR 1000 and more.

 

That is why Sony's premium TV product is the Z series. It is expensive, but not for long.

 

Current OLED and QD-OLED are struggling. The brightness is not enough. And PWM is more severe when the monitor brightness goes relatively higher; not really that high in absolute terms, but high for an OLED. The problem is that when an OLED hits 1000 nits and you use it at a 300-500 nit average, the larger 500-700 nit swings of PWM will cause fatigue much more easily. DC dimming won't have that problem and is safe to use for a long time. And HDR monitors are going to hit 10,000 nits for image quality.

 

I showed a comparison of the latest QD-OLED vs. a 4-year-old 512-zone FALD VA monitor to prove this point.

 


24 minutes ago, xg32 said:

If the PG35VQ really were better than the AW3423DW, the latter would be a pointless product, which I highly doubt.

I use the PG35VQ for HDR and casual gaming. If you don't use much HDR 1000 content and the X27 and C1 work fine for you, then the C2 will be a better option than the AW3423DW, though there is not much improvement in either of them. Samsung's QLED with local dimming could be a better choice in terms of HDR.

If you just play games, then the AW3423DW will probably do. But the eye fatigue caused by PWM is severe for me. The monitor needs low ambient light for a true black level instead of appearing grey. You won't be able to stare at the AW3423DW for a long session.

 

What I really suggest is to wait. The current OLED is not worth it. Time will tell. 


1 hour ago, MonitorFlicker said:

The miniLED, microLED FALD is getting cheaper because the manufacturers out there are shifting focus from OLED to local dimming for actual HDR 1000 and more.

That is why Sony's premium TV product is the Z series. It is expensive, but not for long.

Current OLED and QD-OLED are struggling. The brightness is not enough. And PWM is more severe when the monitor brightness goes relatively higher; not really that high in absolute terms, but high for an OLED. The problem is that when an OLED hits 1000 nits and you use it at a 300-500 nit average, the larger 500-700 nit swings of PWM will cause fatigue much more easily. DC dimming won't have that problem and is safe to use for a long time. And HDR monitors are going to hit 10,000 nits for image quality.

I showed a comparison of the latest QD-OLED vs. a 4-year-old 512-zone FALD VA monitor to prove this point.

 

Nah, MiniLED is still way too expensive with little benefit over regular LCD anyway. MicroLED is a tech for the future. Local dimming and FALD are simply horrible.

No point in LCD anymore; it will stay around for a while, yeah, but for the premium tier QD-OLED is the way to go now.

 

TVs are a different thing from monitors, though, and the latest Sony flagship is QD-OLED anyway.

 

We're talking monitors here, so really we only have this single QD-OLED monitor. Again, like I said before, the PWM on this or some monitors is not the same as on OLED TVs; it really depends. Nobody wants LCD monitors in the high end anymore; they are slow and not the best for HDR.



1 hour ago, MonitorFlicker said:

The miniLED, microLED FALD is getting cheaper because the manufacturers out there are shifting focus from OLED to local dimming for actual HDR 1000 and more.

That is why Sony's premium TV product is the Z series. It is expensive, but not for long.

Current OLED and QD-OLED are struggling. The brightness is not enough. And PWM is more severe when the monitor brightness goes relatively higher; not really that high in absolute terms, but high for an OLED. The problem is that when an OLED hits 1000 nits and you use it at a 300-500 nit average, the larger 500-700 nit swings of PWM will cause fatigue much more easily. DC dimming won't have that problem and is safe to use for a long time. And HDR monitors are going to hit 10,000 nits for image quality.

I showed a comparison of the latest QD-OLED vs. a 4-year-old 512-zone FALD VA monitor to prove this point.

 

😂😂😂 my oh my!

 

"The miniLED, microLED FALD" + "shifting focus from OLED to local dimming for actual HDR 1000" - this makes me laugh out loud!

 

First understand what mini-LED and microLED are: https://medium.com/hd-pro/understanding-oled-qled-mini-led-microled-dont-be-misled-30520b686fcb

 

- Mini-LED is what makes FALD shine, as it is the finest backlight for local dimming.

- MicroLED is the future tech following OLED, which needs no local dimming.

 

"shifting focus from OLED to local dimming for actual HDR 1000" - 🤣🤣🤣

 

 

PC: AMD Ryzen 9 5900X, Gigabyte Geforce RTX 3080 Vision OC 10G, X570 AORUS Elite WIFI Motherboard, HyperX FURY 32GB DDR4-3200 RGB RAM, Creative Sound Blaster AE-9 Sound Card, Samsung 970 Evo Plus M.2 SATA 500GB, ADATA XPG SX8200 Pro M.2 SATA 2TB, Asus HyperX Fury RGB SSD 960GB, Seagate Barracuda 7200RPM 3.5 HDD 2TB, Cooler Master MASTERLIQUID ML240R ARGB, Cooler Master MASTERFAN MF120R ARGB, Cooler Master ELV8 Graphics Card Holder ARGB, Asus ROG Strix 1000G PGU, Lian Li LANCOOL II MESH RGB Case, Windows 11 Pro (22H2).


Laptop: Asus VivoBook 15 OLED: Intel® Core™ i3-1125G4, Intel UHD, 8 GB RAM, Micron NVMe 512 GB, Windows 11 Home (21H2), Illegear Z5 SKYLAKE: Intel Core i7-6700HQ, Nvidia Geforce GTX 970M, 16 GB RAM, ADATA SU800 M.2 SATA 512GB, Windows 11 Pro (22H2).

 

Monitor: Samsung Odyssey G8 Neo 32" 3840x2160 240hz mini-LED VA HDR, Samsung Odyssey G9 Neo 49" 5120x1440 240hz mini-LED VA HDR, LG UltraGear Gaming Monitor 34" 34GN850 3440x1440 144hz (160hz OC) NanoIPS HDR, LG Ultrawide Gaming Monitor 34" 34UC79G 2560x1080 144hz IPS SDR, LG 24MK600 24" 1920x1080 75hz Freesync IPS SDR, BenQ EW2440ZH 24" 1920x1080 75hz VA SDR.


Input Device: Logitech G913 Lightspeed Wireless RGB Mechanical Gaming Keyboard, Logitech G903 Lightspeed HERO Wireless Gaming Mouse, Logitech Pro X, Logitech MX Keys, Logitech MX Master 3, XBOX Wireless Controller Covert Forces Edition, Corsair K70 RAPIDFIRE Mechanical Gaming Keyboard, Corsair Dark Core RGB Pro SE Wireless Gaming Mouse, Logitech MK850 Wireless Keyboard & Mouse Combos.


TV Entertainment: LG 55" C9 OLED HDR Smart UHD TV with AI ThinQ®, 65" Samsung AU7000 4K UHD Smart TV, Nvidia Shield TV Pro (2019 edition), Apple TV 4K (2017 & 2021 Edition), Chromecast with Google TV, Sony UBP-X700 UltraHD Blu-ray, Panasonic DMP-UB400 UltraHD Blu-ray, LG SK9Y 5.1.2 channel Dolby Atmos, Hi-Res Audio SoundBar.

 

Mobile & Smart Watch: Samsung Galaxy S22 Ultra (Burgundy), Samsung Galaxy Watch4 (Green), Huawei Watch GT (Saddle Brown).

 

Others Gadgets: Logitech G560 2.1 USB & Bluetooth Speaker, Logitech Z625 2.1 THX Speaker, Edifier M1370BT 2.1 Bluetooth Speaker, Sony MDR-Z1R, Sony WH-1000XM5, Sony WH-1000XM4, Apple AirPods Pro, Samsung Galaxy Buds2, Asus SBW-06D2X-U Blu-ray RW Drive, 70 TB Ext. HDD, j5create JVCU100 USB HD Webcam with 360° rotation, ZTE UONU F620, Maxis Fibre WiFi 6 Router, Fantech MPR800 Soft Cloth RGB Gaming Mousepad, Fantech Headset Headphone Stand AC3001S RGB Lighting Base Tower, Infiniteracer RGB Gaming Chair


9 minutes ago, Andrewtst said:

 

"The miniLED, microLED FALD" + "shifting focus from OLED to local dimming for actual HDR 1000" - this makes me laugh out loud!

 

First understand what mini-LED and microLED are: https://medium.com/hd-pro/understanding-oled-qled-mini-led-microled-dont-be-misled-30520b686fcb

 

 

First, you'd better not fool yourself about the concept and then make a topic out of it. They are exactly the same idea: dimming zones.

 

10 minutes ago, Andrewtst said:

"shifting focus from OLED to local dimming for actual HDR 1000" - 🤣🤣🤣

 

Mark my words: OLED is not going to catch up in HDR in terms of sustained brightness or the comfort of DC dimming.

 

Laugh all you want in your ignorance, because at the end of the day you are not the one who makes monitors.

 


1 hour ago, Doobeedoo said:

Nah, MiniLED is still way too expensive with little benefit over regular LCD anyway. MicroLED is a tech for the future. Local dimming and FALD are simply horrible.

No point in LCD anymore; it will stay around for a while, yeah, but for the premium tier QD-OLED is the way to go now.

TVs are a different thing from monitors, though, and the latest Sony flagship is QD-OLED anyway.

We're talking monitors here, so really we only have this single QD-OLED monitor. Again, like I said before, the PWM on this or some monitors is not the same as on OLED TVs; it really depends. Nobody wants LCD monitors in the high end anymore; they are slow and not the best for HDR.

Monitors and TVs are going to get brighter, and OLED is facing issues with high brightness.

Next-gen QD-OLED will still be below HDR 1000 because of ABL. And the PWM flickering isn't fixed.

Next-gen LCDs are going for HDR 4000. So in reality, the high end is on DC-dimming LCD, I'm afraid. There is a reason it is expensive, but that won't last over the next five years.

 


8 minutes ago, MonitorFlicker said:

Monitors and TVs are going to get brighter, and OLED is facing issues with high brightness.

Next-gen QD-OLED will still be below HDR 1000 because of ABL. And the PWM flickering isn't fixed.

Next-gen LCDs are going for HDR 4000. So in reality, the high end is on DC-dimming LCD, I'm afraid. There is a reason it is expensive, but that won't last over the next five years.

You do know that the best-of-the-best, brightest MiniLED TVs use PWM dimming, right? Just throwing that out there...



11 minutes ago, Stahlmann said:

You do know that the best-of-the-best, brightest MiniLED TVs use PWM dimming, right? Just throwing that out there...

Don't even try to derail the thread. I never said miniLED doesn't use PWM.

If anything flickers, it won't do well in HDR.

But OLED has brightness issues combined with PWM flickering: two issues instead of one.

Also, the best TV out there is probably still the Sony Z9J. There is a reason Sony's miniLED TVs are under the X series.

Again, I provide solid facts and proof. You are in denial and act like a total loser, arguing from pure imagination without having an AW3423DW in person.

Nothing can help you when you try so hard to fool yourself. Buy that benchmark disc for a good HDR 1000 reference on your PG35VQ, so you don't have to imagine what HDR 1000 looks like.


46 minutes ago, MonitorFlicker said:

Monitors and TVs are going to get brighter, and OLED is facing issues with high brightness.

Next-gen QD-OLED will still be below HDR 1000 because of ABL. And the PWM flickering isn't fixed.

Next-gen LCDs are going for HDR 4000. So in reality, the high end is on DC-dimming LCD, I'm afraid. There is a reason it is expensive, but that won't last over the next five years.

 

Sure they will. Regular OLED in TVs is limited, and QD-OLED is set to improve on that; we just need to see more monitors come out over time. Also, not everything is about super high brightness anyway. Darker blacks are more important really, and even more so the contrast range, which LCD can't match.

Also, ABL is worse on TVs and can be less noticeable on a monitor. There's the lower-nit mode too. The flicker is not the issue some make it out to be; it's a small dip in brightness, not a full-screen drop like on some LED-backlit displays.

Those demoed LCDs with high brightness don't mean much; they're still LCD panels, and again, TVs. We're talking monitors here. Top MiniLED monitors make no sense at such prices, and the prices won't change much. They still have such a small number of LEDs that a 10,000-LED FALD display would cost a fortune and make no sense.



Posted (edited)

Might want to put the pictures in a (couple of) spoiler tag(s) or something. This page is like 5 km long.

< removed by moderation >

If you're going to present "solid facts", then you'll have to be ready to defend them, whether it's on a tech forum or in a scientific paper. Facts are only facts by having survived attempts to poke holes in them. It's nice to see people making these comparisons, but realistically all those pictures show us is a bright image next to a dimmer image, and an even brighter image next to a bright image with a random rainbow you say is a reference in the middle. Which ties in to the next point:

On 6/14/2022 at 5:10 PM, MonitorFlicker said:

I asked you to play that disc of HDR 1000 reference videos. You ignored it. I have the comparison pictures. You ignored them.

Instead, you show numbers and crap that are not representative.

Numbers are the representative thing, and numbers are what make "solid facts". This is why benchmarks are always graphs, tables, histograms, etc. You even argue this yourself by focusing so much on high brightness. It'll be both clearer to communicate and easier to defend if you can find a metric that encodes the relevant information in your photos and present that.

 

That brings me to this: what proof is shown here? You have two sets of pictures with different exposure times and very little explanation. The second paragraph states one is brighter than the other, but the actual specs/reviews of the brightness would already tell us that in quantifiable numbers. It would be informative if you could add some explanation to the pictures, for example. What do they tell us besides relative brightness? The monumental amount of images you have attached makes it feel like they show something important, but it's hard to tell.

 

Are both monitors calibrated? Is this an "out of the box" comparison? If both are properly calibrated to the same amount of nits in a reference scene, shouldn't they more or less look the same, aside from HDR highlights that one can perhaps get brighter than the other? The comparison is again unclear to me. What is "reference" about the central monitor? Is it at reference brightness for us to compare against? Are the rainbow curves reference levels that we should do something with? In the latter case, only a fairly expert reader is going to be able to make sense of those curves, and there's not much explanation to help us out here.

 

 

Regarding flicker, I've been using my LG C9 extensively for two years now, movies, gaming, a bit of productivity here and there, and I must say there is no noticeable flicker unless I enable black frame insertion, in which case it's extremely noticeable to me. Either I'm not that sensitive to it, or I have a good unit if it really is a widespread issue. Regarding brightness, it gets more than bright enough in my living room at night to figuratively burn my eyes out, making me actually squint on bright scenes.

 

Sure, other tech exceeds OLED in terms of brightness, but I'm not sitting here wishing it were capable of 4000 nits. Instead, I miss the deep blacks and the contrast they create on my C9 in all my other monitors. For me, maybe a little boost to brightness to help daylight performance would be nice, but other than that I'm happy with it.

Edited by LogicalDrm

Crystal: CPU: i7 7700K | Motherboard: Asus ROG Strix Z270F | RAM: GSkill 16 GB@3200MHz | GPU: Nvidia GTX 1080 Ti FE | Case: Corsair Crystal 570X (black) | PSU: EVGA Supernova G2 1000W | Monitor: Asus VG248QE 24"

Laptop: Dell XPS 13 9370 | CPU: i5 10510U | RAM: 16 GB

Server: CPU: i5 4690k | RAM: 16 GB | Case: Corsair Graphite 760T White | Storage: 19 TB
