
Alienware AW3423DW QD-OLED Gaming Monitor

James

Buy an Alienware AW3423DW: https://lmg.gg/czm5C
Buy an LG UltraGear 34GN850-B.AUS: https://geni.us/BeHGRxw
Buy an LG UltraGear 34GP950G-B.AUS: https://geni.us/NHCd50N
Buy an Intel Core i5-12600K: https://geni.us/8EcdRA
Buy an MSI PRO Z690-A WiFi: https://geni.us/4Y4fE7
Buy a Crucial P2 2TB: https://geni.us/zCPdl
Buy a G.Skill 2x16GB 3200MHz Trident Z RGB: https://geni.us/jkuB

 

I hope you didn’t just buy a new monitor, because Dell Alienware’s new QD-OLED AW3423DW is HERE and it’s giving everything else in this price range a serious run for its money. It's high refresh rate, GSYNC HDR, and OLED. Have we achieved a zero-compromises monitor?!

 

 

 


1 minute ago, 10leej said:

It's Alienware, for that fact alone I'll give this a hard pass.

They were so cool. When I were a lad, and was naive. Now I look at them, the price, THE INSIDES OF THE DESKTOPS!

Wow.

And I wonder.

Seems like they are recycling parts throughout their builds like some Borg collective growing over the years.


Not sure, I'd need to see more comparisons. It also looked a bit dull as an "HDR" experience, where the other OLED stood out more, but maybe that's more natural colors versus oversaturation on that OLED? Then again, HDR is very hard to capture and show in a video, and the settings you leave the monitor at can change a lot. I'd still want a brighter display, though. Maybe it wasn't shown much in the video, or maybe smaller areas can get quite bright, but for big explosions I guess mini-LED or micro-LED is still superior.

 

I have my doubts about the claim that "static UI over a long time will not degrade your panel", especially when it's still backed by just 3 years of warranty.

burning money, I guess.

 

Also, why can't these videos lately decide whether they want to be a review or a look at the tech?


28 minutes ago, Quackers101 said:

very hard to capture and to show in a video

Not to mention what display you are watching on.


55 minutes ago, 10leej said:

It's Alienware, for that fact alone I'll give this a hard pass.

Alienware monitors are quite good, as is Dell's warranty. 

CPU-AMD Ryzen 7 7800X3D GPU- RTX 4070 SUPER FE MOBO-ASUS ROG Strix B650E-E Gaming Wifi RAM-32gb G.Skill Trident Z5 Neo DDR5 6000cl30 STORAGE-2x1TB Seagate Firecuda 530 PCIE4 NVME PSU-Corsair RM1000x Shift COOLING-EK-AIO 360mm with 3x Lian Li P28 + 4 Lian Li TL120 (Intake) CASE-Phanteks NV5 MONITORS-ASUS ROG Strix XG27AQ 1440p 170hz+Gigabyte G24F 1080p 180hz PERIPHERALS-Lamzu Maya+ 4k Dongle+LGG Saturn Pro Mousepad+Nk65 Watermelon (Tangerine Switches)+Autonomous ErgoChair+ AUDIO-RODE NTH-100+Schiit Magni Heresy+Motu M2 Interface


33 minutes ago, Quackers101 said:

Not sure, I'd need to see more comparisons. It also looked a bit dull as an "HDR" experience, where the other OLED stood out more, but maybe that's more natural colors versus oversaturation on that OLED? Then again, HDR is very hard to capture and show in a video, and the settings you leave the monitor at can change a lot. I'd still want a brighter display, though. Maybe it wasn't shown much in the video, or maybe smaller areas can get quite bright, but for big explosions I guess mini-LED or micro-LED is still superior.

 

I have my doubts about the claim that "static UI over a long time will not degrade your panel", especially when it's still backed by just 3 years of warranty.

burning money, I guess.

 

Also, why can't these videos lately decide whether they want to be a review or a look at the tech?

I'm taking this with a grain of salt as well. There are a lot of features to try to avoid burn-in, but I don't think it's a complete non-issue quite yet.



1 hour ago, James said:

 

 

I hope you didn’t just buy a new monitor, because Dell Alienware’s new QD-OLED AW3423DW is HERE and it’s giving everything else in this price range a serious run for its money. It's high refresh rate, GSYNC HDR, and OLED. Have we achieved a zero-compromises monitor?!

 

 

 

I ask you... Why... Why was this filmed in such a bright room?

 

I physically sighed at the start when I realized the whole vid was going to be based in that 'sun lounge'.

 

It's good Linus still enjoyed it, but bright rooms like that really don't do OLED justice.

 

Also, from what I've seen from a few other reviewers, the default 'standard' profile is the most color accurate.

 

 

It will be interesting to see what other brands do with the panel. We already know why Samsung hasn't released their own version (a price dispute between Samsung's consumer electronics and panel manufacturing branches).

CPU: Intel i7 3930k w/OC & EK Supremacy EVO Block | Motherboard: Asus P9x79 Pro  | RAM: G.Skill 4x4 1866 CL9 | PSU: Seasonic Platinum 1000w Corsair RM 750w Gold (2021)|

VDU: Panasonic 42" Plasma | GPU: Gigabyte 1080ti Gaming OC & Barrow Block (RIP)...GTX 980ti | Sound: Asus Xonar D2X - Z5500 -FiiO X3K DAP/DAC - ATH-M50S | Case: Phantek Enthoo Primo White |

Storage: Samsung 850 Pro 1TB SSD + WD Blue 1TB SSD | Cooling: XSPC D5 Photon 270 Res & Pump | 2x XSPC AX240 White Rads | NexXxos Monsta 80x240 Rad P/P | NF-A12x25 fans |


I was expecting cheaper QD-OLED options than this, really.

 

QD-OLED has potential; if yields improve over the next 2 years, I expect cheap monitors with this technology to follow.

 

... but realistically I just want a QD-OLED monitor with reduced brightness, because I don't like very bright monitors, I don't want to spend too much on features I don't need, and low brightness also means a longer lifespan.


Would it make more sense to buy a graphics card that can drive 4K before getting a 4K panel, or to get a 4K panel like this before having a graphics card that can drive it? A 1660 Super is what I'm rocking right now.


20 minutes ago, fractle said:

Would it make more sense to buy a graphics card that can drive 4K before getting a 4K panel, or to get a 4K panel like this before having a graphics card that can drive it? A 1660 Super is what I'm rocking right now.

Better to stay on 1080p and upgrade later. As for the GPU, the 1660 Super can be fine for now; wait for the next generation of GPUs and lower GPU prices in general. Intel is also coming around summer or shortly after with their cards.

 

Not a lot of 4K displays hit all the spots one wants.


Just signed up to say that there were a few important points I didn't hear about in this video and had to find another reviewer that covered them:
- 2 HDMI ports, but only HDMI 2.0 (would have been great to hear how limited modern consoles would be hooked up to this, considering a lot of people are weighing this against the C2 42" OLED)

- DisplayPort 1.4 limits you to 144Hz if you want 10-bit color (right?), so to get 175Hz you're sacrificing color depth in your HDR viewing. Would have been great to hear whether that's a legit concern in games/desktop usage.

 

Thanks for the review, just wanted to pass along the feedback to LTT and share my findings with others who are curious.

 


[Screenshot attachment from the video]

This is where he says "I see no noticeable blur". 

 

Probably refresh rate and camera shutter speed stuff; it was just funny when I noticed VERY noticeable blur.


I'm hoping this is finally a monitor to replace my 27" 1440p 144Hz. I bought it in 2015 and I think it was the first one with that whole feature set. The last 7 years have been really shitty, with new monitors not really being any better than just having 2x27" screens. We wasted like 3 years of development on 1080p ultrawide garbage.

Workstation:  13700k @ 5.5Ghz || Gigabyte Z790 Ultra || MSI Gaming Trio 4090 Shunt || TeamGroup DDR5-7800 @ 7000 || Corsair AX1500i@240V || whole-house loop.

LANRig/GuestGamingBox: 9900nonK || Gigabyte Z390 Master || ASUS TUF 3090 650W shunt || Corsair SF600 || CPU+GPU watercooled 280 rad pull only || whole-house loop.

Server Router (Untangle): 13600k @ Stock || ASRock Z690 ITX || All 10Gbe || 2x8GB 3200 || PicoPSU 150W 24pin + AX1200i on CPU|| whole-house loop

Server Compute/Storage: 10850K @ 5.1Ghz || Gigabyte Z490 Ultra || EVGA FTW3 3090 1000W || LSI 9280i-24 port || 4TB Samsung 860 Evo, 5x10TB Seagate Enterprise Raid 6, 4x8TB Seagate Archive Backup ||  whole-house loop.

Laptop: HP Elitebook 840 G8 (Intel 1185G7) + 3080Ti Thunderbolt Dock, Razer Blade Stealth 13" 2017 (Intel 8550U)


3 hours ago, fractle said:

Would it make more sense to buy a graphics card that can drive 4K before getting a 4K panel, or to get a 4K panel like this before having a graphics card that can drive it?

As someone who upgraded to a 3440x1440 144Hz display from a 2560x1080 75Hz with a 1080 Ti, not having a GPU to drive it at its native refresh rate is a pretty miserable experience.

 

I went from maxing out every game and staying vsync-capped, even running ReShade in numerous titles, to suffering with framerates well below native refresh and not being able to use ReShade anymore.

 

Adaptive sync helps and is very much required to keep the image looking smooth, but once you get used to HFR, you really don't want to go lower. 60FPS looks like how 30FPS did before I got used to 120Hz. Games start to feel a bit jittery when I drop below a G-Synced 90Hz, and a 1080 Ti can't run 3440x1440 anywhere close to 144Hz in modern games, so it's always a battle between IQ and framerate. For example, I refused to play Horizon Zero Dawn because the FPS was dropping below 60 at med-high settings, and with an enthusiast-level card you really don't want to deal with that.

 

Also, as an aside, adaptive sync does not work in windowed mode on a G-Sync-compatible display despite what NVCP may say. The game has to be borderless and no other window can be rendering over it (e.g. on-top overlays). So if you wanted to run a game at a lower windowed resolution to get a better framerate, it might even be a worse experience than fullscreen at a lower framerate, just because of how bad non-adaptive-sync judder is. Not the case with this specific monitor though.

 

Until the GPU market stabilizes, I do not recommend jumping up your display... unless you somehow avoid becoming a framerate snob. (and if so, I envy you!)


4 hours ago, DANK_AS_gay said:

This is where he says "I see no noticeable blur". 

 

Probably refresh rate and camera shutter speed stuff; it was just funny when I noticed VERY noticeable blur.

It's more like OLED ghosting, which is still a limitation of OLED displays; it just means there's still some motion blur. Not all OLED displays are created equal, obviously.

I gathered some Blur Busters links on that topic:
https://blurbusters.com/faq/oled-motion-blur
https://forums.blurbusters.com/viewtopic.php?t=6780


Alienware has been going downhill over the years. I'll pass because I had a Dell computer a long time ago and they wanted an arm and a leg to repair it.


14 hours ago, SolarNova said:

I ask you... Why... Why was this filmed in such a bright room?

 

I physically sighed at the start when I realized the whole vid was going to be based in that 'sun lounge'.

 

It's good Linus still enjoyed it, but bright rooms like that really don't do OLED justice.

 

Also, from what I've seen from a few other reviewers, the default 'standard' profile is the most color accurate.

In fact, the standard picture profile is the only one that is factory calibrated. Credible reviews confirmed it's calibrated to below delta-E 2.

 

14 hours ago, SolarNova said:

It will be interesting to see what other brands do with the panel. We already know why Samsung hasn't released their own version (a price dispute between Samsung's consumer electronics and panel manufacturing branches).

People somehow think Samsung Electronics and Samsung Display are the same company. They're not.

 

14 hours ago, Herrscher of Whatever said:

I was expecting cheaper QD-OLED options than this, really.

Tbh that's just unrealistic. You're looking at cutting-edge technology. I fully expected this exact monitor to be $3000+, so I was pleasantly surprised that the MSRP is $1300. It's a good deal for what it offers imo.

 

14 hours ago, Herrscher of Whatever said:

... but realistically I just want a QD-OLED monitor with reduced brightness, because I don't like very bright monitors, I don't want to spend too much on features I don't need, and low brightness also means a longer lifespan.

You are free to just lower the brightness yourself like with any other monitor.

 

13 hours ago, madsauce said:

- 2 HDMI ports, but only HDMI 2.0 (would have been great to hear how limited modern consoles would be hooked up to this, considering a lot of people are weighing this against the C2 42" OLED)

HDMI 2.0 is not a limitation when it comes to console use. The XBSX will be able to do 2560x1440 120Hz, and the PS5 will do 1920x1080 120Hz. As consoles don't support 21:9, it doesn't make sense to include HDMI 2.1 for that. And for PC you should use DP either way.
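For what it's worth, this claim checks out on paper. Here's a rough back-of-the-envelope estimate (my own assumed numbers: RGB/4:4:4 output, ~10% blanking overhead, and HDMI 2.0's ~14.4 Gbit/s payload after 8b/10b encoding; real video timings vary):

```python
def needed_gbps(width, height, refresh_hz, bpc, blanking=1.10):
    """Approximate uncompressed RGB video bandwidth in Gbit/s.
    The 10% blanking factor is an assumption, not an exact timing."""
    return width * height * refresh_hz * bpc * 3 * blanking / 1e9

HDMI20_PAYLOAD = 14.4  # 18 Gbit/s TMDS minus 8b/10b encoding overhead

# XBSX-style 2560x1440 120Hz 8bpc fits comfortably into HDMI 2.0...
print(needed_gbps(2560, 1440, 120, 8))   # ~11.7 Gbit/s
# ...while 4K 120Hz would not, which is why TVs needed HDMI 2.1.
print(needed_gbps(3840, 2160, 120, 8))   # ~26.3 Gbit/s
```

So the only console scenario that would have justified HDMI 2.1 is 4K output, which this 3440x1440 panel can't display natively anyway.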

 

13 hours ago, madsauce said:

- DisplayPort 1.4 limits you to 144Hz if you want 10-bit color (right?), so to get 175Hz you're sacrificing color depth in your HDR viewing. Would have been great to hear whether that's a legit concern in games/desktop usage.

From my past experience this isn't a big deal. My current PG35VQ has a similar problem: I can choose between 8bpc at 200Hz or 10bpc at 144Hz. Realistically I cannot see a difference between 8bpc and 10bpc, and I also don't see a big difference between 144Hz and 200Hz. You can go either way and be happy with the results. It's still a bit sad that they included such an unnecessary compromise in a high-end display. In the end, games RARELY run above 144 fps at this resolution, so 144Hz 10bpc is probably the go-to option.
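The bandwidth math behind that 144Hz-vs-175Hz tradeoff can be sketched in a few lines. These are back-of-the-envelope figures (RGB/4:4:4 output, an assumed ~10% blanking overhead, DP 1.4 HBR3's 25.92 Gbit/s payload after 8b/10b encoding, no DSC), not exact video timings:

```python
def needed_gbps(width, height, refresh_hz, bpc, blanking=1.10):
    """Approximate uncompressed RGB video bandwidth in Gbit/s.
    The 10% blanking factor is an assumption; real CVT-RB timings vary."""
    return width * height * refresh_hz * bpc * 3 * blanking / 1e9

DP14_PAYLOAD = 25.92  # HBR3: 32.4 Gbit/s raw minus 8b/10b encoding overhead

for hz, bpc in [(144, 10), (175, 10), (175, 8)]:
    need = needed_gbps(3440, 1440, hz, bpc)
    fits = "fits" if need <= DP14_PAYLOAD else "exceeds DP 1.4"
    print(f"3440x1440 @ {hz} Hz {bpc}bpc: {need:.1f} Gbit/s ({fits})")
```

Under these assumptions, 144Hz at 10bpc lands around 23.5 Gbit/s (fits), while 175Hz at 10bpc needs roughly 28.6 Gbit/s, which is why the monitor drops to 8bpc at 175Hz.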

 

13 hours ago, DANK_AS_gay said:

[Screenshot attachment from the video]

This is where he says "I see no noticeable blur". 

 

Probably refresh rate and camera shutter speed stuff; it was just funny when I noticed VERY noticeable blur.

Judging ghosting by looking at gameplay in a 30 fps video is useless. Look for other reviews and you'll see that motion clarity is as good as it gets for sample and hold displays.

 

9 hours ago, Herrscher of Whatever said:

It's more like OLED ghosting, which is still a limitation of OLED displays; it just means there's still some motion blur. Not all OLED displays are created equal, obviously.

OLED is limited by being a "sample and hold" display. The only way to further improve motion clarity would be to introduce BFI, which wouldn't even be used all that much on an HDR- and image-quality-focused product like this.

 

All in all, motion clarity looks superb from what I've seen so far. As good as or even better than typical 120Hz OLED TVs.


 

 

 

Some feedback about the review itself:

 

First off, why is the only person I know of at LTT who knows about displays completely missing on set? @James

Isn't such a huge team of writers with different backgrounds supposed to ensure that you have the right person for the job?

After looking at the video credits it seems he wasn't even involved in the slightest.

 

Also, why are you filming in such a sunbathed room? It's the best way to counteract all the advantages HDR brings. For a video that focuses on subjective opinion, going for a worst-case scenario is a bad approach imo. And if it was unintentional, it just further adds to the problem of assigning the wrong person to plan out this video.

 

In my personal opinion you should focus on providing more data about the product you're reviewing to actually prove whether it's good or bad, not on talking about the technology itself and its THEORETICAL advantages for half of the review.

If someone did not use reason to reach their conclusion in the first place, you cannot use reason to convince them otherwise.


5 hours ago, Stahlmann said:

Judging ghosting by looking at gameplay in a 30 fps video is useless. Look for other reviews and you'll see that motion clarity is as good as it gets for sample and hold displays

Probably Refresh rate and Camera shutter speed stuff, it was just funny when I noticed VERY noticeable blur. 

When I said that, I was acknowledging that the difference in video frame rate and monitor refresh rate might cause the blurring I was seeing. I literally said exactly what you said, yet you decided to get on your high horse and argue? Seriously?


17 hours ago, madsauce said:

Just signed up to say that there were a few important points I didn't hear about in this video and had to find another reviewer that covered them:
- 2 HDMI ports, but only HDMI 2.0 (would have been great to hear how limited modern consoles would be hooked up to this, considering a lot of people are weighing this against the C2 42" OLED)

- DisplayPort 1.4 limits you to 144Hz if you want 10-bit color (right?), so to get 175Hz you're sacrificing color depth in your HDR viewing. Would have been great to hear whether that's a legit concern in games/desktop usage.

 

Thanks for the review, just wanted to pass along the feedback to LTT and share my findings with others who are curious.

 

When is the next DisplayPort standard coming out? I might wait to adopt QD-OLED until then. I want 10-bit, 1440p, 240Hz.



17 hours ago, AnonymousGuy said:

I'm hoping this is finally a monitor to replace my 27" 1440p 144Hz. I bought it in 2015 and I think it was the first one with that whole feature set. The last 7 years have been really shitty, with new monitors not really being any better than just having 2x27" screens. We wasted like 3 years of development on 1080p ultrawide garbage.

1080p ultrawide is SO bad.



27 minutes ago, DANK_AS_gay said:

I literally said exactly what you said, yet you decided to get on your high horse and argue? Seriously?

I was just clarifying. If that is enough to offend you, that's your problem.

15 minutes ago, Ryan829 said:

When is the next DisplayPort standard coming out? I might wait to adopt QD-OLED until then. I want 10-bit, 1440p, 240Hz.

I don't expect it anytime soon. Current DisplayPort 1.4 can even do 4K 240Hz 10bpc HDR; it just needs DSC, which this monitor doesn't implement because they used an older G-Sync module that is not compatible with DSC. As long as displays can be fed through a DP 1.4 connection, 2.0 won't be implemented, just as HDMI 2.1 was nowhere to be seen and only showed up once 4K 120Hz became a thing in TVs. They only started to implement it once they NEEDED it.

 

Intel's ARC Alchemist GPUs seem to have DP 2.0, but again there are no monitors on the horizon that use this connector.
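To put a rough number on how far DSC stretches DP 1.4: under the same kind of back-of-the-envelope assumptions as any quick bandwidth estimate (RGB/4:4:4 output, an assumed ~10% blanking overhead, 25.92 Gbit/s HBR3 payload after 8b/10b encoding), 4K 240Hz at 10bpc needs only about a 2.5:1 compression ratio, comfortably inside DSC's nominal up-to-3:1 visually-lossless range:

```python
def dsc_ratio_needed(width, height, refresh_hz, bpc,
                     link_gbps=25.92, blanking=1.10):
    """Minimum compression ratio to fit a mode into a DP 1.4 HBR3 link.
    The blanking overhead and payload figures are approximations."""
    uncompressed = width * height * refresh_hz * bpc * 3 * blanking / 1e9
    return uncompressed / link_gbps

ratio = dsc_ratio_needed(3840, 2160, 240, 10)
print(f"4K 240Hz 10bpc needs ~{ratio:.1f}:1 compression (DSC allows up to 3:1)")
```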



3 minutes ago, Stahlmann said:

I was just clarifying.

Clarifying what exactly? What did you say that someone else didn't, or I didn't say?


24 minutes ago, DANK_AS_gay said:

Clarifying what exactly? What did you say that someone else didn't, or I didn't say?

For your reference, I have compiled a list of words, in order of appearance, that @Stahlmann said that you did not say.

Spoiler

1.       Judging

2.       ghosting

3.       by

4.       looking

5.       at

6.       gameplay

7.       in

8.       a

9.       30

10.   fps

11.   video

12.   useless

13.   Look

14.   for

15.   other

16.   reviews

17.   you'll

18.   that

19.   motion

20.   clarity

21.   as

22.   good

23.   gets

24.   sample

25.   hold

26.   displays

 

BabyBlu (Primary): 

  • CPU: Intel Core i9 9900K @ up to 5.3GHz, 5.0GHz all-core, delidded
  • Motherboard: Asus Maximus XI Hero
  • RAM: G.Skill Trident Z RGB 4x8GB DDR4-3200 @ 4000MHz 16-18-18-34
  • GPU: MSI RTX 2080 Sea Hawk EK X, 2070MHz core, 8000MHz mem
  • Case: Phanteks Evolv X
  • Storage: XPG SX8200 Pro 2TB, 3x ADATASU800 1TB (RAID 0), Samsung 970 EVO Plus 500GB
  • PSU: Corsair HX1000i
  • Display: MSI MPG341CQR 34" 3440x1440 144Hz Freesync, Dell S2417DG 24" 2560x1440 165Hz Gsync
  • Cooling: Custom water loop (CPU & GPU), Radiators: 1x140mm(Back), 1x280mm(Top), 1x420mm(Front)
  • Keyboard: Corsair Strafe RGB (Cherry MX Brown)
  • Mouse: MasterMouse MM710
  • Headset: Corsair Void Pro RGB
  • OS: Windows 10 Pro

Roxanne (Wife Build):

  • CPU: Intel Core i7 4790K @ up to 5.0GHz, 4.8Ghz all-core, relidded w/ LM
  • Motherboard: Asus Z97A
  • RAM: G.Skill Sniper 4x8GB DDR3-2400 @ 10-12-12-24
  • GPU: EVGA GTX 1080 FTW2 w/ LM
  • Case: Corsair Vengeance C70, w/ Custom Side-Panel Window
  • Storage: Samsung 850 EVO 250GB, Samsung 860 EVO 1TB, Silicon Power A80 2TB NVME
  • PSU: Corsair AX760
  • Display: Samsung C27JG56 27" 2560x1440 144Hz Freesync
  • Cooling: Corsair H115i RGB
  • Keyboard: GMMK TKL(Kailh Box White)
  • Mouse: Glorious Model O-
  • Headset: SteelSeries Arctis 7
  • OS: Windows 10 Pro

BigBox (HTPC):

  • CPU: Ryzen 5800X3D
  • Motherboard: Gigabyte B550i Aorus Pro AX
  • RAM: Corsair Vengeance LPX 2x8GB DDR4-3600 @ 3600MHz 14-14-14-28
  • GPU: MSI RTX 3080 Ventus 3X Plus OC, de-shrouded, LM TIM, replaced mem therm pads
  • Case: Fractal Design Node 202
  • Storage: SP A80 1TB, WD Black SN770 2TB
  • PSU: Corsair SF600 Gold w/ NF-A9x14
  • Display: Samsung QN90A 65" (QLED, 4K, 120Hz, HDR, VRR)
  • Cooling: Thermalright AXP-100 Copper w/ NF-A12x15
  • Keyboard/Mouse: Rii i4
  • Controllers: 4X Xbox One & 2X N64 (with USB)
  • Sound: Denon AVR S760H with 5.1.2 Atmos setup.
  • OS: Windows 10 Pro

Harmonic (NAS/Game/Plex/Other Server):

  • CPU: Intel Core i7 6700
  • Motherboard: ASRock FATAL1TY H270M
  • RAM: 64GB DDR4-2133
  • GPU: Intel HD Graphics 530
  • Case: Fractal Design Define 7
  • HDD: 3X Seagate Exos X16 14TB in RAID 5
  • SSD: Inland Premium 512GB NVME, Sabrent 1TB NVME
  • Optical: BDXL WH14NS40 flashed to WH16NS60
  • PSU: Corsair CX450
  • Display: None
  • Cooling: Noctua NH-U14S
  • Keyboard/Mouse: None
  • OS: Windows 10 Pro

NAS:

  • Synology DS216J
  • 2x8TB WD Red NAS HDDs in RAID 1. 8TB usable space

I considered buying one based on this video, but this German review essentially calls the panel technology garbage. Apparently edges with strong contrast show color fringing. It's such a dealbreaker for them that they barely bother to talk about the monitor itself.

https://www.heise.de/hintergrund/Monitor-mit-QD-OLED-im-Test-Samsungs-neue-Display-Technik-ist-verkorkst-6582133.html

 

