
*Originally posted in wrong sub, sorry. 😒

 

Just wanted to put into words something that’s been bothering me for the past few weeks after upgrading—why plasma still looks better than OLED, even though on paper OLEDs destroy it in contrast, resolution, and color gamut.


Here’s what I discovered: plasma and OLED don’t just differ in specs, they differ at a fundamental light production level. Plasma uses phosphors that emit a broad, continuous spectrum of light—more like a natural incandescent glow than anything modern. That means it outputs many overlapping bands of light across the visible spectrum (OLED has wider color gamut, but plasma produces more bands of light in the visible spectrum). The result is a type of illumination that feels rich, organic, and alive. Colors blend with grace. There’s warmth in every shade, a kind of atmospheric depth that gives the image a soul. Reds smolder, greens breathe, and skin tones feel touched by sunlight rather than lit by electronics. It’s not just color—it’s radiant, expressive light.


OLED, on the other hand, uses highly efficient emitters with narrow spectral output. Each subpixel produces a sharp spike at specific wavelengths—very pure, very intense, but also very narrow. This gives OLED a wider measured color gamut, since those pure spikes can hit extreme points on a CIE chart. But that’s not the same as broader spectral coverage. OLED covers more of the chart, but with less light in between. The result is often clinical. Colors appear isolated, not continuous. Everything is vivid, but it feels artificial—like looking at a very digital rendering instead of looking at something through a window.
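To make the "spiky vs. broadband" idea concrete, here is a toy model (Gaussian emission peaks with made-up widths, not measured SPDs of any display) comparing how much of the visible band each type of emitter actually fills:

```python
import math

# Toy spectra: each primary is a Gaussian emission peak. Widths are made up
# to contrast "spiky" OLED-like emitters with broadband phosphor-like ones;
# these are NOT measured spectral power distributions of any real display.

def spd(wavelength, centers, width):
    """Summed spectral power at one wavelength (arbitrary units)."""
    return sum(math.exp(-((wavelength - c) / width) ** 2) for c in centers)

def coverage(width, threshold=0.1):
    """Fraction of the visible band (380-780 nm) above threshold x peak."""
    wl = range(380, 781)
    values = [spd(w, [450, 540, 620], width) for w in wl]
    peak = max(values)
    return sum(v > threshold * peak for v in values) / len(values)

print(f"narrow emitters (~10 nm): {coverage(10):.2f} of band filled")
print(f"broad emitters (~45 nm):  {coverage(45):.2f} of band filled")
```

In this toy model the narrow emitters light up roughly a quarter of the band while the broad ones fill most of it, which is all the "more bands of light" claim amounts to; it says nothing about measured gamut.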


Then there’s resolution in motion. OLED panels might be 4K on paper, but due to sample-and-hold behavior and the absence of inherent motion clarity, they typically resolve only about 300 to 400 lines during motion. Plasma displays, by contrast, have a true 1080 lines of motion resolution, because the image is continually refreshed with pulsed light rather than held in place. That means plasma retains more detail when the picture moves, without relying on artificial interpolation. In terms of effective resolution while watching real content, SDR plasma actually shows more functional detail than a 4K OLED.
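The sample-and-hold point can be put in rough numbers: with a tracked moving object, perceived smear is roughly tracking speed times the time each frame stays lit. All figures below are illustrative, not measurements of any particular panel:

```python
# Back-of-envelope eye-tracking smear for a sample-and-hold display vs a
# pulsed one. All numbers are illustrative, not panel measurements.

def blur_px(speed_px_per_s, refresh_hz, duty_cycle=1.0):
    """Perceived smear (px) when the eye tracks a moving object.
    duty_cycle = fraction of each frame the pixel is actually lit
    (1.0 = pure sample-and-hold; pulsed drives are much lower)."""
    return speed_px_per_s * duty_cycle / refresh_hz

speed = 960  # a pan at 960 px/s

print(blur_px(speed, 60, 1.0))   # 60 Hz sample-and-hold: 16 px of smear
print(blur_px(speed, 60, 0.2))   # pulsed at 20% duty: 3.2 px
print(blur_px(speed, 240, 1.0))  # or brute-force it with 240 Hz: 4 px
```

This is why a pulsed 60 Hz display can out-resolve a much higher-resolution sample-and-hold panel on moving content: the hold time, not the pixel grid, sets the limit.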


I finally made the upgrade from plasma to OLED (Samsung S90D), and I’m not happy. I’ve done a lot of research to figure out why this is, and now everything clicks. OLED has a lot of advantages on paper, but it really falls short in practicality. In the end, I’m returning it.


TL;DR: OLED wins on the spec sheet—higher static resolution, wider color gamut, deeper blacks. But plasma wins where it matters—color richness, natural light, motion clarity, and emotional impact. OLED is digital. Plasma is cinematic.


https://linustechtips.com/topic/1615144-plasma-is-better-than-oled/


This is an interesting comparison. Some of us never got our hands on plasma for daily use, so we've been stuck with LEDs for a long time. OLED is a breath of fresh air compared to traditional LEDs. Plasma has always looked incredible to me, but HDR content on OLED puts plasma in the rear view mirror. 

 

Why don't you just keep your plasma if you like the way it looks so much? Don't forget, televisions are adjustable. I mean 100% of the time when I get a new screen, I turn down the saturation to help calm down the intensity and give everything a more natural look. I target the reds when I can because that's where a majority of skin tones live.


31 minutes ago, johnt said:

This is an interesting comparison. Some of us never got our hands on plasma for daily use, so we've been stuck with LEDs for a long time. OLED is a breath of fresh air compared to traditional LEDs. Plasma has always looked incredible to me, but HDR content on OLED puts plasma in the rear view mirror. 

 

Why don't you just keep your plasma if you like the way it looks so much? Don't forget, televisions are adjustable. I mean 100% of the time when I get a new screen, I turn down the saturation to help calm down the intensity and give everything a more natural look. I target the reds when I can because that's where a majority of skin tones live.

I know a fair amount about display calibration, and the characteristics I don’t like about OLED cannot be mitigated with calibration because they are inherent to the technology / LED light output. I had my ST60 plasma side by side with my S90D—SDR vs. HDR—and the plasma looked so much better. The OLED can get much brighter, but that doesn’t matter for dark room viewing, and outside of that, it looks cold, digital, and clinical.

 

Directly comparing SDR vs. HDR on Prime, Netflix, etc., I can say with certainty that SDR looks better 95% of the time. These platforms cap brightness at 800 nits—oftentimes less—so you’re not even getting a real HDR experience. As a matter of fact, APL (Average Picture Level) is actually lower in streaming HDR than in SDR, so the image just looks dimmer and flatter in general.

 

All of the primary colors look so much richer on plasma, particularly red. I put up a strawberry screensaver from YouTube on both TVs, and the difference was gigantic—the plasma just destroyed my QD-OLED. When comparing real-world content in a dark room, the difference was huge. OLED gets slaughtered!

 

I’m sure HDR is more impressive on disc, but on streaming platforms, it sucks—just like OLED color rendering, in my opinion. It looks exactly the same as LED with perfect blacks, which makes perfect sense, as it is LED light.


Posted (edited)

I wanted to propose QD-OLED as the better OLED, but you already have QD-OLED.

That's surprising, hard to believe really (I have a 2017 LG OLED and I have never really experienced plasma).

I'm not saying you're wrong though.

Edited by leclod

If you don't reply to us, we won't get notified.


Well, TVs are different out of the box; I've seen many that have an oversaturated image. At the very least I'd always try to use a more realistic mode and calibrate if possible. There are pro OLED monitors and they look proper and awesome. Even monitors, I've found, can look better than most TVs thanks to their extra settings.

As far as motion resolution, CRT and plasma do operate differently from LCD and OLED, which are sample-and-hold tech. However, there is strobe and BFI tech as compensation, and those improve clarity, but they need proper implementation or they're useless. We should also add that modern displays offer higher refresh rates, which improves motion and also shows more information fps-wise in games.

We have some strobed LCD monitors that have very good motion clarity. Also certain OLEDs with BFI, but most lack it at top Hz sadly.
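To sketch why implementation matters: strobing/BFI cuts eye-tracking smear in proportion to the duty cycle, but cuts average brightness by the same factor. Illustrative numbers only:

```python
# Strobing/BFI trade-off sketch: shortening the lit fraction of each frame
# (duty cycle) reduces eye-tracking smear and average brightness by the
# same factor. Illustrative numbers, not measurements of any display.

def bfi_tradeoff(peak_nits, refresh_hz, duty_cycle, speed_px_per_s=960):
    avg_nits = peak_nits * duty_cycle                      # brightness penalty
    smear_px = speed_px_per_s * duty_cycle / refresh_hz    # clarity gain
    return avg_nits, smear_px

for duty in (1.0, 0.5, 0.25):
    nits, smear = bfi_tradeoff(peak_nits=400, refresh_hz=120, duty_cycle=duty)
    print(f"duty {duty:.2f}: {nits:.0f} nits average, {smear:.1f} px smear")
```

A poor implementation (wrong strobe timing, or only offered at low Hz) pays the brightness cost without delivering the clarity, which is the "useless" case.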

| Ryzen 7 7800X3D | AM5 B650 Aorus Elite AX | G.Skill Trident Z5 Neo RGB DDR5 32GB 6000MHz C30 | Sapphire PULSE Radeon RX 7900 XTX | Samsung 990 PRO 1TB with heatsink | Arctic Liquid Freezer II 360 | Seasonic Focus GX-850 | Lian Li Lanccool III | Zowie GTF-X | Mouse: Vaxee XE wired | Keyboard: Ducky One 3 TKL (Cherry MX-Speed-Silver)Beyerdynamic MMX 300 (2nd Gen) | LG 32GS95UV-B OLED 4K 240Hz / 1080p 480Hz dual-mode | OS: Windows 11 |


I haven't seen a plasma for many years, well before OLEDs existed in the consumer market. So I never compared the two side by side. I don't recall the image of it being anything special at the time.

 

The claim on the spectrum I'm not sure holds out. How many colour detectors do we have as humans? The RGB emissions of display technologies aim to match that. So long as we're looking at the screen for light directly, it should be fine. If anything, having a broad spectrum light source displaying 3 colour channels would result in reduced colour separation as they spill over more between each bin. If you wanted to degrade an OLED to give a weaker colour response you could probably do it using a custom shader. It doesn't change the emission characteristics of the display, but it will give a similar experience at eye detection.

 

The time where continuous spectrum vs peaky narrow spectrum does matter if it is used for lighting other things to be observed. That helps against the scenario where the subject and the light source spectrums don't align so colour rendition is off. That doesn't apply to display technologies.

 

In case the differences between the two scenarios above are not clear:

1: Display - eye

2: Light source - subject - eye
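A toy calculation of why scenario 2 is where spectral shape bites: reflect a broadband source and a peaky RGB-like source off an object that only reflects a narrow deep-red band. All curves are made-up Gaussians, not real cone or phosphor data:

```python
import math

# Scenario 2 (light -> object -> eye): the object's reflectance multiplies
# the source spectrum, so a peaky source can simply miss the band the object
# reflects. In scenario 1 (display -> eye) the display drives its three
# channels to hit the target cone responses directly, so this failure mode
# doesn't arise. All curves below are made-up Gaussians, not real data.

def gauss(x, mu, sigma):
    return math.exp(-((x - mu) / sigma) ** 2)

WL = range(380, 781)

def broadband(w):        # incandescent-like hump across the spectrum
    return gauss(w, 550, 120)

def peaky(w):            # three narrow RGB-like spikes
    return gauss(w, 450, 12) + gauss(w, 540, 12) + gauss(w, 610, 12)

def deep_red_object(w):  # reflects only a narrow band near 660 nm
    return gauss(w, 660, 25)

def reflected_power(source):
    return sum(source(w) * deep_red_object(w) for w in WL)

print(reflected_power(broadband))  # substantial energy reaches the eye
print(reflected_power(peaky))      # over an order of magnitude less
```

Under the peaky source the object's colour rendition collapses, exactly the lighting (CRI-style) problem; a display showing a picture of the same object never goes through that multiplication.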

 

There may be other psychovisual effects going on for any person to prefer one thing over another. It isn't just one thing but a combination of all factors.

Gaming system: R7 7800X3D, Asus ROG Strix B650E-F Gaming Wifi, Thermalright Phantom Spirit 120 SE ARGB, Corsair Vengeance 2x 32GB 6000C30, MSI Ventus 3x OC RTX 5070 Ti, MSI MPG A850G, Fractal Design North, Samsung 990 Pro 2TB, Alienware AW3225QF (32" 240 Hz OLED)
Productivity system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, 64GB ram (mixed), RTX 4070 FE, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, iiyama ProLite XU2793QSU-B6 (27" 1440p 100 Hz)
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible


14 hours ago, porina said:

I haven't seen a plasma for many years, well before OLEDs existed in the consumer market. So I never compared the two side by side. I don't recall the image of it being anything special at the time.

 

The claim on the spectrum I'm not sure holds out. How many colour detectors do we have as humans? The RGB emissions of display technologies aim to match that. So long as we're looking at the screen for light directly, it should be fine. If anything, having a broad spectrum light source displaying 3 colour channels would result in reduced colour separation as they spill over more between each bin. If you wanted to degrade an OLED to give a weaker colour response you could probably do it using a custom shader. It doesn't change the emission characteristics of the display, but it will give a similar experience at eye detection.

 

The time where continuous spectrum vs peaky narrow spectrum does matter if it is used for lighting other things to be observed. That helps against the scenario where the subject and the light source spectrums don't align so colour rendition is off. That doesn't apply to display technologies.

 

In case the differences between the two scenarios above are not clear:

1: Display - eye

2: Light source - subject - eye

 

There may be other psychovisual effects going on for any person to prefer one thing over another. It isn't just one thing but a combination of all factors.

Appreciate the thoughtful reply, but a few points here need some clarification, especially regarding human color perception and spectral output...

You're right that human vision is trichromatic—our cones respond to long (red), medium (green), and short (blue) wavelengths. But it's not as simple as “any RGB source is good enough.” The spectral shape of each channel—the actual light distribution within each RGB emitter—has a huge impact on how colors are perceived. It affects not just how we see pure primaries, but how colors blend and how natural or artificial the overall image feels.

 

OLED’s emitters are extremely narrow and spiky in the spectral power domain. That spike-like distribution allows for a wider measured gamut, sure—but the energy between those spikes is lacking. Plasma, using broadband phosphors, emits smoother, fuller curves with overlapping bands of energy across the visible spectrum. This more continuous spectrum doesn't degrade color separation—it enhances realism, because real-world light sources (sunlight, incandescent bulbs) emit similar broadband light. Plasma mimics that behavior, leading to more natural gradients, subtler skin tones, and richer overall tonality—especially in midtones.

 

The argument that continuous-spectrum light only matters when lighting external objects (i.e. real-world lighting scenarios) misses a key nuance: even in direct-view displays, the spectral shape determines how light interacts with our eyes. Displays aren’t just triggering cone responses in isolation—they’re engaging our entire visual system, which includes complex interactions, adaptation mechanisms, and cross-channel blending. That’s why OLED often feels vivid but synthetic, and plasma feels warm and cinematic—even if both are technically “accurate” in a color chart.

 

Also, your point about psychovisual preference is fair—yes, some of this is perceptual and subjective. But not all of it. Motion resolution, for instance, isn’t up for debate. Plasma’s pulsed drive gives it a clean 1080 lines of motion resolution, while OLED's sample-and-hold nature limits it to 300–400 lines without motion interpolation. That’s a measurable, tested difference in effective resolution during real-world content, and it directly affects sharpness during panning shots or fast movement. A 4K OLED looks less detailed in motion than a 1080p plasma.

 

So yes, OLED wins on lab specs—deeper blacks, wider gamut, higher peak brightness. But plasma delivers a more full-bodied, lifelike visual experience thanks to its analog light characteristics, real motion clarity, and broadband color emission. 

 

Another huge difference I couldn’t ignore: plasma absolutely demolishes OLED when it comes to near-black performance and shadow detail. Plasma, with its subtle low-level glow, keeps shadows alive—gradients are near perfect, dark scenes have texture and nuance that OLED just doesn’t preserve regardless of calibration.

 

For me, it all comes down to this: OLED feels synthetic, digital, cold and clinical. It’s technically dazzling but emotionally hollow. Plasma feels like real light—warm, natural, atmospheric, cinematic.

 

That’s the thing no spec sheet can capture... A Ferrari smokes a fully loaded Camry on a spec sheet, but you don’t daily a Ferrari. Point being: superior specs don’t always deliver a superior experience.


14 hours ago, Doobeedoo said:

Well, TVs are different out of the box; I've seen many that have an oversaturated image. At the very least I'd always try to use a more realistic mode and calibrate if possible. There are pro OLED monitors and they look proper and awesome. Even monitors, I've found, can look better than most TVs thanks to their extra settings.

As far as motion resolution, CRT and plasma do operate differently from LCD and OLED, which are sample-and-hold tech. However, there is strobe and BFI tech as compensation, and those improve clarity, but they need proper implementation or they're useless. We should also add that modern displays offer higher refresh rates, which improves motion and also shows more information fps-wise in games.

We have some strobed LCD monitors that have very good motion clarity. Also certain OLEDs with BFI, but most lack it at top Hz sadly.

"We have some strobed LCD monitors that have very good motion clarity".

 

My friend has this old, super rare TV— Sharp Elite—and honestly, it’s one of the most impressive displays I’ve ever seen, even today. It was a joint effort between Sharp and Pioneer, basically a blank-check project built to be the best of the best. They spent two years developing it and it’s the only LED TV that ever won the Value Electronics TV Shootout. That would be like a mini-LED dethroning OLED today—completely unheard of.

 

Back to the point: motion. The Elite has a true 1200 lines of motion resolution without SOE; motion handling is just as good as plasma's. Meanwhile, my brand-new S90D has awful motion handling. It’s wild that they nailed this back in 2011 and somehow we’ve regressed on a lot of newer high-end sets.

 

That said, it wasn’t cheap—over $6,000 for the 60-inch—and the tech was a solid decade-plus ahead of its time. It used a proprietary UV2A panel with a native contrast ratio of 10,000:1 and 300-zone local dimming: insane specs for 2011, and still great by today’s standards. The black levels are insane—honestly right up there with OLED—and it has a thick glass panel just like plasmas... Even today, it puts out one of the most refined images I’ve ever seen.

 

4 hours ago, DeltaE3 said:

Also, your point about psychovisual preference is fair—yes, some of this is perceptual and subjective. But not all of it. Motion resolution, for instance, isn’t up for debate. Plasma’s pulsed drive gives it a clean 1080 lines of motion resolution, while OLED's sample-and-hold nature limits it to 300–400 lines without motion interpolation. That’s a measurable, tested difference in effective resolution during real-world content, and it directly affects sharpness during panning shots or fast movement. A 4K OLED looks less detailed in motion than a 1080p plasma.

Can you provide a source for this measurable and tested difference?


4 hours ago, DeltaE3 said:

"We have some strobed LCD monitors that have very good motion clarity".

 

My friend has this old, super rare TV— Sharp Elite—and honestly, it’s one of the most impressive displays I’ve ever seen, even today. It was a joint effort between Sharp and Pioneer, basically a blank-check project built to be the best of the best. They spent two years developing it and it’s the only LED TV that ever won the Value Electronics TV Shootout. That would be like a mini-LED dethroning OLED today—completely unheard of.

 

Back to the point: motion. The Elite has a true 1200 lines of motion resolution without SOE; motion handling is just as good as plasma's. Meanwhile, my brand-new S90D has awful motion handling. It’s wild that they nailed this back in 2011 and somehow we’ve regressed on a lot of newer high-end sets.

 

That said, it wasn’t cheap—over $6,000 for the 60-inch—and the tech was a solid decade-plus ahead of its time. It used a proprietary UV2A panel with a native contrast ratio of 10,000:1 and 300-zone local dimming: insane specs for 2011, and still great by today’s standards. The black levels are insane—honestly right up there with OLED—and it has a thick glass panel just like plasmas... Even today, it puts out one of the most refined images I’ve ever seen.

 

Yeah not bad for back then, though afaik there were LCD and plasma in that Elite lineup. 

Though TVs aside, today's top monitors are finally catching up to CRT. I'm on a 480Hz OLED and it's very nice in fast games, though there's still more improvement in motion clarity to be done.



5 hours ago, DeltaE3 said:

But it's not as simple as “any RGB source is good enough.” The spectral shape of each channel—the actual light distribution within each RGB emitter—has a huge impact on how colors are perceived. It affects not just how we see pure primaries, but how colors blend and how natural or artificial the overall image feels.

Thinking more on it, the charts I've looked at in the past present an average human response. They don't indicate how much variation there is between humans. Red-green are much closer to each other than blue, which I guess is a contributor to the common red-green colour blindness. So in this sense, I may have overlooked that if there is sufficient variation between individual response, a narrow emission could result in more variance than if it had a broader response.

 

5 hours ago, DeltaE3 said:

Also, your point about psychovisual preference is fair—yes, some of this is perceptual and subjective. But not all of it. Motion resolution, for instance, isn’t up for debate. Plasma’s pulsed drive gives it a clean 1080 lines of motion resolution, while OLED's sample-and-hold nature limits it to 300–400 lines without motion interpolation. That’s a measurable, tested difference in effective resolution during real-world content, and it directly affects sharpness during panning shots or fast movement. A 4K OLED looks less detailed in motion than a 1080p plasma.

I'd appreciate it if you could link to research in this area. I guess a problem for many of us is that many people these days are simply too young for the CRT/plasma era. Even for those like me who did live through that time, it is now a distant memory. I've seen comments from the likes of Digital Foundry and elsewhere talking about the motion response of CRTs, but I can't do a comparison. I gave away my last CRTs a couple years ago to a retro collector. One was an ordinary 25" CRT TV, the other was a 17" Trinitron monitor which was my last CRT display before I switched over to LCDs. Even then, the resolution of both was far lower than what is possible now. The Trinitron I think had a reasonable usable resolution of 1024x768, which matches my first LCD, which I preferred.

 

5 hours ago, DeltaE3 said:

Another huge difference I couldn’t ignore: plasma absolutely demolishes OLED when it comes to near-black performance and shadow detail. Plasma, with its subtle low-level glow, keeps shadows alive—gradients are near perfect, dark scenes have texture and nuance that OLED just doesn’t preserve regardless of calibration.

OLED was hailed as the ultimate for black response as when it was off, it was OFF, unlike backlit LCDs. So a low glow is good again now? I wonder if this is a parallel to audio processing, where intentionally adding noise can improve perceived low level performance by overcoming digital quantisation.
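The audio parallel is dithered quantization, and it's easy to sketch: a level below half a quantization step vanishes when quantized plainly, but survives as an average once noise is added first. Seeded toy example:

```python
import random

# Dither sketch: a steady level below half a quantization step rounds to
# zero every time, but adding noise before quantizing preserves it as an
# average. Analogous to low-level noise keeping near-black gradients alive.

random.seed(1)

def quantize(x):
    return round(x)  # 1.0 = one quantization step

signal = 0.3         # below half a step: always rounds to "off"
n = 100_000

plain    = sum(quantize(signal) for _ in range(n)) / n
dithered = sum(quantize(signal + random.uniform(-0.5, 0.5)) for _ in range(n)) / n

print(plain)      # 0.0 -- the low-level detail is simply gone
print(dithered)   # ~0.3 -- the level survives, encoded in the noise
```

Whether plasma's low-level glow actually acts like dither is an open question here; the sketch only shows why "a little noise beats hard zero" is plausible in principle.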

 

5 hours ago, DeltaE3 said:

For me, it all comes down to this: OLED feels synthetic, digital, cold and clinical. It’s technically dazzling but emotionally hollow. Plasma feels like real light—warm, natural, atmospheric, cinematic.

Out of interest, how do you rate LCD technologies in all this?

 

For you to convince us of your claims, I feel you have to do more of the work. For the claims you make, can you please reference sources to support them? Again, most people will not have access to plasma to see for themselves. I'd also add it doesn't help that your writing is a mix of objective and subjective claims. At times it reads like so-called audiophile speak.

 

5 hours ago, DeltaE3 said:

That’s the thing no spec sheet can capture... A Ferrari smokes a fully loaded Camry on a spec sheet, but you don’t daily a Ferrari. Point being: superior specs don’t always deliver a superior experience.

This is a different problem. Some people will over-focus on a narrow or even singular measure when comparing things, when there are myriad factors. If you sorted by number of cup holders, the Camry might do better than a Ferrari. There's a LOT more to a car than 0-60 times or top speed.



15 hours ago, Doobeedoo said:

not bad for back then, though afaik there were LCD and plasma in that Elite lineup.

What exactly do you mean by that? The Sharp Elite was a one-off model—produced for just a single year—and only a limited number ever made it to market. In terms of performance, it won the entire Value Electronics Shootout, beating the top plasma and LED TVs of its time, which was a massive achievement and hasn't happened since. Nobody expected an LED-based display to outperform plasma back then—just like today, few believe a mini-LED could take the top spot.

 

As for calling it “not bad for back then,” it actually has a better native contrast ratio than any current flagship mini-LED TV—and by a wide margin. Its motion handling (for film content) is superior to any modern display, and it also features build quality leagues ahead of any TV made today: a thick genuine glass panel, a beautiful black brushed aluminum frame, and what I’d argue is still among the best black-level performance of any non-OLED display. That’s despite having only 300 dimming zones—because the panel’s exceptionally high native contrast and flawless local dimming algorithm made it so effective. In fact, it still holds the record for the second-best measured black level ever achieved by an LED-based panel, behind only the new Sony Bravia 9.


15 hours ago, porina said:

Out of interest, how do you rate LCD technologies in all this?

 

I haven’t spent enough time with a wide range of high-end LED panels to form a solid opinion. The only high-end LED display I have real hands-on experience with is my friend’s Sharp Elite, and it’s honestly one of the best displays I’ve ever seen. That said, it’s an ultra-high-end, nearly reference-level set that originally retailed for $6,000, so it’s probably superior to most of what’s available in the current mini-LED market despite its age.

 

Interestingly, I’ve seen quite a few people on forums say that mini-LED actually comes closer to plasma in terms of color reproduction. Here’s an interesting post I came across that touches on that topic: Pixel Structure: Mini-LED vs WOLED vs QD-OLED vs Plasma | AVS Forum

 

I'm considering purchasing the new TCL QM8K as I've read great things about it. 


11 minutes ago, DeltaE3 said:

What exactly do you mean by that? The Sharp Elite was a one-off model—produced for just a single year—and only a limited number ever made it to market. In terms of performance, it won the entire Value Electronics Shootout, beating the top plasma and LED TVs of its time, which was a massive achievement and hasn't happened since. Nobody expected an LED-based display to outperform plasma back then—just like today, few believe a mini-LED could take the top spot.

 

As for calling it “not bad for back then,” it actually has a better native contrast ratio than any current flagship mini-LED TV—and by a wide margin. Its motion handling (for film content) is superior to any modern display, and it also features build quality leagues ahead of any TV made today: a thick genuine glass panel, a beautiful black brushed aluminum frame, and what I’d argue is still among the best black-level performance of any non-OLED display. That’s despite having only 300 dimming zones—because the panel’s exceptionally high native contrast and flawless local dimming algorithm made it so effective. In fact, it still holds the record for the second-best measured black level ever achieved by an LED-based panel, behind only the new Sony Bravia 9.

Oh, so the Sharp Elite brand was revived in partnership with Pioneer for it, though you were mentioning plasma mostly. That TV being LCD, it was good for its time, but also not really at a price people can buy. Today LCDs have evolved in tech, color volume, nits, and zone counts in mini-LED and such. I don't like LCD though; mini-LED just doesn't have enough zones, and OLED destroys it. I mean, today we have pro-grade monitors costing tens of thousands of dollars; they're awesome, better than basically anything, but they cost so much they're not for the regular consumer.

 

Top CRTs and plasmas were quite something; LCDs becoming the norm ruined it for a long time, especially motion clarity, which I value for gaming. 300 zones is rather bad, really. I mean, we have 1000+ zone monitors which are much smaller and they still look bad to me. Modern top tier MiniLED would destroy that one though, of course not in OLED blacks. Not sure how it would compare in black levels to today's non-OLED stuff; there are IPS Black monitors and VA panels which have higher contrast, but yeah. Either way, I stopped caring for LCD myself, not just because of OLED's per-pixel contrast but also its speed.



36 minutes ago, Doobeedoo said:

Modern top tier MiniLED would destroy that one

I can promise you that you're wrong—native contrast ratio and local dimming algorithm are far more important than zone count. I saw the Sharp Elite side-by-side with a TCL QM851, which he bought to replace the Elite. The TCL has over 2,000 zones and performs very similarly to the Bravia 9, yet the Elite absolutely destroyed it in black level performance—I saw it with my own eyes.

 

Modern LED-based panels all dim highlights to some degree to control blooming. The Elite doesn’t do that. Despite having a much lower zone count, it produced better highlight detail, has zero blooming ever and has significantly better all-around black level performance. The Elite’s overall image quality was just in a different class—but that’s what you get when you compare a mass-market, assembly-line TV to a near-reference-level $6,000 display.

 

Using a higher zone count on a panel with lower native contrast ratio doesn’t work as well as fewer zones paired with extremely high native contrast. And nothing made today comes anywhere near the native contrast of the Elite. Yes, the TCL is obviously brighter and it's 4K, but even with those advantages, it still wasn't on par.
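For what it's worth, the native-contrast side of this is simple arithmetic: inside any zone that has to stay lit for a highlight, the "black" floor is the highlight's luminance divided by native contrast, regardless of how many zones the set has. Numbers below are illustrative, not measurements of either TV:

```python
# Inside a dimming zone that must stay lit for a highlight, the darkest
# achievable neighboring pixel is bounded by NATIVE panel contrast,
# regardless of how many zones the set has. Illustrative numbers only.

def black_next_to_highlight(highlight_nits, native_contrast):
    return highlight_nits / native_contrast

highlight = 500  # nits, e.g. a subtitle over a dark scene

print(black_next_to_highlight(highlight, 10_000))  # 10,000:1 panel: 0.05 nits
print(black_next_to_highlight(highlight, 1_500))   # 1,500:1 panel: ~0.33 nits
```

More zones shrink the area that pays this penalty, but within the affected zone the floor is set by native contrast alone.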

 

This was a direct comparison with what is essentially one of the best mini-LED TVs currently available—and it still couldn’t compete. Newer doesn't always equal better... 


8 hours ago, DeltaE3 said:

I can promise you that you're wrong—native contrast ratio and local dimming algorithm are far more important than zone count. I saw the Sharp Elite side-by-side with a TCL QM851, which he bought to replace the Elite. The TCL has over 2,000 zones and performs very similarly to the Bravia 9, yet the Elite absolutely destroyed it in black level performance—I saw it with my own eyes.

 

Modern LED-based panels all dim highlights to some degree to control blooming. The Elite doesn’t do that. Despite having a much lower zone count, it produced better highlight detail, has zero blooming ever and has significantly better all-around black level performance. The Elite’s overall image quality was just in a different class—but that’s what you get when you compare a mass-market, assembly-line TV to a near-reference-level $6,000 display.

 

Using a higher zone count on a panel with lower native contrast ratio doesn’t work as well as fewer zones paired with extremely high native contrast. And nothing made today comes anywhere near the native contrast of the Elite. Yes, the TCL is obviously brighter and it's 4K, but even with those advantages, it still wasn't on par.

 

This was a direct comparison with what is essentially one of the best mini-LED TVs currently available—and it still couldn’t compete. Newer doesn't always equal better... 

Yes, but native contrast is still limited by the tech being LCD anyway, and the same goes for local dimming, zone count, and the total number of LEDs. There are TVs with around 10,000 zones, which allows more granular control. But even the much smaller iPad had about as many zones (proportionally way more than a TV has), and you can still see blooming on it. I'm sure it looks fairly good, though I never liked Mini-LED displays; they were always more expensive and the downsides weren't worth it. So non-OLED will always have worse blacks and contrast. For proper HDR you really need an OLED and a calibrated display.

Not sure what you were comparing, though. Was it HDR content from disc? Some things can look really bad on modern TVs if the content and TV modes are mismatched: crushed blacks, overexposed highlights, and the same goes for color accuracy. The issue is that some TVs don't look their best, or accurate, with default settings.

But like you said, that thing costs way beyond the consumer price range anyway.

 

I mean, there are TVs nowadays with higher native contrast, like VA Mini-LED ones with way more zones and LEDs, so I'm not sure how you compared them. Just saying, there's no way today's basically-same type of tech with better specs can't produce a better image; it's a FALD display after all. Newer doesn't mean better, that depends, and many would argue that: some will say CRT motion clarity is unmatched, which is true to a degree, but top-tier monitors can match and exceed them now. Maybe you like how that TV looks, that's fine, but spec-wise I know I'd easily see blooming in things like starry night scenes with huge contrast differences and fine detail, as it's obvious even on displays with far finer control and more LED zones to work with. Color accuracy may be good, but it can't reach proper HDR levels. I've seen a lot of crap HDR that ruins the meaning of it, same for TVs oversaturated out of the box for the wow factor consumers seem to like versus a properly calibrated image.

| Ryzen 7 7800X3D | AM5 B650 Aorus Elite AX | G.Skill Trident Z5 Neo RGB DDR5 32GB 6000MHz C30 | Sapphire PULSE Radeon RX 7900 XTX | Samsung 990 PRO 1TB with heatsink | Arctic Liquid Freezer II 360 | Seasonic Focus GX-850 | Lian Li Lanccool III | Zowie GTF-X | Mouse: Vaxee XE wired | Keyboard: Ducky One 3 TKL (Cherry MX-Speed-Silver)Beyerdynamic MMX 300 (2nd Gen) | LG 32GS95UV-B OLED 4K 240Hz / 1080p 480Hz dual-mode | OS: Windows 11 |


10 hours ago, Doobeedoo said:

For proper HDR you really need OLED and calibrated display

OLED doesn’t actually deliver true HDR; it’s just the reality of how the technology works. People assume HDR is HDR, but on OLED it’s a watered-down version thanks to major brightness limits and the aggressive Automatic Brightness Limiter. What you end up with is a display that theoretically supports HDR, but can’t sustain the brightness needed to do it justice in real-world content.

 

Here’s a perfect example: the opening of The Revenant. There’s a scene where Hugh Glass escapes the forest and stumbles out onto a wide, blindingly bright snowy landscape. It’s supposed to hit you with contrast—the darkness of the trees behind him, and this overwhelming wall of white sunlight ahead. On OLED, that moment just doesn’t land. The snow looks dull because ABL kicks in the second too much of the screen needs to be bright. The panel can’t sustain full-screen luminance, so instead of being blown away by the light, everything just looks flat and muted.

 

On a flagship Mini-LED display, that same scene actually looks blinding. The snow glows, the shadows hold detail, and the entire image has weight and impact—that’s HDR. OLED chokes under that kind of scene 100% of the time. If more than 5–10% of the screen tries to go bright, it just can't do it. You never see the full dynamic range the content was mastered for; HDR on OLED is not even remotely close to reference-accurate.
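A rough way to picture what ABL does: sustained luminance drops as the bright window grows. The shape and the nit values in this sketch are illustrative assumptions, not measurements of any particular panel.

```python
# Illustrative model of an Automatic Brightness Limiter (ABL).
# peak_nits and full_screen_nits are made-up example values,
# not measurements of any specific OLED panel.

def abl_luminance(window_pct, peak_nits=1000.0, full_screen_nits=200.0):
    """Approximate sustained luminance (nits) for a white window
    covering window_pct percent of the screen area."""
    if window_pct <= 10:
        # Small highlights: the panel can hold near-peak brightness.
        return peak_nits
    # Beyond a ~10% window, power limits force brightness down,
    # rolling off linearly toward the full-screen cap.
    t = (window_pct - 10) / 90
    return peak_nits - t * (peak_nits - full_screen_nits)

for w in (2, 10, 50, 100):
    print(f"{w:3d}% window -> {abl_luminance(w):.0f} nits sustained")
```

The snow scene is the worst case for this curve: nearly the whole frame asks for high luminance, so the panel ends up near the full-screen floor rather than its advertised peak.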

 

This is why I say OLED gives you maybe half the HDR experience. Sure, the blacks are great, but that’s not the whole story. Mini-LED gives you 90% of the black level performance with none of OLED’s compromises: higher sustained brightness, vastly better peak brightness, more stable performance with better motion handling, a wider color gamut than WOLED, and no risk of burn-in. You also avoid the DSE, banding, and panel-aging issues that plague OLED.

 

Sony seems to agree. They’re clearly backing Mini-LED in their high-end lineup, and for good reason.

 

11 hours ago, Doobeedoo said:

I mean there are TVs nowdays with higher native contrast like VA MiniLED ones

You cannot show me one single current flagship mini-LED with a 10000:1 native contrast or even close to it! 

 

TCL and Hisense use panels with pretty good native contrast but still not on the level of the Elite. Sadly, most major brands like Sony and Samsung use cheap low native contrast panels and rely solely on local dimming zone count, and it doesn’t work well. No matter the zone count, LED tech struggles with high contrast scenes, and the more zones you have, the harder they are to precisely control as it takes a lot of CPU power.

 

Higher contrast with fewer zones produces much better results because you have a great baseline black level to work from and easier zone control—i.e., better algorithm. In the Elite’s case, it has 12-bit backlight control with over 4000 steps of dimming, whereas many flagship-level sets like the U8K still use 10-bit backlight control with just over 1000 steps of dimming.
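A small sketch of what the bit-depth figures above imply: finer backlight quantization means each zone can sit closer to its intended level. The step counts follow the 10-bit vs 12-bit numbers quoted above; everything else is illustrative.

```python
# Sketch of what finer backlight quantization buys. The step counts
# (2**10 = 1024, 2**12 = 4096) follow the 10-bit vs 12-bit figures
# mentioned above; everything else is illustrative.

def dimming_step_pct(bits):
    """Smallest backlight increment, as a percent of full output,
    when a zone's level is quantized to 2**bits steps."""
    return 100.0 / (2 ** bits)

for bits in (10, 12):
    steps = 2 ** bits
    print(f"{bits}-bit control: {steps} steps, "
          f"smallest step {dimming_step_pct(bits):.4f}% of full output")
```

In the dark end of the range, where the eye is most sensitive, a four-times-smaller minimum step is the difference between a zone that can hug the target black level and one that visibly jumps.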

 

In the end, it is far more nuanced than “more zones is better.”


7 minutes ago, DeltaE3 said:

OLED doesn’t actually deliver true HDR; it’s just the reality of how the technology works. People assume HDR is HDR, but on OLED it’s a watered-down version thanks to major brightness limits and the aggressive Automatic Brightness Limiter.

It's per-pixel lit, which is really a must for proper HDR. Brightness means nothing without context; yes, extra brightness can help, like on LCD, but LCD lacks per-pixel control. It definitely depends on content: bright content like sunny daytime scenes can benefit more from a bright LCD, whereas OLED does better in the dark on pretty much any high-contrast content. You can disable ABL, I did. My OLED monitor looks very good with all energy-saving features disabled too; it's even brighter than my old LCD.

But it can reach the brightness needed; the thing is, it delivers better contrast than LCD can, and contrast is what matters for HDR. It also beats LCD in dark scenes by a far bigger margin than LCD beats it in bright content.

 

Maybe disable ABL, since it sucks; that scene looks great on OLED and you can still be fairly blinded. There's no point comparing like that at all, the difference with it on and off is huge. There are many examples, but overall I find OLED has way more wins and scenarios where it shines. Anything space-like, dark scenes, or high-contrast dark scenes with neon, like Blade Runner, looks awesome on OLED. HDR isn't just blinding brightness; LCD can't do per-pixel light control like OLED, and without that you can't achieve perfect contrast and fine detail when there are many small contrasting elements on screen. Things are improving, and once OLED brightness catches up there will be no contest.

 

LCD isn't close to reference mastering either, obviously; people don't have reference monitors at home. I wouldn't say Mini-LED gives 90% of the black level performance, not even close: I can still see zones and glow around very bright objects in dark scenes, and it's distracting. Brightness is not an issue unless you're in a very bright room, though. OLED keeps improving in the brightness department, while LCD can't improve much further. Motion handling is not good on most Mini-LED displays, even monitors; LCD blur may only help if you mean low-fps content. Burn-in is almost a non-issue, even though monitors have started to appear that are far more prone to it than TVs. It depends what you're comparing: yeah, WOLED may not have the best color gamut, but it's still solid, and QD-OLED has the widest according to reviews.

 

Sony is offering both, but for large TVs maybe they're seeing that more people use them in bright living rooms, where Mini-LED can produce a better image in those conditions. They have a QD-OLED model as well as the Bravia 9 Mini-LED one. It's a very good TV, but not something I'd use: I can still see haloing and blooming, and I can see it on a monitor with a zone count very close to the smallest model's. They're both flagships, just for different preferences and environments.

That is not native contrast; that's some mix of native contrast and local dimming, or whatever, and different reviewers test this differently. It's a VA panel, so a few thousand to one in general; Mini-LEDs are improving on this, depending on who reviews it. But it's still a VA LCD.

 

The Sharp Elite's contrast is much lower, less than 2,000:1, without local dimming; that's the native panel. These are all LCDs, so they're limited. To me, the number of zones doesn't cut it; it's just not enough in some cases. Of course they use local dimming and keep adding zones, but an LCD is still an LCD. There is IPS Black with better contrast than regular IPS, and better colors than VA, but they use VA because IPS is LG's tech, and VA can have better contrast at the cost of slightly worse colors. They can only improve LCD contrast so much; they'll never increase zone counts dramatically, let alone reach per-pixel control.

 

Both are needed and work together: native contrast determines how black the image can even look, while a higher zone count lets contrast show in granular detail. Yeah, it's not just "more zones is better", but more zones are definitely needed. I've seen enough high-zone-count displays and monitors, and they still suck to me when pushed to show dark backgrounds with very bright moving detail; the freaking mouse pointer glows.

 

Both technologies can only improve so far, though OLED looks like it will keep improving for a while. The next big thing may be MicroLED or something else.



2 hours ago, Doobeedoo said:

You can disable ABL I did so

No offense, but you didn't disable ABL. You disabled ASBL, which only dims the screen after static content persists for a few minutes; disabling ASBL doesn't matter much in any meaningful way. You cannot disable ABL — there's a big difference between the two! *ALL* OLEDs have terrible brightness stability and you can't do anything about it, unfortunately.

 

2 hours ago, Doobeedoo said:

The Sharp Elite contrast is like much lower less than 2000:1 without local dimming.

Even the first-generation UV2A panels from 2009 had a native contrast ratio of 5,000:1 (Sharp to Incorporate UV2A Technology into Production of LCD Panels | Press Releases: SHARP (global.sharp)). The Elite uses second-gen UV2A panels. With local dimming enabled in PC or Pure mode, the Elite reaches around 100,000:1, which still competes with today’s best Mini-LED displays.

 

Even in THX mode, where the backlight doesn't fully turn off, it achieves 40,000:1 which is really damn good. The 2,000:1 figure quoted by Sound & Vision was inaccurate — this set has been measured many times on AVSForum, HTF, and elsewhere. Plus, Sharp’s own spec sheets confirm the numbers.

 

It’s a really unique panel tech, almost a hybrid between VA and IPS, and the viewing angles are unbelievably good for an LED-based TV. Someone even posted a video showing just how wide the angles are, and it’s insane.

 

2 hours ago, Doobeedoo said:

but more zones sure is needed

RTINGS is widely known for having a somewhat unreliable rating system, so you can’t fully trust their scores. However, they do produce excellent blooming test videos and usually get the specs correct. To illustrate my point about how high native contrast combined with fewer dimming zones outperforms low native contrast with many zones, check out their blooming test video. A TV with low native contrast and 720 zones exhibits significantly more blooming compared to the same model with much higher native contrast but far fewer zones.  Samsung QN85C/QN85CD QLED vs Samsung QN85D Side-by-Side TV Comparison - RTINGS.com
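To put those zone counts in perspective, here is a back-of-the-envelope sketch. The 4K resolution and the two zone counts come from the comparison above; spreading pixels evenly across zones is an assumption for illustration.

```python
# Back-of-the-envelope: how many pixels each dimming zone covers on a
# 4K (3840x2160) panel. Zone counts are the ones from the comparison
# above; pixels are assumed to be spread evenly across zones.

def pixels_per_zone(h_px, v_px, zones):
    """Average number of pixels governed by a single dimming zone."""
    return (h_px * v_px) // zones

for zones in (720, 2000):
    print(f"{zones:5d} zones -> ~{pixels_per_zone(3840, 2160, zones):,} "
          f"pixels per zone")
```

Either way, thousands of pixels share one backlight level, which is why the native contrast underneath each zone still matters so much.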

 

2 hours ago, Doobeedoo said:

Next big thing may be MicroLED

I agree and it will make OLED obsolete 


9 hours ago, DeltaE3 said:

No offense, but you didn't disable ABL. You disabled ASBL, which only dims the screen after static content persists for a few minutes; disabling ASBL doesn't matter much in any meaningful way. You cannot disable ABL — there's a big difference between the two! *ALL* OLEDs have terrible brightness stability and you can't do anything about it, unfortunately.

On my LG I have Smart Energy Saving disabled, which disables the luminance compensation algorithm, and the Peak Brightness setting is set to High. I've seen some people disable ABL through the service remote on some TVs, with an asterisk there. Consumer gear isn't reference monitors, after all; those are built differently and can sustain brightness and cool better.
I meant it more for my monitor, where, once configured, I never see dimming in static or dynamic content. So by ABL I meant the general brightness limit, which behaves well on my display in SDR as expected, and in HDR too. So for OLED monitors ABL isn't really a factor.

Then again, it's not all about super high brightness; it depends on content and environment, as we know with the different display technologies today.

9 hours ago, DeltaE3 said:

Even the first-generation UV2A panels from 2009 had a native contrast ratio of 5,000:1 (Sharp to Incorporate UV2A Technology into Production of LCD Panels | Press Releases: SHARP (global.sharp)). The Elite uses second-gen UV2A panels. With local dimming enabled in PC or Pure mode, the Elite reaches around 100,000:1, which still competes with today’s best Mini-LED displays.

 

Even in THX mode, where the backlight doesn't fully turn off, it achieves 40,000:1 which is really damn good. The 2,000:1 figure quoted by Sound & Vision was inaccurate — this set has been measured many times on AVSForum, HTF, and elsewhere. Plus, Sharp’s own spec sheets confirm the numbers.

 

It’s a really unique panel tech, almost a hybrid between VA and IPS, and the viewing angles are unbelievably good for an LED-based TV. Someone even posted a video showing just how wide the angles are, and it’s insane.

 

Yeah, it's just a VA-type panel, which has better contrast than IPS does, and that can matter more in a dark room. You can find today's flagship Mini-LED TVs with local-dimming contrast ratios in the millions to one, though. There are different testing methods behind these numbers, some apparently 10,000:1 for native VA, so yeah, I wouldn't read too much into them, especially since they come from different sources and testing types and can't be compared. Point is, no matter how good it may be, it's still VA tech and still a Mini-LED-style backlight, both known for limitations that can't match IPS color or OLED per-pixel contrast.

 

That looks fine in the video, even though it's hard to tell from a video and how it was recorded. Yes, it's good, but it's not really anything special; we had IPS TVs with much better viewing angles, basically almost not changing at all. I know that from monitors, even.

I wouldn't call an LCD display "LED-based" like the horrible marketing did, confusing people, since it's not a direct-view display like OLED is. It just uses LEDs as the backlight instead of CCFL like before. It's still an LCD display, not an LED display. Shitty marketing.

9 hours ago, DeltaE3 said:

RTINGS is widely known for having a somewhat unreliable rating system, so you can’t fully trust their scores. However, they do produce excellent blooming test videos and usually get the specs correct. To illustrate my point about how high native contrast combined with fewer dimming zones outperforms low native contrast with many zones, check out their blooming test video. A TV with low native contrast and 720 zones exhibits significantly more blooming compared to the same model with much higher native contrast but far fewer zones.  Samsung QN85C/QN85CD QLED vs Samsung QN85D Side-by-Side TV Comparison - RTINGS.com

I always check a few sources. They have been revising their testing methods to be more accurate and better represent differences; they did videos/articles on this. I was mostly interested in monitors when I checked, but yeah. The HDTVTest channel is good, and a few others.

I get what you're saying about contrast, but outside a dark room it may not matter much. Also, from what I've seen comparing VA Mini-LED TVs in general, the one with more zones sure is better. No VA-based TV can reach OLED black, so if one is better, cool, but it's hardly going to be that much better than other same-tier panels.

 

Those are completely different types of LCD tech there: it's IPS vs VA, so of course the contrast will differ, regardless of zones. But the one with fewer zones, the VA, also has worse black and gray uniformity, viewing angles, and reflection handling than the IPS one. It also seems the IPS one scored better in individual reviews. Though I'd say some things are a wash, since some images may contradict the comparison, as different testing methods were used, the VA one using the newer method.

While I don't care for these, they are well over €1,000 and I'd never buy an LCD for that much. The same thing happened in the monitor space, actually worse: when Mini-LED monitors arrived they were obscenely expensive and not even that good. Later they improved, but still cost way too much without justifying it. Now OLED monitor options have finally arrived, they're growing very fast, and at great prices even, which was not expected; they literally killed Mini-LED monitors. Most regular LCDs that cost nearly as much make no sense to buy now, unless it's some niche color-pro monitor or certain strobed ones.

9 hours ago, DeltaE3 said:

I agree and it will make OLED obsolete 

It will make everything that exists obsolete if they improve it and fix everything needed to commercialize it. Otherwise there's QDEL display tech as a successor.



9 hours ago, Doobeedoo said:

flagship MiniLED TVs contrast ratio with local dimming in millions : 1 though

Sorry man, but I’ve been patient and it’s time to just say it — This whole time you’re trying to jump into a conversation you clearly don’t have the knowledge for. I’ve had to correct you multiple times, and you keep coming back with even more ridiculous claims. There are no mini-LEDs hitting “millions to 1” contrast. The absolute best, like the Bravia 9, top out around 500,000:1 — and even then, there are maybe two LED-based panels on the planet that can pull that off.
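For what it's worth, the arithmetic behind any of these ratios is simple: contrast ratio is peak white luminance divided by black luminance, so a tiny measured black level inflates the number enormously. The nit values in this sketch are made up for illustration, not measurements of any set mentioned here.

```python
# Contrast ratio is just peak white luminance divided by black
# luminance, so a near-zero measured black inflates the number
# enormously. The nit values below are made up for illustration.

def contrast_ratio(white_nits, black_nits):
    """Ratio of peak white to black luminance (the X in "X:1")."""
    return white_nits / black_nits

native = contrast_ratio(500, 0.1)     # backlight at a fixed level
dimmed = contrast_ratio(500, 0.001)   # zone driven nearly off
print(f"native: {native:,.0f}:1, with local dimming: {dimmed:,.0f}:1")
```

This is also why dimming-on numbers from different reviewers aren't comparable: the result hinges entirely on how close to zero the black measurement is allowed to get.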

 

9 hours ago, Doobeedoo said:

On my LG I have Smart Energy Saving disabled, which disables the luminance compensation algorithm, and the Peak Brightness setting is set to High. I've seen some people disable ABL through the service remote on some TVs, with an asterisk there. Consumer gear isn't reference monitors, after all; those are built differently and can sustain brightness and cool better.
I meant it more for my monitor, where, once configured, I never see dimming in static or dynamic content. So by ABL I meant the general brightness limit, which behaves well on my display in SDR as expected, and in HDR too. So for OLED monitors ABL isn't really a factor.

Then again, it's not all about super high brightness; it depends on content and environment, as we know with the different display technologies today.

How about just saying you were wrong instead of posting a whole paragraph of nonsense? You didn’t disable ABL, and regardless of whatever settings you used, your OLED has the same awful brightness instability as everyone else’s.

 

9 hours ago, Doobeedoo said:

Those are completely different type of LCD tech there, it's IPS vs VA of course contrast will be different

Jesus Christ dude, more drivel. The whole point was about native contrast, not panel tech — and instead of just admitting you were wrong, you cope by dumping another paragraph of nonsense. Sorry to be a jerk, but it gets seriously annoying when people flat-out refuse to admit they misunderstood something, so they just keep doubling down with more crap. Don’t be that guy — it's not a good way to be. I can't talk to you anymore. 


36 minutes ago, DeltaE3 said:

Sorry man, but I’ve been patient and it’s time to just say it — This whole time you’re trying to jump into a conversation you clearly don’t have the knowledge for. I’ve had to correct you multiple times, and you keep coming back with even more ridiculous claims. There are no mini-LEDs hitting “millions to 1” contrast. The absolute best, like the Bravia 9, top out around 500,000:1 — and even then, there are maybe two LED-based panels on the planet that can pull that off.

Correct me on what? I even mentioned that reading contrast ratio numbers depends heavily on how different sources test them and on what manufacturers state on the spec sheet. There's that TCL QM7K with high contrast. Again, like I said, I never bothered too much with these contrast ratios, since they depend on the type of testing.

 

41 minutes ago, DeltaE3 said:

How about just say you were wrong instead of posting a whole paragraph of nonsense? You didn’t disable ABL — and regardless of whatever settings you used, your OLED has the same awful brightness instability as everyone else’s.

You mention ABL and the snow scene. It should not look dull; obviously it won't be at full-screen peak brightness, but it should still look solid. You're not watching this under the sun, and it's a rare scenario anyway, same as comparing LCD vs OLED in space scenes and such. What brightness instability are you speaking of? Provide an example, maybe. If you mean different brightness capabilities, that's not instability.

You seem to be hyper-focused on brightness, like it's the be-all and end-all of image quality. I'd get it if you watched at an outdoor cinema during the day.

Same as how you claimed plasma is better than OLED: it's just not. You may prefer a certain look it has, same as CRT for specific uses, where it may feel or look better for older content, but that doesn't mean it has better color or accuracy.

1 hour ago, DeltaE3 said:

Jesus Christ dude, more drivel. The whole point was about native contrast, not panel tech — and instead of just admitting you were wrong, you cope by dumping another paragraph of nonsense. Sorry to be a jerk, but it gets seriously annoying when people flat-out refuse to admit they misunderstood something, so they just keep doubling down with more crap. Don’t be that guy — it's not a good way to be. I can't talk to you anymore. 

Hahaha, you literally tried to show me a comparison between two TVs, without even stating they use completely different tech, to demonstrate a contrast difference: two different panel technologies and different zone counts. What's the point of comparing extremes like this when things can differ even between models using the same tech? What was I wrong about, again? You never even said; you just deflect. But whatever.



On 6/15/2025 at 7:44 AM, porina said:

I don't recall the image of it being anything special at the time.

My brother-in-law at the time had a plasma, 720p I believe. I did notice a better picture compared to an LCD of the same resolution back then. But going off memory, and the last time I was at Best Buy, I'd say OLED and Mini-LED probably looked better. Of course, that could also be due to the higher resolution and other improvements made to TVs since plasmas were a thing.

 

I just want to sit back and watch the world burn. 


19 hours ago, Donut417 said:

.

 

I have one sitting in a room unused. It did look better than LCDs side by side, though I don't know the specs of the VA panel I compared it to. There have been some major improvements to regular IPS/VA panels since around 2011, so it could just be nostalgia.

 

I've never thought WOLED color was impressive; black isn't the only color that matters to me, so I was super excited when QD-OLED came out.

5950x 1.33v 5.05 4.5 88C 195w ll R20 12k ll drp4 ll x570 dark hero ll gskill 4x8gb 3666 14-14-14-32-320-24-2T (zen trfc)  1.45v 45C 1.15v soc ll 6950xt gaming x trio 325w 60C ll samsung 970 500gb nvme os ll sandisk 4tb ssd ll 6x nf12/14 ippc fans ll tt gt10 case ll evga g2 1300w ll w10 pro ll 34GN850B ll AW3423DW

 

9900k 1.36v 5.1avx 4.9ring 85C 195w (daily) 1.02v 4.3ghz 80w 50C R20 temps score=5500 ll D15 ll Z390 taichi ult 1.60 bios ll gskill 4x8gb 14-14-14-30-280-20 ddr3666bdie 1.45v 45C 1.22sa/1.18 io  ll EVGA 30 non90 tie ftw3 1920//10000 0.85v 300w 71C ll  6x nf14 ippc 2000rpm ll 500gb nvme 970 evo ll l sandisk 4tb sata ssd +4tb exssd backup ll 2x 500gb samsung 970 evo raid 0 llCorsair graphite 780T ll EVGA P2 1200w ll w10p ll NEC PA241w ll pa32ucg-k

 

prebuilt 5800 stock ll 2x8gb ddr4 cl17 3466 ll oem 3080 0.85v 1890//10000 290w 74C ll 27gl850b ll pa272w ll w11

 


From the reviews I've seen, QD-OLED shows a curve similar to the one on the left. I'm not sure I believe this review. Possibly (I haven't seen a plasma in a very long time), but it's going to lose 100% on resolution, brightness, and burn-in.

On 6/14/2025 at 10:15 PM, DeltaE3 said:

 

d3533e45-00dc-42f1-937b-66c7c1dcd46f.png

 

Every Day: Minisforum AI X1 Pro, 64GB, 4TB 990 PRO 9060xt eGPU; Gaming: AMD 7950x3d / Gigabyte Aurous Master X670E/ 64GB @ 6000c30 / 2 x 4TB Samsung 990 Pro / 44TB Synology 1522+ / MSI Gaming Trio 4090 / EVGA G6 1000w /Thermaltake View71 / LG C1 48in OLED + MSI 321URX

Custom water loop EK Vector AM4, D5 pump, Coolstream 420 radiator

