Can someone explain why a dual-layer display, with monochrome OLED pixels as the backlight at 1080p, and LCD 4K as the color, isn't a thing?

I Just Want Ram Drives

I would guess it's because you don't get enough brightness levels out of OLED, and you'd burn up the OLED panel quickly when using it for hours at full white.

Also, I'm not sure an OLED panel emits the brightness a 4K LCD panel would need... LCD panels are far from transparent, after all.

 

I would like to see panels with an e-ink layer as the base and a backlight on the edges of the LCD panel... you could make the e-ink layer fully white, reflecting all light back through the LCD, when gaming, and when you want to read books or don't mind grayscale, you could make the LCD transparent, display text on the e-ink, and turn the backlight off or dim it.

E-ink needs no refresh, so in that mode the monitor would consume basically no power.

It's complicated, because LCD panels have a lot of specialized plastic sheets/layers on the bottom to diffuse the edge-lit backlight and reflect it uniformly back towards your eyes; the functionality of those layers would have to be incorporated into the e-ink layer, or they'd have to be made transparent as needed.

 

 


1 minute ago, mariushm said:

I would guess it's because you don't get enough brightness levels out of OLED, and you'd burn up the OLED panel quickly when using it for hours at full white.

Also, I'm not sure an OLED panel emits the brightness a 4K LCD panel would need... LCD panels are far from transparent, after all.

Darn.


Hisense has a similar solution in their flagship LCD TVs that essentially uses two layers of LCD: a standard VA panel plus a lower-resolution "dimming layer". It works really well to improve contrast, and is about the best LCD can do, but it has its own downsides, like horrible response times.

 

Samsung's QD-OLED works in a quite similar way to what you described. They only use a single blue OLED diode for each pixel and then use the Quantum Dots to filter the blue light and make red and green appear on screen. Blue OLED pixels are the ones that degrade the least, so this will certainly be a step in the anti burn-in direction.

If someone did not use reason to reach their conclusion in the first place, you cannot use reason to convince them otherwise.


5 hours ago, Stahlmann said:

They only use a single blue OLED diode for each pixel and then use the Quantum Dots to filter the blue light and make red and green appear on screen.

Quantum dots are used here as color converters, not filters; hence the minimal energy (brightness) loss.
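To put rough numbers on the converter-vs-filter distinction, here's a back-of-the-envelope Python sketch; the wavelengths are typical ballpark values I'm assuming, not specs from any actual QD-OLED panel:

```python
# Back-of-the-envelope sketch of why down-conversion beats filtering.
# Wavelengths are illustrative assumptions (~455 nm blue, ~530 nm green, ~630 nm red).

def stokes_efficiency(pump_nm: float, emit_nm: float) -> float:
    """Max fraction of energy retained when one pump photon is converted
    into one emitted photon: photon energy E = hc/lambda, so the ratio
    is pump_wavelength / emit_wavelength."""
    return pump_nm / emit_nm

blue, green, red = 455.0, 530.0, 630.0

# Quantum-dot conversion: nearly every blue photon becomes one green/red
# photon, losing only the wavelength (Stokes) gap as heat.
print(f"blue->green conversion keeps ~{stokes_efficiency(blue, green):.0%} of the energy")
print(f"blue->red  conversion keeps ~{stokes_efficiency(blue, red):.0%} of the energy")

# A conventional absorptive color filter over a white backlight instead
# throws away the other two-thirds of the spectrum, so each subpixel keeps
# roughly 1/3 of the light even before other losses.
print(f"ideal absorptive filter keeps ~{1/3:.0%}")
```

So even with the Stokes loss, conversion keeps most of the blue pump's energy, while filtering white light discards most of it up front.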

5 hours ago, Stahlmann said:

Blue OLED pixels are the ones that degrade the least, so this will certainly be a step in the anti burn-in direction.

Typically, at the same brightness level, organic diodes that emit higher-frequency (shorter-wavelength) light have a shorter lifespan than diodes emitting lower-frequency (longer-wavelength) light.

 

In other words, blue OLEDs degrade faster than green ones. And green OLEDs degrade faster than red ones. Same brightness level.


7 hours ago, I Just Want Ram Drives said:

Just as the title says. What are the practical limitations?

For one, dual-layer LCDs already exist. Hisense makes them.

Secondly, there's no real benefit to using an OLED backlight layer over an LCD in that respect. The OLED layer isn't as bright as existing LCD backlights, and its pixel transition speed wouldn't help, since the color LCD layer above it would be the slowest link at that point.

 

The upcoming Samsung QD-OLED is something you could argue is similar to what you're asking for, except that it's purely OLED.

A single-color OLED layer (blue) is used as the 'backlight', and a QD matrix layer is used to filter that light into RGB.

The QD layer, as far as I'm aware, is purely passive and not an active LCD layer; as such, it's the OLED layer that changes the intensity of the light output for each subpixel, thus controlling color and of course giving normal OLED pixel responsiveness.

If the QD layer were an active LCD matrix, that would adversely affect pixel response speed, removing one of the biggest strengths of OLED technology.

CPU: Intel i7 3930k w/OC & EK Supremacy EVO Block | Motherboard: Asus P9x79 Pro  | RAM: G.Skill 4x4 1866 CL9 | PSU: Seasonic Platinum 1000w Corsair RM 750w Gold (2021)|

VDU: Panasonic 42" Plasma | GPU: Gigabyte 1080ti Gaming OC & Barrow Block (RIP)...GTX 980ti | Sound: Asus Xonar D2X - Z5500 -FiiO X3K DAP/DAC - ATH-M50S | Case: Phantek Enthoo Primo White |

Storage: Samsung 850 Pro 1TB SSD + WD Blue 1TB SSD | Cooling: XSPC D5 Photon 270 Res & Pump | 2x XSPC AX240 White Rads | NexXxos Monsta 80x240 Rad P/P | NF-A12x25 fans |


5 hours ago, tarsius said:

Typically, at the same brightness level, organic diodes that emit higher-frequency (shorter-wavelength) light have a shorter lifespan than diodes emitting lower-frequency (longer-wavelength) light.

 

In other words, blue OLEDs degrade faster than green ones. And green OLEDs degrade faster than red ones. Same brightness level.

It was the red subpixels that wore out fastest on OLEDs, though, which is why they were made bigger. RTINGS also found in their burn-in test that the order of wear was red-blue-green. I guess they have to push the red pixels rather hard to make up for their lower brightness.

Crystal: CPU: i7 7700K | Motherboard: Asus ROG Strix Z270F | RAM: GSkill 16 GB@3200MHz | GPU: Nvidia GTX 1080 Ti FE | Case: Corsair Crystal 570X (black) | PSU: EVGA Supernova G2 1000W | Monitor: Asus VG248QE 24"

Laptop: Dell XPS 13 9370 | CPU: i5 10510U | RAM: 16 GB

Server: CPU: i5 4690k | RAM: 16 GB | Case: Corsair Graphite 760T White | Storage: 19 TB


6 hours ago, tikker said:

It was the red subpixels that wore out fastest on OLEDs, though, which is why they were made bigger. RTINGS also found in their burn-in test that the order of wear was red-blue-green. I guess they have to push the red pixels rather hard to make up for their lower brightness.

RTINGS: "We found that in our 20/7 Burn-in Test the red sub-pixel is the fastest to degrade, followed by blue and then green".

 

This statement, however, doesn’t automatically mean that red OLED material used in B6 degrades faster than green and blue in a pure (fair) experiment.

 

I don’t know the exact structure of this panel, but it most likely has a stack of OLED layers (of different colors, forming white) plus a color-filter layer, similar to the modern WRGB OLED panels produced by LG Display. So this panel doesn't have red, green, and blue OLEDs as sub-pixels.

 

To make a relevant comparison of lifespan for OLEDs of different colors, you have to run an experiment with same-sized diodes (or at least the same subpixel-area-to-pixel-area ratio) at the same brightness. I bet that in such an experiment, the blue OLED material from the B6 panel would degrade faster than the green and red.

 

On another note, I’m sure the blue OLED used in the current Samsung QD-Display technology is much more stable than the one used in the B6.


9 hours ago, tarsius said:

RTINGS: "We found that in our 20/7 Burn-in Test the red sub-pixel is the fastest to degrade, followed by blue and then green".

 

This statement, however, doesn’t automatically mean that red OLED material used in B6 degrades faster than green and blue in a pure (fair) experiment.

 

I don’t know the exact structure of this panel, but it most likely has a stack of OLED layers (of different colors, forming white) plus a color-filter layer, similar to the modern WRGB OLED panels produced by LG Display. So this panel doesn't have red, green, and blue OLEDs as sub-pixels.

 

To make a relevant comparison of lifespan for OLEDs of different colors, you have to run an experiment with same-sized diodes (or at least the same subpixel-area-to-pixel-area ratio) at the same brightness. I bet that in such an experiment, the blue OLED material from the B6 panel would degrade faster than the green and red.

 

On another note, I’m sure the blue OLED used in the current Samsung QD-Display technology is much more stable than the one used in the B6.

I agree that it doesn't imply that fundamentally the red OLED material degrades the fastest. I think there are two comparisons to be made here:

  1. Degradation when driven at the same power/voltage
  2. Degradation when driven to the same brightness

I'm happy to believe that when driven identically, the degradation order is blue-green-red. For real-world use, I would argue it matters more how fast they degrade when producing the same output, which seems to be red-blue-green, due to red needing to be driven harder to reach the same brightness.
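The difference between those two comparisons can be sketched with a toy Python model. Every number in it (intrinsic lifetimes, drive levels, the acceleration exponent) is made up purely for illustration; real OLED lifetime data is material-specific and mostly proprietary:

```python
# Toy model of the two degradation comparisons. All figures are assumptions
# chosen only to illustrate how the ordering can flip.

def lifetime_hours(base_life: float, drive: float, accel: float = 1.8) -> float:
    """Stress model: lifetime shrinks as drive^accel.
    The exponent 1.8 is an assumed acceleration factor, not a measured one."""
    return base_life / drive ** accel

# Hypothetical intrinsic lifetimes at drive = 1.0 (blue weakest, red strongest):
materials = {"red": 30_000.0, "green": 25_000.0, "blue": 10_000.0}

# Comparison 1: same drive level -> the intrinsic order holds (blue worst).
same_drive = {c: lifetime_hours(t, 1.0) for c, t in materials.items()}

# Comparison 2: same *light output*. Suppose red must be driven ~2.2x as hard
# to hit the target brightness (a hypothetical efficiency gap):
drive_for_target = {"red": 2.2, "green": 1.0, "blue": 1.1}
same_output = {c: lifetime_hours(materials[c], drive_for_target[c])
               for c in materials}

print(sorted(same_drive, key=same_drive.get))   # blue degrades first
print(sorted(same_output, key=same_output.get)) # now red degrades first
```

With these made-up numbers, the same-output ordering comes out red-blue-green, matching the pattern RTINGS reported, even though blue is intrinsically the weakest material.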


On 1/4/2022 at 6:08 AM, SolarNova said:

For one, dual-layer LCDs already exist. Hisense makes them.

Secondly, there's no real benefit to using an OLED backlight layer over an LCD in that respect. The OLED layer isn't as bright as existing LCD backlights, and its pixel transition speed wouldn't help, since the color LCD layer above it would be the slowest link at that point.

I think the LCD being the slow link would be fine, as the OLED layer could just be tuned to that (millisecond delays for sending the signal to it), but I'm not sure I agree with "the OLED layer isn't as bright as existing LCD versions", unless you mean that an OLED display isn't as bright as existing LCD backlights...

And I'm honestly not sure about that either, based on the QD-OLED video Linus did in the last day or two, where he mentioned the standard method is to boost the brightness with some extra white OLEDs, if I'm remembering it right. Since what I'm suggesting is pure white OLEDs only, shouldn't their brightness be a fair bit higher?

(Mind you, to everyone talking about QD-OLED: this idea is premised on being cheaper than QD-OLED, because you know that's going to cost a fortune, hence the 1080p OLED backlight layer, and on not requiring new tech to implement.)


21 minutes ago, I Just Want Ram Drives said:

I think the LCD being the slow link would be fine, as the OLED layer could just be tuned to that (millisecond delays for sending the signal to it)

That would just open a whole other can of worms. Hisense's current dual-panel solution shows that it's not simple to combine two different panels while keeping acceptable response times.

 

21 minutes ago, I Just Want Ram Drives said:

but I'm not sure I agree with "the OLED layer isn't as bright as existing LCD versions", unless you mean that an OLED display isn't as bright as existing LCD backlights...

Pretty sure he meant the backlight, as the liquid crystals don't emit light on their own.

 

21 minutes ago, I Just Want Ram Drives said:

And I'm honestly not sure about that either, based on the QD-OLED video Linus did in the last day or two, where he mentioned the standard method is to boost the brightness with some extra white OLEDs, if I'm remembering it right. Since what I'm suggesting is pure white OLEDs only, shouldn't their brightness be a fair bit higher?

Even with the brightness-boosting white OLEDs, they're still not bright enough to shine through an LCD layer and maintain respectable brightness.

 

21 minutes ago, I Just Want Ram Drives said:

(Mind you, to everyone talking about QD-OLED: this idea is premised on being cheaper than QD-OLED, because you know that's going to cost a fortune, hence the 1080p OLED backlight layer, and on not requiring new tech to implement.)

OLED struggles with brightness, which is why it's not good enough to serve as a backlight for a normal LCD panel. You'd probably end up with a 100-nit display or so, which people won't buy. I'd argue combining OLED and LCD would be even more expensive, as both are complex and very different display technologies.


As mentioned above, what I was referring to is that OLED simply can't get anywhere near bright enough to act as a backlight for an LCD.

 

If you take apart an HDR TV and turn on just the backlight, you'll see it's ridiculously bright: WAY brighter than the resulting HDR output of the display, and many times that difference again vs OLED.

 

To give you an idea of just how much brighter backlights are vs the image you see, check out the video below. It's not exactly comparable to 'normal' TV usage, since it's a display made for outdoor use and thus requires much brighter output, but you can still compare the shown backlight intensity against the final image. It's also just a neat video to watch, tbh.

LCDs cut the backlight's light output by a LOT; there's no way current OLED could handle it.
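Rough arithmetic to illustrate, in Python. The transmittance and brightness figures are ballpark assumptions, not measurements of any specific panel (LCD stacks are commonly quoted at single-digit-percent transmittance):

```python
# How bright must a backlight be for an LCD to hit an HDR highlight target?
# All figures below are assumed ballpark values for illustration.

LCD_STACK_TRANSMITTANCE = 0.05   # polarizers + color filters + TFT, ~5% assumed
TARGET_PEAK_NITS = 1000          # a typical HDR highlight target

backlight_needed = TARGET_PEAK_NITS / LCD_STACK_TRANSMITTANCE
print(f"backlight must emit ~{backlight_needed:,.0f} nits")  # ~20,000 nits

# A white OLED panel at sustained full-field output manages on the order of
# a few hundred nits (200 assumed here), so it falls short by a large factor:
OLED_FULL_FIELD_NITS = 200
print(f"shortfall: ~{backlight_needed / OLED_FULL_FIELD_NITS:.0f}x")  # ~100x
```

Even if the assumed numbers are off by 2x in OLED's favor, the gap stays enormous, which is the point being made above.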

 


  • 3 months later...
54 minutes ago, I Just Want Ram Drives said:



Cool, someone made it. And then discontinued it.

Yes, this TV was also praised by RTINGS for its near-OLED contrast. But sadly it has horrible response times (ghosting), so it's a miss for anyone who likes to watch sports or play games.

 

I can see why Hisense discontinued it. OLED TVs are getting cheaper and cheaper, and LCD MiniLED options also get better each year. You could say this TV got caught in the crossfire because it didn't pick a side.

