
LG says next-generation OLED EX technology delivers improved brightness, better accuracy, and slimmer bezels

Lightwreather

Summary

LG is the maker of some of our favorite OLED TVs, so when the company says it’s improved on its basic panel technology, it’s worth paying attention. Today it did just that, with LG Display announcing its next-generation OLED technology — dubbed OLED EX — which the company says will increase brightness by up to 30 percent, boost picture accuracy, and allow for smaller bezels in finished products.

 

Quotes


These improvements are due to two key changes. The first is the use of deuterium, a heavy isotope of hydrogen, in the chemical make-up of LG’s OLED panels, and the second is the incorporation of algorithmic image processing. LG says the latter will predict the usage of each individual light emitting diode in your TV based on your personal viewing habits to “precisely [control] the display’s energy input to more accurately express the details and colors of the video content being played.”

LG’s claims about reduced bezel sizes with OLED EX are a little more concrete at least. The company says that based on calculations involving a 65-inch OLED display, it will be able to reduce bezel thickness from 6mm to 4mm. It’s not a huge change on paper, but given how optimized this technology already is, every little improvement has to be fought for.

LG says it plans to start incorporating OLED EX technology into all its OLED panels starting in the second quarter of 2022, though it’s not clear how much longer it might then take for this technology to reach consumers.

My thoughts

Well, LG wants to improve its panels, and truth be told, I'm all for it (seeing as I'll probably be in the market for an OLED around the time these come out). Now, about those bezels: do they really need to be slimmer? Apart from that, this new "Evolution Experience (EX)" seems to address one of OLED's notable limitations: brightness. It's also funny that LG did not use the word "AI" for its "algorithmic image processing", which is something I can appreciate. How well LG lives up to its claims, and how much more or less these will cost, only time will tell. And let me tell you, I am excited.
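LG hasn't said exactly how that algorithmic processing works, so take this as a purely illustrative sketch of the stated idea (predict each subpixel's usage, then nudge its drive energy). The model, the function names, and the 10% swing are all my assumptions, not LG's implementation:

```python
# Illustrative sketch only: NOT LG's algorithm. It just models the
# stated idea of predicting each subpixel's load from viewing history
# and adjusting drive energy accordingly.
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for accumulated per-subpixel usage from past viewing (0..1).
usage_history = rng.random((4, 4))

def adjusted_drive(target_levels, history, max_swing=0.1):
    """Scale requested drive levels by predicted usage.

    Heavily used subpixels get slightly less energy (protecting them and
    evening out wear); lightly used ones can be pushed a little harder.
    The 10% swing is an arbitrary illustrative number.
    """
    correction = 1.0 + max_swing * (0.5 - history)  # ~0.95 .. ~1.05
    return np.clip(target_levels * correction, 0.0, 1.0)

frame = rng.random((4, 4))  # requested subpixel levels for this frame
print(adjusted_drive(frame, usage_history))
```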

Sources

TheVerge

LG Newsroom

"The most important step a man can take. It’s not the first one, is it?
It’s the next one. Always the next step, Dalinar."
–Chapter 118, Oathbringer, Stormlight Archive #3 by Brandon Sanderson

 

 

Older stuff:


"A high ideal missed by a little, is far better than low ideal that is achievable, yet far less effective"

 

If you think I'm wrong, correct me. If I've offended you in some way, tell me what it is and how I can correct it. I want to learn, and along the way one can make mistakes; being wrong helps you learn what's right.

 


New panel tech is always interesting, but I do have to agree that the slim-bezel trend could just stop at this point.

Maybe that's just me, but I've never felt that a bezel is an issue unless I'm working with multiple screens, so I don't fully get the small-bezel obsession in TVs.


LG's current OLED TVs are already at the point where I don't really see an advantage to even slimmer bezels.

 

The "improved" brightness they already introduced with their "Evo" OLED panels on the G1 were measureably brighter than current C1's, but not different enough to make a noticeable difference in real world viewing. 30% more brightness compared to current C1's would mean they would finally overcome the regression since the C9 again. But only by 70 nits or so. And 70 nits is not something that will bring a noticeable improvement.

 

And in what way are they planning to "improve picture accuracy"? Current OLED TVs, no matter the brand, are already insanely accurate, especially when calibrated. Any remaining color accuracy limitations come down to the level of factory calibration, not the OLED panel itself.

 

I don't mean to sound too pessimistic, but these claims don't really sound that life-changing to me. Still, if they want to improve their OLED panels, all power to them. If Samsung's QD-OLED technology at least somewhat lives up to its claims, LG will have a serious competitor in the coming years.

If someone did not use reason to reach their conclusion in the first place, you cannot use reason to convince them otherwise.


2 hours ago, J-from-Nucleon said:

LG says the latter will predict the usage of each individual light emitting diode in your TV based on your personal viewing habits to “precisely [control] the display’s energy input to more accurately express the details and colors of the video content being played.”

This sounds vague. Do they mean that if you watch a lot of letterboxed content, it'll be able to send more power to the pixels that aren't off, for a brighter/richer image or something? I'm not sure how they could better express detail; isn't that simply determined by the number of pixels and the source quality? Or is this all just fancy talk for saying the panels are more energy efficient now? Given that features like ABL already get mixed opinions, I hope it'll be an optional setting. The purists probably won't appreciate even more messing with the picture.

Crystal: CPU: i7 7700K | Motherboard: Asus ROG Strix Z270F | RAM: GSkill 16 GB@3200MHz | GPU: Nvidia GTX 1080 Ti FE | Case: Corsair Crystal 570X (black) | PSU: EVGA Supernova G2 1000W | Monitor: Asus VG248QE 24"

Laptop: Dell XPS 13 9370 | CPU: i5 10510U | RAM: 16 GB

Server: CPU: i5 4690k | RAM: 16 GB | Case: Corsair Graphite 760T White | Storage: 19 TB


Can we get something better for monitors already, though? LCD needs to be phased out; high-end monitor prices are still insane in general, yet the monitors aren't even the best in terms of image quality or speed.

Let's see what CES has to show.

| Ryzen 7 7800X3D | AM5 B650 Aorus Elite AX | G.Skill Trident Z5 Neo RGB DDR5 32GB 6000MHz C30 | Sapphire PULSE Radeon RX 7900 XTX | Samsung 990 PRO 1TB with heatsink | Arctic Liquid Freezer II 360 | Seasonic Focus GX-850 | Lian Li Lanccool III | Mousepad: Skypad 3.0 XL / Zowie GTF-X | Mouse: Zowie S1-C | Keyboard: Ducky One 3 TKL (Cherry MX-Speed-Silver)Beyerdynamic MMX 300 (2nd Gen) | Acer XV272U | OS: Windows 11 |


The brightness increase may sound like a lot, but it's good to understand that 30% of a small number is a small increase.

The very bright scenes that really pop on HDR 1000+ QLEDs, for example, still won't be possible with a 30% increase over current OLEDs' capabilities.

For more normal real-HDR scenes we're talking about an increase of maybe 180 nits, bringing OLED up to around 750-800 nits, which is good. It's not 1,000 nits, but it's certainly good.

However, there's no guarantee we'd see the 30% improvement in such scenes. In fact, unless they've made improvements to subpixel durability, they may not increase the top-end brightness but rather improve brightness when displaying large bright images, plus peak brightness for short durations (explosions and such), basically bringing the current ~700-nit peak brightness up to around 900 nits, which will certainly help.

Should the 30% improvement be predominantly in large bright scenes, akin to 100% full-white images, it would be in the region of a 35-40 nit increase, bringing full-screen white up to about 165 nits.
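As a quick sanity check on those projections, here's a minimal sketch applying a flat 30% gain to the ballpark baselines above. The baseline figures are this post's rough estimates, not measured values:

```python
# Apply a flat 30% gain to rough baseline brightness figures.
# Baselines are ballpark estimates from this post, not measurements.
GAIN = 1.30

baselines_nits = {
    "typical real-scene HDR highlight": 600,    # -> ~780
    "short-duration peak (small window)": 700,  # -> ~910
    "full-screen white (100% window)": 127,     # -> ~165
}

for scenario, nits in baselines_nits.items():
    projected = nits * GAIN
    print(f"{scenario}: {nits} nits -> ~{projected:.0f} nits "
          f"(+{projected - nits:.0f})")
```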

 

I think we'll see some good improvement over the next few years once Samsung's QD-OLED hits the market, as it has the potential to be superior to LG's current WRGB OLED tech.

CPU: Intel i7 3930k w/OC & EK Supremacy EVO Block | Motherboard: Asus P9x79 Pro  | RAM: G.Skill 4x4 1866 CL9 | PSU: Seasonic Platinum 1000w Corsair RM 750w Gold (2021)|

VDU: Panasonic 42" Plasma | GPU: Gigabyte 1080ti Gaming OC & Barrow Block (RIP)...GTX 980ti | Sound: Asus Xonar D2X - Z5500 -FiiO X3K DAP/DAC - ATH-M50S | Case: Phantek Enthoo Primo White |

Storage: Samsung 850 Pro 1TB SSD + WD Blue 1TB SSD | Cooling: XSPC D5 Photon 270 Res & Pump | 2x XSPC AX240 White Rads | NexXxos Monsta 80x240 Rad P/P | NF-A12x25 fans |


4 hours ago, Dreckssackblase said:

New panel tech is always interesting, but I do have to agree that the slim-bezel trend could just stop at this point.

Maybe that's just me, but I've never felt that a bezel is an issue unless I'm working with multiple screens, so I don't fully get the small-bezel obsession in TVs.

Smaller bezels made more sense back when bezels were a significant part of the TV. Now they're already so small that it's reached the point of good enough. Look at LG's existing OLED TVs and you can see they already have incredibly small bezels. I can't imagine the need for smaller ones, other than maybe if you were doing some sort of video wall with multiple TVs.


10 minutes ago, SolarNova said:

The brightness increase may sound like a lot, but it's good to understand that 30% of a small number is a small increase.

The very bright scenes that really pop on HDR 1000+ QLEDs, for example, still won't be possible with a 30% increase over current OLEDs' capabilities.

For more normal real-HDR scenes we're talking about an increase of maybe 180 nits, bringing OLED up to around 750-800 nits, which is good. It's not 1,000 nits, but it's certainly good.

However, there's no guarantee we'd see the 30% improvement in such scenes. In fact, unless they've made improvements to subpixel durability, they may not increase the top-end brightness but rather improve brightness when displaying large bright images, plus peak brightness for short durations (explosions and such), basically bringing the current ~700-nit peak brightness up to around 900 nits, which will certainly help.

Should the 30% improvement be predominantly in large bright scenes, akin to 100% full-white images, it would be in the region of a 35-40 nit increase, bringing full-screen white up to about 165 nits.

I think we'll see some good improvement over the next few years once Samsung's QD-OLED hits the market, as it has the potential to be superior to LG's current WRGB OLED tech.

To be fair, OLED has the advantage of incredibly dark blacks, so while its peak brightness might be lower than other technologies', it might not matter, since the darks could make up the contrast. If what matters is the difference in brightness between two parts of the screen, then this new OLED might really be good for HDR. Also, as someone with a 1,000-nit peak brightness display, I'd honestly say 1,000 nits is too bright. It hurts to look at scenes that actually hit 1,000 nits.


All I want is a freakin' 32C2, though a 42C2 is fine and the max I'd want. But do they really have to be much brighter...? I don't want the sun in front of my face.

DAC/AMPs:

Klipsch Heritage Headphone Amplifier

Headphones: Klipsch Heritage HP-3 Walnut, Meze 109 Pro, Beyerdynamic Amiron Home, Amiron Wireless Copper, Tygr 300R, DT880 600ohm Manufaktur, T90, Fidelio X2HR

CPU: Intel 4770, GPU: Asus RTX3080 TUF Gaming OC, Mobo: MSI Z87-G45, RAM: DDR3 16GB G.Skill, PC Case: Fractal Design R4 Black non-iglass, Monitor: BenQ GW2280


I'd have been interested in this info if it had news about drastically reduced burn-in. It doesn't. So it's just pointless drivel to me 🤷🏻‍♂️

CPU: Ryzen 9 5900 Cooler: EVGA CLC280 Motherboard: Gigabyte B550i Pro AX RAM: Kingston Hyper X 32GB 3200mhz

Storage: WD 750 SE 500GB, WD 730 SE 1TB GPU: EVGA RTX 3070 Ti PSU: Corsair SF750 Case: Streacom DA2

Monitor: LG 27GL83B Mouse: Razer Basilisk V2 Keyboard: G.Skill KM780 Cherry MX Red Speakers: Mackie CR5BT

 

MiniPC - Sold for $100 Profit


CPU: Intel i3 4160 Cooler: Integrated Motherboard: Integrated

RAM: G.Skill RipJaws 16GB DDR3 Storage: Transcend MSA370 128GB GPU: Intel 4400 Graphics

PSU: Integrated Case: Shuttle XPC Slim

Monitor: LG 29WK500 Mouse: G.Skill MX780 Keyboard: G.Skill KM780 Cherry MX Red

 

Budget Rig 1 - Sold For $750 Profit


CPU: Intel i5 7600k Cooler: CryOrig H7 Motherboard: MSI Z270 M5

RAM: Crucial LPX 16GB DDR4 Storage: Intel S3510 800GB GPU: Nvidia GTX 980

PSU: Corsair CX650M Case: EVGA DG73

Monitor: LG 29WK500 Mouse: G.Skill MX780 Keyboard: G.Skill KM780 Cherry MX Red

 

OG Gaming Rig - Gone


 

CPU: Intel i5 4690k Cooler: Corsair H100i V2 Motherboard: MSI Z97i AC ITX

RAM: Crucial Ballistix 16GB DDR3 Storage: Kingston Fury 240GB GPU: Asus Strix GTX 970

PSU: Thermaltake TR2 Case: Phanteks Enthoo Evolv ITX

Monitor: Dell P2214H x2 Mouse: Logitech MX Master Keyboard: G.Skill KM780 Cherry MX Red

 

 


8 hours ago, Dreckssackblase said:

New panel tech is always interesting, but I do have to agree that the slim-bezel trend could just stop at this point.

Maybe that's just me, but I've never felt that a bezel is an issue unless I'm working with multiple screens, so I don't fully get the small-bezel obsession in TVs.

When a 40" TV from 12 years ago dwarfs a modern one in all aspects...that's why. Some old LCD TV weighed over 15KG - even at that size. Kind of difficult to wall mount.

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL


9 hours ago, dizmo said:

I'd have been interested in this info if it had news about drastically reduced burn-in. It doesn't. So it's just pointless drivel to me 🤷🏻‍♂️

Burn-in is not a serious issue with current OLEDs. It can happen, but LG has a lot going on in the firmware to minimize the chance of it.


13 minutes ago, Derangel said:

Burn-in is not a serious issue with current OLEDs. It can happen, but LG has a lot going on in the firmware to minimize the chance of it.

That's contrary to pretty much everything else I've seen. 



16 hours ago, dizmo said:

That's contrary to pretty much everything else I've seen. 

Long-term burn-in tests have shown that the risk tends to be overstated. Not that it can't happen, but OLEDs are nowhere near as prone to it as plasma was in the past. I've had a CX as my monitor since January, and the only thing I do to reduce the risk of burn-in is turn it off when I'm going to be away from the computer for a while.

That said, if you turn off all the auto-dimming, brightness-limiting, and other features LG uses to reduce burn-in risk, you will need to be A LOT more careful and diligent with your usage.


On 12/29/2021 at 4:44 PM, CTR640 said:

All I want is a freakin' 32C2, though a 42C2 is fine and the max I'd want. But do they really have to be much brighter...? I don't want the sun in front of my face.

They were due this year, but for a couple of reasons, one being the pandemic, they've been pushed out to 2022, so hopefully we'll see a proper release announcement very soon.

 

I was reading up on them a couple of weeks back (:


4 hours ago, Derangel said:

Long-term burn-in tests have shown that the risk tends to be overstated. Not that it can't happen, but OLEDs are nowhere near as prone to it as plasma was in the past. I've had a CX as my monitor since January, and the only thing I do to reduce the risk of burn-in is turn it off when I'm going to be away from the computer for a while.

That said, if you turn off all the auto-dimming, brightness-limiting, and other features LG uses to reduce burn-in risk, you will need to be A LOT more careful and diligent with your usage.

I'm pretty sure Linus had issues with the CX and burn-in. So, since it's still an issue, I still wouldn't do it 🤷‍♂️



On 12/31/2021 at 1:35 AM, dizmo said:

I'm pretty sure Linus had issues with the CX and burn-in. So, since it's still an issue, I still wouldn't do it 🤷‍♂️

To be fair, Linus is a worst-case user for an OLED panel. Having the same four apps open for 8-10 hours every day is not something the average home user will do, whether they're a gamer, a web browser, or a movie enthusiast. Current OLED panels are already at the point where the average user won't have to worry about burn-in. But if you regularly work in the same apps for hours each and every day, stay away from an OLED display.

To further add to that, I've had a C9 for over two years now and there is absolutely no sign of burn-in.



On 12/29/2021 at 6:17 AM, J-from-Nucleon said:

do they really need to be slimmer?

You should see the bezels on my current TV. They have to be two inches thick. I don't understand the need for smaller bezels on a TV. On a monitor I can see it to an extent, as you might have multiple monitors, but I don't see the need on TVs.

I just want to sit back and watch the world burn. 


6 minutes ago, Donut417 said:

You should see the bezels on my current TV. They have to be two inches thick. I don't understand the need for smaller bezels on a TV. On a monitor I can see it to an extent, as you might have multiple monitors, but I don't see the need on TVs.


I don't think they need to be smaller than they currently are on OLEDs, but I hate big bezels. Even on a TV sitting a few feet away, they're hideous and distracting.


Just now, Derangel said:


I don't think they need to be smaller than they currently are on OLEDs, but I hate big bezels. Even on a TV sitting a few feet away, they're hideous and distracting.

Some are even more distracting when you get a newer TV that has a larger screen while being physically smaller. CCFL LCD to current LED LCD is an insane difference.

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL


On 1/3/2022 at 2:59 AM, Stahlmann said:

To be fair, Linus is a worst-case user for an OLED panel. Having the same four apps open for 8-10 hours every day is not something the average home user will do, whether they're a gamer, a web browser, or a movie enthusiast. Current OLED panels are already at the point where the average user won't have to worry about burn-in. But if you regularly work in the same apps for hours each and every day, stay away from an OLED display.

To further add to that, I've had a C9 for over two years now and there is absolutely no sign of burn-in.

Anyone using Windows is going to have burn-in issues, regardless of the app they're using.



2 hours ago, dizmo said:

Anyone using Windows is going to have burn-in issues, regardless of the app they're using.

How is it I've been using my C9 for two years, including as a display for my PC, and I don't have any issues?



3 hours ago, dizmo said:

Anyone using Windows is going to have burn-in issues, regardless of the app they're using.

My CX has 1,242 hours of screen-on time, the vast majority of it as a monitor for my Windows PC. Absolutely zero burn-in.


Linus's OLED use case was rather terrible, tbh.

 

The video even showed a 'cardinal sin' of OLED use on a PC: he had his windows opened maximized, so the browser/folder/program always displayed its UI in the exact same position. When using an OLED, one should never open a window maximized.

 

In addition, you'll want to enable auto-hide for the taskbar and use some kind of animated desktop background. Ideally you'd also either have no desktop icons or enable opacity with a program like Wallpaper Engine, which also has animated wallpapers available.

 

Do the above and desktop usage should be doable long term without 'burn-in'.

 

For gaming, as long as you're not going ham on a single game for 8+ hours a day, every day, you should be fine for many years. Even in games there are things you can do to limit 'burn-in' potential: any game with a customizable UI should let you move and/or resize UI elements, so do that on a regular basis and the static UI won't always display in the same spot.

 

Even when doing all the wrong things in the absolute worst-case scenario for an OLED (watching the same news channel all the time), it takes about 1,800 hours before minor 'burn-in' starts to become noticeable on flat color slides.

It takes over 5,500 hours with worst-case gaming content (FIFA in Rtings' test, with its high-contrast, low-opacity static UI) for 'burn-in' to start showing up.

It should also be noted that after over 14,000 hours of COD: WWII, a low-risk game, there is no noticeable 'burn-in' present.

Rtings tested all of this.

 

Those numbers, by the way, correspond to roughly 4 hours a day, every day: a bit over a year for the first figure and nearly four years for the second, which for someone who also has to work during the week is probably a good estimate. The 14,000-hour mark works out to about 9.5 years at 4 hours a day, every day, on average.
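For anyone who wants to check that math, here's the same conversion in a few lines of Python. The 4-hours-a-day average and the hour thresholds are the figures from this post, not additional data:

```python
# Convert screen-on-hour thresholds (the Rtings figures quoted in this
# post) into years, assuming an average of 4 hours per day.
HOURS_PER_DAY = 4

thresholds_hours = {
    "worst case, static news channel (first minor burn-in)": 1_800,
    "worst-case gaming, high-contrast static UI": 5_500,
    "low-risk gaming (COD: WWII), still no burn-in": 14_000,
}

for scenario, hours in thresholds_hours.items():
    years = hours / (HOURS_PER_DAY * 365)
    print(f"{scenario}: {hours:,} h = about {years:.1f} years")
```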

 

The idea of changing your habits and doing the above to lower 'burn-in' risk is to get your average content down to the risk level of COD: WWII, or at the very least somewhere between the FIFA example and COD: WWII, resulting in between 4 and 10 years of life before the most minor uniformity issues start to appear.

 

In all honesty, if you adjust your usage toward low-risk content, you should never see 'burn-in' within the expected average life of a display.

Of course, not everyone is going to want to change their habits when using a display, and every example of 'burn-in' I've seen within a year of use has been the result of the user doing something high-risk, like maximizing web browsers all the time.


