
Display Technology FAQ / Mythbuster

On 4/13/2016 at 8:35 PM, Glenwing said:

Right now OLED has longevity problems. Screens degrade the more they are used. You can see it on store demo units, if you can find Samsung tablets or phones with OLED screens that have been on display for a while.

In other words, wait until the longevity problem is solved, or any environmental gain you might get will be dwarfed by the environmental loss because you need to replace it a lot more often?


16 minutes ago, AlveKatt said:

In other words, wait until the longevity problem is solved, or any environmental gain you might get will be dwarfed by the environmental loss because you need to replace it a lot more often?

It should last 3 to 5 years. We have it in many smartphones. The problem is that each color wears at a different speed, which alters the colors over time. The OLED panel has an algorithm which tries to compensate so that colors are always right, but if you want the best colors at all times, you need to calibrate it often, much like a CRT monitor (the big tubed monitors we had before LCD computer monitors and TVs), that is assuming you care, of course. Most people don't notice.

 

Another problem with OLED is burn-in, where if an image stays in the same spot for an extended period of time (like several hours), it can remain faintly visible for a few seconds to minutes after the image changes. This has improved over the years with newer technologies, but it is still there. It can potentially cause problems if you leave your monitor on and have no screen saver (or don't set it to a blank screen / let the monitor go into standby).
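To illustrate the per-colour wear compensation described above, here is a minimal Python sketch of the idea. The decay rates and the usage figure are made-up numbers for illustration only, not measurements of any real panel:

```python
# Toy model of per-subpixel OLED wear compensation (illustrative numbers only).

# Hypothetical luminance loss per 1000 hours of use, relative to a new panel (not real data).
DECAY_PER_1000H = {"red": 0.02, "green": 0.015, "blue": 0.05}  # blue is usually the fastest to wear

def remaining_output(color: str, hours: float) -> float:
    """Fraction of its original luminance a subpixel can still produce after `hours` of use."""
    return (1.0 - DECAY_PER_1000H[color]) ** (hours / 1000.0)

def compensation_gain(color: str, hours: float) -> float:
    """Drive-level multiplier needed to keep that subpixel at its original brightness."""
    return 1.0 / remaining_output(color, hours)

if __name__ == "__main__":
    hours = 10_000  # roughly 3 years at ~9 hours/day
    for color in ("red", "green", "blue"):
        print(f"{color:>5}: {remaining_output(color, hours):.1%} output left, "
              f"needs a gain of x{compensation_gain(color, hours):.2f}")
```

Because each colour needs a different gain, an uncompensated panel drifts in colour balance over time, and pushing the gains up also eats into peak brightness, which is why periodic recalibration helps if you care about accuracy.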

 

 


On 4/21/2016 at 8:40 AM, AlveKatt said:

In other words, wait until the longevity problem is solved, or any environmental gain you might get will be dwarfed by the environmental loss because you need to replace it a lot more often?

Yes, pretty much. OLED screens are fine for phones, since it's rare that a phone will be expected to stay in top shape after 3-5 years, but monitors have much higher expectations.


6 hours ago, Glenwing said:

Yes, pretty much. OLED screens are fine for phones, since it's rare that a phone will be expected to stay in top shape after 3-5 years, but monitors have much higher expectations.

That's a low expectation in a day and age when electronic waste is a problem.

I would probably still be using my HTC Magic if Android hadn't stopped supporting it. It was locked to Android 1.5, and the security updates had stopped coming a long time ago when I got my new Fairphone. I must have had it at least 7 or 8 years, and with their non-waste philosophy I expect Fairphone to still support mine that far into the future. http://www.fairphone.com


  • 2 weeks later...

Hello. So... may I have suggestions for a gaming monitor at about $160? Thanks :) Also, thanks for the thread. Very useful and informative. Yet I don't feel comfortable picking one on my own just yet...


2 minutes ago, Z-Gaming said:

Hello. So... may I have suggestions for a gaming monitor at about $160? Thanks :) Also, thanks for the thread. Very useful and informative. Yet I don't feel comfortable picking one on my own just yet...

Hello :) If you want some help picking a monitor you can start a new thread in this section and someone will help you out :)


Just now, Glenwing said:

Hello :) If you want some help picking a monitor you can start a new thread in this section and someone will help you out :)

Cool. Thanks, new to the forum.


  • 2 weeks later...

I want to chime in (perhaps late) on the OLED discussion a few posts back. I've owned a Zune HD since 2010 and it was my daily music player until recently. I've kept the same wallpaper on it for over three years and I used the quick media controls a lot, so I was expecting some kind of burn-in, either from the quick media controls or the wallpaper. But there's no noticeable burn-in at all, if there is any. I also had a Moto X 2013 until last December. I was also expecting some noticeable degradation where the Android soft buttons and notification area are, but nothing.

 

The only thing I can think of is that I kept my devices at their minimum brightness unless I really needed it.


  • 3 months later...
On 5/17/2016 at 10:17 PM, M.Yurizaki said:

I want to chime in (perhaps late) on the OLED discussion a few posts back. I've owned a Zune HD since 2010 and it was my daily music player until recently. I've kept the same wallpaper on it for over three years and I used the quick media controls a lot, so I was expecting some kind of burn-in, either from the quick media controls or the wallpaper. But there's no noticeable burn-in at all, if there is any. I also had a Moto X 2013 until last December. I was also expecting some noticeable degradation where the Android soft buttons and notification area are, but nothing.

 

The only thing I can think of is that I kept my devices at their minimum brightness unless I really needed it.

That minimum brightness really does help OLEDs.


 

 

 


  • 3 weeks later...

Quite an informative guide, and not only as regards misconceptions and myths.

 

My suggestion: when talking about values such as frame rates, please get specific with numbers and actual context, such as the average frame rate for HD movies, and the optimal and best averages that high-end PCs reach nowadays.

 

 

 


  • 1 month later...

Sorry for the noob question guys, but if I understand HDR displays correctly, then aren't most if not all modern AMOLED displays technically capable of displaying HDR content?


10 minutes ago, vergil09 said:

Sorry for the noob question guys, but if I understand HDR displays correctly, then aren't most if not all modern AMOLED displays technically capable of displaying HDR content?

They are physically capable, yes, but their controlling software needs to understand what HDR is and how to operate the display panel in accordance with that.
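To make "the software needs to understand HDR" a bit more concrete: HDR10 content is encoded with the SMPTE ST 2084 (PQ) transfer function, and the display's controller has to decode that into absolute luminance and then fit it to what the panel can actually emit. Here's a rough Python sketch; the PQ constants come from the standard, but the 600-nit panel and the plain clipping at the end are simplified placeholders, not how any particular display tone-maps:

```python
# Decoding an HDR10 (SMPTE ST 2084 "PQ") code value into absolute luminance,
# then naively limiting it to what a hypothetical panel can emit.

M1 = 2610 / 16384          # PQ constants from SMPTE ST 2084
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_eotf(signal: float) -> float:
    """Map a normalized PQ code value (0..1) to luminance in cd/m^2 (nits)."""
    p = signal ** (1 / M2)
    return 10000.0 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)

def displayed_luminance(signal: float, panel_peak_nits: float = 600.0) -> float:
    """Crude 'tone mapping': clip to the panel's peak (real displays roll off more gracefully)."""
    return min(pq_eotf(signal), panel_peak_nits)

if __name__ == "__main__":
    for code in (0.25, 0.50, 0.75, 1.00):
        print(f"PQ code {code:.2f} -> {pq_eotf(code):8.1f} nits requested, "
              f"{displayed_luminance(code):6.1f} nits shown on a 600-nit panel")
```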


1 hour ago, Glenwing said:

They are physically capable, yes, but their controlling software needs to understand what HDR is and how to operate the display panel in accordance with that.

Ohh I see, thanks!


  • 1 month later...
  • 2 weeks later...

What happens when you connect a laptop that displays FHD (1920 x 1080) with HD Graphics 520 to a QHD, UHD, or 21:9 aspect ratio monitor?

Will it scale to the QHD, UHD, or widescreen resolution with the HD Graphics 520 through the HDMI 2.0 port?



  • 2 weeks later...
On 12/27/2014 at 3:47 PM, Glenwing said:

Feeling a difference of 4ms is absolutely impossible, and if you think you can, have a go at this so you get an idea of how short a millisecond is: http://humanbenchmark.com/tests/reactiontime.

Do you know how this site measures this? I mean, if it is client side, then it is accurate. But if the server is in New Guinea, then not so much!

Thanks!
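Thinking about it, tests like that are presumably timed in the browser itself (client side), so server distance shouldn't affect the measurement, though I haven't checked that site's code. For a purely local feel for the numbers, here's a bare-bones Python equivalent (console input adds a little overhead of its own, so treat the result as rough):

```python
# A bare-bones local reaction-time test, timed entirely on this machine (no network),
# just to get a feel for how long ~200 ms is compared with a 4 ms difference.
import random
import time

def reaction_test() -> float:
    input("Press Enter to start, then wait for 'GO!'... ")
    time.sleep(random.uniform(2.0, 5.0))        # random delay so you can't anticipate it
    start = time.perf_counter()
    input("GO! (press Enter as fast as you can) ")
    return (time.perf_counter() - start) * 1000.0   # milliseconds

if __name__ == "__main__":
    ms = reaction_test()
    print(f"Your reaction time: {ms:.0f} ms (4 ms is about {4 / ms:.0%} of that)")
```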



  • 3 months later...
On 12/27/2014 at 6:47 PM, Glenwing said:

This is also the reason laptops can dim their displays but computer can’t dim their monitors.

I know this FAQ is 3 years old now, but I found this statement to not be entirely true. There is something called DDC (Display Data Channel) that provides a communication protocol between the display and the graphics adapter. The most common uses appear to be contrast and brightness changes via software. One such program I know of, Screenbright, can communicate via DDC/CI to change the brightness, contrast, RGB levels, etc. of the monitor. The monitor I am using at the moment supports DDC/CI, so I can use Screenbright to change monitor parameters without having to use the OSD. This also allows me to set hotkeys for profiles according to what I'm doing.

 

edit:

This may not be important for the average user, but I find it works wonders if you use your monitor for several different tasks, from gaming to casual browsing. With HDR monitors around the corner as well, it is nifty to set profiles so you can have the brightest panel possible for HDR viewing or gaming, and tone down the brightness when browsing Reddit, for example.
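If anyone wants to script this sort of profile switching instead of using a GUI utility, here is a rough sketch of the same idea using the ddcutil command-line tool on Linux (VCP code 0x10 is the MCCS brightness control). It assumes ddcutil is installed and that the monitor actually exposes DDC/CI, which, as noted later in the thread, is not guaranteed; the profile values are arbitrary examples:

```python
# Rough sketch: switching monitor brightness "profiles" over DDC/CI by calling
# the ddcutil CLI (Linux). Assumes ddcutil is installed and the monitor supports DDC/CI.
import subprocess

BRIGHTNESS_VCP = "10"   # MCCS VCP code 0x10 = luminance (brightness)

# Example profiles; the percentages are arbitrary, tune to taste.
PROFILES = {"gaming": 90, "browsing": 35, "night": 10}

def set_brightness(percent: int, display: int = 1) -> None:
    """Set brightness (0-100) on the given ddcutil display number."""
    subprocess.run(
        ["ddcutil", "--display", str(display), "setvcp", BRIGHTNESS_VCP, str(percent)],
        check=True,
    )

def apply_profile(name: str) -> None:
    set_brightness(PROFILES[name])
    print(f"Applied '{name}' profile ({PROFILES[name]}% brightness)")

if __name__ == "__main__":
    apply_profile("browsing")
```

Hooking apply_profile() up to hotkeys is then just a matter of whatever launcher or keybinding tool you already use.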


58 minutes ago, Technicolors said:

I know this FAQ is 3 years old now, but I found this statement to not be entirely true. There is something called DDC (Display Data Channel) that provides a communication protocol between the display and the graphics adapter. The most common uses appear to be contrast and brightness changes via software. One such program I know of, Screenbright, can communicate via DDC/CI to change the brightness, contrast, RGB levels, etc. of the monitor. The monitor I am using at the moment supports DDC/CI, so I can use Screenbright to change monitor parameters without having to use the OSD. This also allows me to set hotkeys for profiles according to what I'm doing.

 

edit:

This may not be important for the average user, but I find it works wonders if you use your monitor for several different tasks, from gaming to casual browsing. With HDR monitors around the corner as well, it is nifty to set profiles so you can have the brightest panel possible for HDR viewing or gaming, and tone down the brightness when browsing Reddit, for example.

Yeah I'm aware of DDC/CI, just outdated info :3.

 

The main point is still half true anyway; computers still don't have as direct a connection to the monitor as laptops have to their built-in displays. DDC just provides a standardized interface for giving instructions to the monitor's controller, and the monitor has to specifically support that interface and those instructions for it to work, the same way it needs to support Adaptive-Sync operations for that to work. But anyway, this guide will be replaced at some point in the future, so... yeah :P


Laptops and tablets all use ACPI to communicate brightness changes. DDC/CI is supported on many monitors, but full support is not guaranteed.


1 hour ago, GoodBytes said:

Laptops and tablets all use ACPI to communicate brightness changes. DDC/CI is supported on many monitors, but full support is not guaranteed.

Do monitor manufacturers list support for these protocols? I imagine most modern monitors would have it, specifically gaming monitors that also happen to work quite well for professional tasks.


46 minutes ago, Technicolors said:

Do monitor manufacturers list support for these protocols? I imagine most modern monitors would have it, specifically gaming monitors that also happen to work quite well for professional tasks.

Nope. Some manufacturers do, others don't, even if they have full support, for example the Dell UltraSharp series. Usually business-focused monitors will have it, and the option to turn it on or off is in the on-screen menu (so if you can find the manual, you can see whether the monitor has some level of support that way).


  • 2 weeks later...
On 9/12/2016 at 5:21 PM, DigineX said:

Quite an informative guide, and not only as regards misconceptions and myths.

 

My suggestion: when talking about values such as frame rates, please get specific with numbers and actual context, such as the average frame rate for HD movies, and the optimal and best averages that high-end PCs reach nowadays.

 

 

 

I know this comment is kind of old but no one responded so here's some info:

 

1) Movies in general are nearly always filmed in 24fps, so for movie content you will have no problem with basically any display technology.  This is actually the reason that many faster (120/240hz) TVs have a "cinema" mode that turns off the frame interpolation, because having 10 frames inserted for each actual frame of content makes it look really unnatural.  If you've ever seen a movie playing on a TV at a store that looked like everyone was an alien moving with unnatural smoothness... that's why.
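To put a number on that: each 24 fps film frame divides evenly into a 120 Hz or 240 Hz refresh, so the TV can simply repeat frames with no judder, and the "soap opera effect" only appears when the set synthesizes in-between frames instead. The sketch below uses a plain linear blend as a deliberately naive stand-in for real motion-estimated interpolation:

```python
# Why 24 fps film maps cleanly onto high-refresh TVs, and what interpolation adds.

def repeats_per_frame(refresh_hz: int, content_fps: int = 24) -> float:
    """How many refresh cycles each film frame occupies."""
    return refresh_hz / content_fps

def naive_interpolate(frame_a: float, frame_b: float, steps: int) -> list[float]:
    """Crude linear blend between two frames, here reduced to single brightness values
    (real TVs do motion estimation, not this)."""
    return [frame_a + (frame_b - frame_a) * i / steps for i in range(steps)]

if __name__ == "__main__":
    for hz in (60, 120, 240):
        print(f"{hz:>3} Hz: each 24 fps frame shown {repeats_per_frame(hz):.2f} times")
    # 60 is not an integer multiple of 24, hence 3:2 pulldown judder on 60 Hz sets.
    print("120 Hz smoothing:", naive_interpolate(0.0, 1.0, steps=5))
    # -> the original frame plus four synthetic in-between frames per source frame
```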

 

At any rate, talking about media frame-rates isn't really meaningful when you're discussing content like games...

 

2) It's really hard to talk about specific frame-rates because there are a LOT of variables involved other than your display.  I would say that 60fps is a bare minimum for a playable frame-rate, any lower and even if you're using GSync/FreeSync to help with smoothing it out you're still playing something that is far less accurate and responsive than it should be. 

 

I would say at 1080p you should be aiming for 120fps or higher on average and 300fps is optimal.  At 1440p you should be aiming at 90fps on average and 150fps is optimal.  Those numbers assume you're running with relatively good but not necessarily max settings, and I would also suggest that regardless of which refresh rate and panel type you use that you make sure you have ULMB or an equivalent backlight technology.  Getting rid of frame persistence (and therefore motion blur) is the single biggest jump in overall picture quality after going from 60hz to 120hz+. 
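To put rough numbers on the persistence point: on a sample-and-hold panel each frame stays lit for the whole refresh interval, and for an object your eye is tracking, the perceived blur width is roughly the on-screen motion speed multiplied by that hold time; a strobing backlight such as ULMB shrinks the hold time to the strobe pulse width. These are the usual back-of-the-envelope formulas, and the 1.5 ms pulse is just an example figure:

```python
# Back-of-the-envelope numbers for frame time and eye-tracking motion blur.

def frame_time_ms(fps: float) -> float:
    """Duration of one frame in milliseconds."""
    return 1000.0 / fps

def blur_extent_px(motion_px_per_s: float, persistence_ms: float) -> float:
    """Approximate blur trail width for an object the eye is tracking."""
    return motion_px_per_s * (persistence_ms / 1000.0)

if __name__ == "__main__":
    speed = 1920  # an object crossing a 1080p screen in one second, in px/s
    for fps in (60, 120, 144):
        hold = frame_time_ms(fps)   # sample-and-hold: lit for the full refresh interval
        print(f"{fps:>3} fps: {hold:5.2f} ms/frame -> ~{blur_extent_px(speed, hold):5.1f} px of blur")
    # With a strobed backlight (e.g. ULMB) persistence drops to the pulse width instead:
    print(f"120 Hz strobed, ~1.5 ms pulse -> ~{blur_extent_px(speed, 1.5):4.1f} px of blur")
```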


3 hours ago, aithos said:

I know this comment is kind of old but no one responded so here's some info:

 

1) Movies in general are nearly always filmed in 24fps, so for movie content you will have no problem with basically any display technology.  This is actually the reason that many faster (120/240hz) TVs have a "cinema" mode that turns off the frame interpolation, because having 10 frames inserted for each actual frame of content makes it look really unnatural.  If you've ever seen a movie playing on a TV at a store that looked like everyone was an alien moving with unnatural smoothness... that's why.

 

At any rate, talking about media frame-rates isn't really meaningful when you're discussing content like games...

 

2) It's really hard to talk about specific frame-rates because there are a LOT of variables involved other than your display.  I would say that 60fps is a bare minimum for a playable frame-rate, any lower and even if you're using GSync/FreeSync to help with smoothing it out you're still playing something that is far less accurate and responsive than it should be. 

 

I would say at 1080p you should be aiming for 120fps or higher on average and 300fps is optimal.  At 1440p you should be aiming at 90fps on average and 150fps is optimal.  Those numbers assume you're running with relatively good but not necessarily max settings, and I would also suggest that regardless of which refresh rate and panel type you use that you make sure you have ULMB or an equivalent backlight technology.  Getting rid of frame persistence (and therefore motion blur) is the single biggest jump in overall picture quality after going from 60hz to 120hz+. 

Thanks for the explanation. I assume G-Sync / FreeSync is pretty much inevitable if you're ever looking for the beefier frame rates.

 

I hope I made the right choice in buying the ASUS PG279Q (165 Hz OC). I haven't gamed on it so far, but I expect it to be a better choice than buying a 4K monitor.


3 minutes ago, DigineX said:

Thanks for the explanation. I assume G-Sync / FreeSync is pretty much inevitable if you're ever looking for the beefier frame rates.

 

I hope I made the right choice in buying the ASUS PG279Q (165 Hz OC). I haven't gamed on it so far, but I expect it to be a better choice than buying a 4K monitor.

 

That's the monitor I have, and it's the best gaming monitor on the market. 

 

What I will tell you is this:

- disable GSync

- set your refresh rate to 120hz

- turn on ULMB and set a pulse width of 65

- download a calibration profile that is meant for those settings

 

You'll get a much better experience with ULMB and the slightly lower refresh rate than you will with 165hz.  Trust me.

 

As for GSync/FreeSync... it only helps when you can't hit a decent frame rate on a high-refresh monitor, e.g. 30 fps on a 144 Hz refresh rate. What it does then is force the monitor to refresh at the same rate as the FPS, which smooths things out considerably at the cost of a little bit of input lag. Assuming you have a decent computer and can maintain over 60 fps in games, you wouldn't really see any benefit from GSync, and over 100 fps there is literally no benefit.
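A small timing sketch of the difference being described: with fixed-refresh V-Sync a finished frame waits for the next refresh boundary, while with G-Sync/FreeSync (inside the monitor's variable refresh range) the panel refreshes when the frame is ready. This is a simplified model that ignores frame queueing, tearing, and low-framerate compensation:

```python
# Simplified model of when a finished frame actually reaches the screen:
# fixed 144 Hz refresh with V-Sync vs. variable refresh (G-Sync / FreeSync).
import math

REFRESH_HZ = 144
REFRESH_INTERVAL_MS = 1000.0 / REFRESH_HZ

def vsync_display_time(frame_ready_ms: float) -> float:
    """With V-Sync the frame waits for the next fixed refresh boundary."""
    return math.ceil(frame_ready_ms / REFRESH_INTERVAL_MS) * REFRESH_INTERVAL_MS

def vrr_display_time(frame_ready_ms: float) -> float:
    """With variable refresh the panel scans out as soon as the frame is ready (within its VRR range)."""
    return frame_ready_ms

if __name__ == "__main__":
    ready = 35.0   # a frame that took ~35 ms to render (about 28 fps)
    waited = vsync_display_time(ready) - ready
    print(f"V-Sync: shown at {vsync_display_time(ready):.1f} ms (+{waited:.1f} ms of waiting)")
    print(f"VRR   : shown at {vrr_display_time(ready):.1f} ms (no waiting)")
```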


  • 5 months later...

Excellent FAQ, very useful.

 

I would like to ask you something about the "1080p scaling on a 4K monitor" part of the FAQ. I'm wondering, since this FAQ was written a few years ago, whether more modern 4K monitors handle 1080p resolutions better (maybe not using interpolation, or something).

 

I'm about to buy a 4K monitor, but since I game a lot and I doubt I'll be able to play all games in 4K at very high settings with an acceptable framerate (i7 4770K + GTX 1080), I wondered how a 4K monitor would handle 1080p resolutions. I remember years ago how disappointed I was when I bought a 1080p TV in a country where most TV broadcasts were in 480p (by the way, most channels are 480p and 720p now; only some premium channels are in 1080p, and public TV only uses 1080p for very few events, like sports). It looked like someone had placed a very thin white veil in front of the screen.

 

So, does the FAQ's answer still hold true for all 4K monitors, or are there newer 4K monitors that handle 1080p better? If there are, would you recommend a good 4K monitor, please?
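From what I understand, 3840x2160 is exactly twice 1920x1080 in each direction, so in principle a 4K monitor could show 1080p by mapping every source pixel to a sharp 2x2 block (integer scaling) instead of interpolating; whether a given monitor's scaler, or the GPU driver, actually does that is presumably what varies by model. A tiny sketch of the difference, assuming NumPy is available:

```python
# 1080p -> 4K is an exact 2x scale in each direction, so pixel doubling needs no interpolation.
import numpy as np

def nearest_neighbour_2x(frame: np.ndarray) -> np.ndarray:
    """Each source pixel becomes a sharp 2x2 block (what 'integer scaling' does)."""
    return np.repeat(np.repeat(frame, 2, axis=0), 2, axis=1)

def blurry_2x(frame: np.ndarray) -> np.ndarray:
    """Crude interpolation-style upscale: averages neighbours, which softens hard edges."""
    doubled = nearest_neighbour_2x(frame).astype(float)
    padded = np.pad(doubled, 1, mode="edge")
    return (padded[:-2, 1:-1] + padded[2:, 1:-1] + padded[1:-1, :-2] + padded[1:-1, 2:]) / 4

if __name__ == "__main__":
    src = np.array([[0, 0], [0, 255]])   # a tiny "image": one bright pixel on a dark background
    print("Pixel-doubled:\n", nearest_neighbour_2x(src))
    print("Interpolated-ish:\n", blurry_2x(src).round(1))
```

The interpolated version is what produces that slightly soft, "veil over the screen" look compared with the pixel-doubled one.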

