
Display Technology FAQ / Mythbuster

Glenwing

...

The spoilers seem broken. :wacko:

Not sure if I'm the only one having the problem.


Good work, Glenwing.

 

Three questions:

 

1. Why aren't OLED screens in consumer computer monitors yet? I've heard of OLED TVs before.

 

2. You've said that IPS has come a long way since its inception, with response times going from 50ms to 5ms. And there are many names given to small revisions of IPS technology. So can't we look at the type of IPS as a factor when shopping for monitors? Or do we just purchase the latest monitor, assuming it has the latest incremental improvements?

 

Leading to my third question,

 

3. Is there any particular site you would recommend for reading monitor reviews? Like Tom's Hardware, etc.

 

Thanks




OLED technology is still relatively new and there's still a lot of work to be done on it. Monitors are also the smallest market compared to phones and TVs, so it will be brought here last. Samsung is concerned with phones and LG is still getting its feet wet with TVs. I think it will be several years before we see OLED start to take off in the monitor space, and I wouldn't count on first-generation products being superb either. On the other hand, the TV industry gets to be our guinea pig, so we'll start out with a more refined version of the technology when it gets to us :D

Differences between IPS variants are fairly minor unless you're comparing across a huge timespan. Most IPS monitors use the most recent or second-most recent IPS technology at worst, so it's not much of a concern in my opinion.

Tftcentral.co.uk and pcmonitors.info are the places to go :)


Since you seem to be the display guru around here (and give technical answers), I'll just copy the thread I created. Some of it is not directly related to display technology per se, but most of it is.

From what I've read and seen, people are mostly claiming that given FreeSync's open nature, the only reason NVIDIA's cards don't support it at the moment is that NVIDIA doesn't want them to; otherwise, a simple driver update could fix the problem. How true is that? If so, what do you think about buying a 4K FreeSync monitor with an NVIDIA build as future-proofing (before you say "buy a G-SYNC monitor, then," an equivalent G-SYNC monitor is almost twice the cost)? How likely is NVIDIA to support it in the future, or will it go down the proprietary road that Apple is only too famous for?

Or, would you guys suggest going down from 4K to 1440p and spending that money on G-SYNC?

Also, on a different note, what is better, graphics/texture quality or a higher resolution? As in, 4K at medium settings vs. 2K at ultra?


Oh, and since I'm a bit of a newbie, regarding no. 8: what about when the two resolutions aren't perfect fractions of each other? What if I'm playing a game at 1440p on a 4K monitor? Is the image quality affected significantly due to the interpolation you mentioned?


On 7/4/2015 at 5:56 AM, Anonymous1b said:

Since you seem to be the Display guru around here (and give technical answers), I'll just copy the thread I created. Some of it is not directly related to display technology per se, but most of it is.

Good question.

The first thing to consider here is that not all of AMD's GCN-based cards support dynamic refresh rates; only GCN 1.1 cards or above do. Older GCN 1.0 cards like the 7970/280X aren't capable, which means there is some hardware involved in making this possible. It's possible Kepler does not have this (the G-SYNC hardware may be too different to simply act as a substitute), and if that's the case then Maxwell may not have it either. GPUs are designed on a 3–4 year cycle, so by the time FreeSync was announced, Maxwell was already finished for the most part. It's not unrealistic to think it might lack the required hardware as well.

On the other hand, Maxwell mobile GPUs are capable of "Mobile G-SYNC", which is essentially FreeSync; it operates over the same eDP protocol that AMD used to demonstrate FreeSync for the first time. Personally I think Maxwell GPUs at least could be capable of FreeSync given the right software, but only NVIDIA knows for sure.

I do not think NVIDIA will adopt "FreeSync", but I predict they will eventually come out with "G-SYNC 2.0" or something like that which works via DisplayPort Adaptive-Sync, like FreeSync does, because they will need to compete more on cost. It's just a question of whether they'll take special measures to block Adaptive-Sync monitors that haven't been certified for "G-SYNC 2.0", which I think is incredibly likely.

It's also worth noting that right now, G-SYNC is superior to any current FreeSync implementation. G-SYNC covers the entire range of each monitor, from 0–144 Hz (or whatever the max is), while FreeSync has a lower and/or upper limit, sometimes quite restrictive ones. G-SYNC is also an entire package, not just the dynamic refresh technology. As it stands, the entire display controller is replaced by the G-SYNC module, which includes NVIDIA's anti-ghosting algorithms optimized for dynamic refresh rates; that works quite well and beats any FreeSync monitor I've seen so far. If NVIDIA uses Adaptive-Sync for "G-SYNC 2.0", it's likely they'd want to ensure it keeps those advantages, so for a monitor vendor there would be more effort involved in creating a "G-SYNC 2.0"-compliant monitor than a FreeSync one, and it's difficult to say how cross-compatibility will work out.

For the 4K/medium vs 2K/ultra thing, it depends on the specific game. Some games look almost the same to me between medium and ultra; in others there's a much larger difference. In general, though, I think I'd prefer the higher resolution.

Interpolation does happen with 1440p on 4K; any non-native resolution must be interpolated to display fullscreen, with the possible exception of exact fractions like I mentioned. 1440p won't be interpolated any "worse" than 1080p due to its non-exactness, if that's what you're asking.
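If it helps to see what "exact fractions" means mechanically, here's a rough sketch in Python (my own illustration, not any vendor's actual scaler) of how an output pixel can be mapped back to the source image. With an exact integer ratio (1080p on a 4K panel is exactly 2x2) every output pixel copies exactly one source pixel; with a non-integer ratio like 1440p on 4K (1.5x) the sample lands between source pixels and has to be blended, which is where the softness comes from.

# Rough sketch of mapping output pixels back to a source image when
# scaling to a panel's native resolution. Pixels are grayscale values.
def scale_to_native(src, src_w, src_h, dst_w, dst_h):
    exact = (dst_w % src_w == 0) and (dst_h % src_h == 0)
    out = []
    for y in range(dst_h):
        for x in range(dst_w):
            if exact:
                # Integer ratio (e.g. 1920x1080 -> 3840x2160 is 2x2):
                # each output pixel copies one source pixel, no blending.
                out.append(src[(y * src_h // dst_h) * src_w + (x * src_w // dst_w)])
            else:
                # Non-integer ratio (e.g. 2560x1440 -> 3840x2160 is 1.5x):
                # the sample falls between source pixels, so blend the two
                # nearest neighbours (real scalers blend in both directions).
                fx = x * (src_w - 1) / (dst_w - 1)
                x0 = int(fx)
                x1 = min(x0 + 1, src_w - 1)
                sy = y * (src_h - 1) // (dst_h - 1)
                a, b = src[sy * src_w + x0], src[sy * src_w + x1]
                out.append(a * (1 - (fx - x0)) + b * (fx - x0))
    return out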


From what I've read and the videos I've watched (probably from one of Linus' videos), I distinctly remember the person saying that while NVIDIA obviously has greater control, both standards do have a minimum refresh rate and that, for the most part, they are quite similar in most cases. This also makes sense because (again, I've been learning a bit about how monitors work due to all this) most LCDs need to refresh the panel after some time or you get "flashing" (there was another term used, but basically the screen goes black for a fraction of a second until you get a new frame). Then again, I'm the newbie. Still, could you clarify what you mean by "G-SYNC can go from 0 to [max range]"?

Technical curiosities aside, what do you mean by "...they will eventually come out with G-SYNC 2.0 ... which works via DisplayPort Adaptive-Sync..."?

Right now, they obviously provide their own display logic chip (that would be the scaler, right?), while FreeSync doesn't require additional hardware from AMD and just uses the scaler that the monitor already has. Do you mean that "G-SYNC 2.0" will not require a separate logic chip, much like AMD's FreeSync at the moment? In that case, should I just wait a few more months before buying a monitor?



 

You're right, LCDs must physically refresh at least every 30–40 ms or so. Normally, whenever the framerate falls below a certain amount, you'd get flickering: if you go too low, it takes so long for the next frame to arrive that the pixels drift back to black in the meantime, giving you a visible flicker between frames. This is why FreeSync cannot go below a certain threshold. With a G-SYNC module, however, once you fall below the threshold (e.g. 36 Hz on the ROG Swift) it simply doubles the refresh rate and shows each frame twice; if you go even lower (18 Hz), it shows each frame 4 times, and so on. But this is a behind-the-scenes operation and is invisible to the user. So from a technical standpoint, yes, G-SYNC monitors won't go below 36 Hz (for example), but from a user perspective it's the same as if the monitor went from 0–144 Hz with G-SYNC enabled.
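To make the frame-multiplication idea concrete, here's a small sketch (my own illustration of the general idea, not NVIDIA's actual logic, which may pick its multipliers differently): choose the smallest whole-number repeat count that brings the panel back into its supported refresh range.

# Illustrative only: keep the panel inside its refresh window by
# repeating each rendered frame when the framerate drops too low.
PANEL_MIN_HZ = 36   # e.g. the ROG Swift's lower bound from the post above
PANEL_MAX_HZ = 144

def panel_refresh_for(framerate_hz):
    """Return (panel_refresh_hz, times_each_frame_is_shown)."""
    assert framerate_hz > 0
    repeats = 1
    while framerate_hz * repeats < PANEL_MIN_HZ:
        repeats += 1
    return min(framerate_hz * repeats, PANEL_MAX_HZ), repeats

print(panel_refresh_for(30))   # (60, 2): at 30 fps the panel refreshes at 60 Hz
print(panel_refresh_for(15))   # (45, 3): at 15 fps each frame is shown 3 times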

 

With "G-SYNC 2.0" I mean NVIDIA will most likely not use a proprietary controller chip in the monitor in a future version of G-SYNC, it will work via the same protocol as FreeSync. They will in effect switch to FreeSync (which works DisplayPort Adaptive-Sync), but they'll still call it G-SYNC ;) I do not think this is coming any time soon though.



I see. How does the behavior of FreeSync differ below the minimum threshold, then?



 

Well, suppose the minimum threshold is 40 Hz: FreeSync monitors so far just stop going down and act like a regular 40 Hz monitor until the framerate comes back into range. Note that FreeSync does not dictate how this situation is handled the way NVIDIA does with G-SYNC, so theoretically a FreeSync monitor could implement the frame-doubling behavior, or AMD could implement it in drivers. But so far neither of those has appeared.
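For comparison with the frame-repeating sketch above, the behaviour described here is just a clamp (again my own illustration, using the hypothetical 40 Hz floor):

# Current FreeSync monitors (as described above): no frame repeating,
# the panel simply stops tracking the framerate outside its window.
FS_MIN_HZ, FS_MAX_HZ = 40, 144

def freesync_refresh_for(framerate_hz):
    # Below the floor the monitor behaves like a fixed 40 Hz display
    # (so you get tearing or V-Sync judder again) until the framerate
    # climbs back into range.
    return max(FS_MIN_HZ, min(framerate_hz, FS_MAX_HZ))

print(freesync_refresh_for(30))   # 40: the panel stays parked at its minimum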



Thanks a lot for the explanation! :)


  • 1 month later...

It's missing explanations on resolutions; I'm seeing lots of misunderstanding between 1080p, 2K, 4K and 8K.

Some are referring to 1080p as being 2K, while others are talking about 2.5K, etc., but 1080p ≠ 2K, or, to be more clear:

1920x1080 ≠ 2048x1080

Otherwise it would be like saying that VGA is the same as WVGA or FWVGA. Yes, they all "end" with x480, but they aren't the same format!
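For reference, here's a quick side-by-side of the formats that keep getting lumped together (my own summary for illustration; the marketing names vary):

# Pixel counts and aspect ratios of the formats people mix up.
formats = {
    "FHD ('1080p')":       (1920, 1080),
    "DCI 2K":              (2048, 1080),
    "QHD ('1440p')":       (2560, 1440),
    "UHD (consumer '4K')": (3840, 2160),
    "DCI 4K":              (4096, 2160),
}

for name, (w, h) in formats.items():
    print(f"{name:22s} {w}x{h}  {w * h / 1e6:.2f} MP  aspect {w / h:.3f}")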




It's coming, along with a lot of other updates. Currently still finishing things up.


  • 2 weeks later...

So, what is the response time of OLED and what could OLED's maximum refresh rate be?



The response time of OLEDs is, for all intents and purposes, instantaneous. The maximum refresh frequency can be whatever your other electronics can handle. There are a lot of other practical limitations on this (connection interface, for one).
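As a rough illustration of the connection-interface limit (back-of-the-envelope only: it ignores blanking intervals and everything beyond the 8b/10b line coding, so real limits are lower), the ceiling is roughly the link's effective bandwidth divided by the bits in one frame:

# Very rough upper bound on refresh rate imposed by the cable alone.
def max_refresh_hz(effective_gbps, width, height, bits_per_pixel=24):
    return effective_gbps * 1e9 / (width * height * bits_per_pixel)

# DisplayPort 1.2 (HBR2, 4 lanes): 21.6 Gbit/s raw, ~17.28 Gbit/s after 8b/10b.
print(max_refresh_hz(17.28, 3840, 2160))   # ~87 Hz at 4K, 24-bit colour
print(max_refresh_hz(17.28, 2560, 1440))   # ~195 Hz at 1440p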



Which means that an OLED FreeSync monitor would have no refresh rate limit at all! Well, where are LG and Samsung...



The VESA Adaptive-Sync standard has an upper limit of 240 Hz, which is good enough anyway IMO.

Current OLED tech is not suitable for computer monitors anyway. Longevity is still an issue.



 

I just hope someone creates a VR headset with two separate OLED FreeSync screens that can be connected to different GPUs for 100% scaling...


  • 3 weeks later...

So I'm in the market for a new monitor. Is a G-SYNC monitor at 60 Hz worth buying if I won't get any higher than 60 fps in the games I play?

Also, is 4K worth the sacrifice in response time compared to 1080p or 1440p?

Thanks for any help.

 

Yours Sincerely,

 

Dank Memes :)



It's pretty much down to personal preference. Personally I like having G-SYNC even without a 144 Hz monitor, but whether it's worth it also depends on the price.

Resolution isn't tied to response time, but if you meant refresh frequency (4K maxes out at 60 Hz, while 1440p is available at 144 Hz), then I'd personally take a 1440p 144 Hz monitor. 4K doesn't have much practical value for me personally, and I'd rather have the smoother motion of a high-frequency monitor.


So, in that case, I'd go for a 1440p G-SYNC 144Hz monitor. Sound good?


I have a quick question about #8, about scaling on non-native resolutions...

Would one's GPU scaling options in the driver control panel change any of that? Could I configure the scaling in NCP or CCC to get that coveted "perfect scaling" and obtain the 4:1 pixel ratio for 1080p on a 4K display, or are the algorithms you described an entirely different matter that can't be altered by the consumer?


