
Display Technology FAQ / Mythbuster

Glenwing

I have a quick question about #8, about scaling on non-native resolutions...

Would the GPU scaling options in the driver control panel change any of that? Could I configure the scaling in NCP or CCC to get that coveted "perfect scaling" and obtain the 4:1 pixel ratio for 1080p on a 4K display, or are the algorithms you described an entirely different matter that can't be altered by the consumer?

If you use scaling via the NVIDIA control panel or CCC, the display doesn't do any scaling. The image is scaled up to 4K by the GPU before it is sent, so from the display's point of view it's already receiving a 4K image and doesn't need to do anything to it. Theoretically this could allow for 4:1 pixel mapping on any display if the GPU used that method for its scaling, but as far as I know neither NVIDIA nor AMD feature this type of scaling as an option. If they offered that option, yes it would work as perfect 4:1 on any display.
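
To illustrate what that kind of scaling amounts to, here's a minimal Python/NumPy sketch of 2x2 pixel duplication (purely conceptual; not how any actual driver implements it):

```python
# Minimal sketch of 4:1 "pixel doubling": each source pixel becomes a
# 2x2 block of identical pixels, with no interpolation anywhere.
import numpy as np

def pixel_double(image: np.ndarray) -> np.ndarray:
    """Upscale 2x in each direction by duplicating pixels."""
    # Repeat rows, then columns: a 1920x1080 frame becomes 3840x2160,
    # with every source pixel mapped to exactly four physical pixels.
    return np.repeat(np.repeat(image, 2, axis=0), 2, axis=1)

frame_1080p = np.zeros((1080, 1920, 3), dtype=np.uint8)  # stand-in RGB frame
frame_4k = pixel_double(frame_1080p)
print(frame_4k.shape)  # (2160, 3840, 3)
```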



Jeez, I forgot how fast you guys are around here!

 

I would love to test this sometime soon; however, I only have a 1080p display. I wonder if I can grab someone with even a 1440p panel and do the tests, which should be easy (take a screenshot and zoom in enough to see the pixels).




You can test it with any resolution; you just need to switch to half the resolution in each direction (960x540 in the case of 1080p). If it scales 4:1 to 1920x1080, it should look quite blocky. If it looks fuzzy instead, the image is being interpolated.
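
If you want to see the two behaviors side by side, here's a quick Pillow sketch (just an illustration of the scaling math, not a driver test): a one-pixel checkerboard stays pure black and white under 4:1 duplication, but picks up grays under interpolation.

```python
# Upscale a 1-pixel checkerboard 2x two ways: nearest-neighbor (the
# "blocky" 4:1 mapping) vs. bilinear (the "fuzzy" interpolated look).
from PIL import Image  # pip install Pillow

checker = Image.new("L", (8, 8))
checker.putdata([255 * ((x + y) % 2) for y in range(8) for x in range(8)])

blocky = checker.resize((16, 16), Image.NEAREST)
fuzzy = checker.resize((16, 16), Image.BILINEAR)

print(sorted(set(blocky.getdata())))  # [0, 255] -- still pure black/white
print(sorted(set(fuzzy.getdata())))   # intermediate grays appear
```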



Well yes, however the problem is my monitor doesn't support that (and as I say that, I remember CRU... *facepalm*), so I figured someone with a 1440p panel would have an easier time with it, as they can go down to 720p. Or 4K down to 1080p.

 

I'll shut up about it and report back when I get results.




Interestingly, if I fed my old 1680x1050 monitor that resolution but at 100 Hz, it WOULD display it, but the monitor seemed to show an image that had been downscaled to half resolution and then upscaled back to the normal res. Text looked weird, but Minecraft's menus looked absolutely perfect, because each "pixel" in that text is scaled by a power of two. Single-link DVI.


At 960x540?  Or 840x525?




At 1680x1050 @ 100 Hz. Whatever happened may have been the GPU downscaling and the monitor upscaling, who knows. I also tried half that resolution and the monitor displayed it fine, although with an OUT OF RANGE nag message. Considering it worked, I wish it didn't display that message.

Finally got 960*540 working aaand...

It's being interpolated. It turns out, after some research, that getting perfect 4:1 has an official term: pixel doubling. Your hunch was right; it can't be configured in the GPU drivers, even if I set the scaling to be performed on the GPU, though pixel doubling does show up in a few games here and there. I've also seen claims that it's a hardware limitation of LCD panels in general, though I won't stand behind that statement yet.



  • 2 months later...

SNIP

What do you mean by "'240Hz' and '480Hz' LCD TVs still only take in 60Hz maximum input, they just strobe the backlight"?

What's the backlight?



Side topic: from my understanding, Quantum Dot is still LED. A quantum dot layer is put on top of white LEDs to shift part of their light output from one color to another (basically absorbing a bit of blue and emitting a bit of red), calibrating the "white" LED light toward a truer white rather than a very cold white (or very light blue, depending on how you look at it and on the white LED grade).

Currently, if a manufacturer wants to adjust the 'white' light from white LEDs, they apply a layer of colored phosphor to filter the color. The problem with this system is that phosphor absorbs light to some extent, so you need much brighter LEDs to compensate. With quantum dots, however, you only need slightly stronger light to compensate.



Why do LCDs need illumination? I thought that was the role of the pixels (to provide light as different coloured pixels)?




 

LCD pixels do not generate light. LCD is essentially an array of stained-glass windows with adjustable shutters. On a color display, the windows also have color filters over them, so that if any light goes through the window, only the red wavelengths (for example) pass through, and the rest of the light is blocked.

 

You can find LCDs without a backlight on many kitchen appliances, for example; you'll notice some 7-segment LCDs are not visible in the dark. Older handheld game systems also had non-backlit LCDs, like the Nintendo Game Boy Advance. The Game Boy Advance SP is a better example, since its backlight can be toggled on and off.



So why do games screen tear? If the monitor just pulls the latest frame from the primary frame buffer, then why would two frames ever be on the screen at the same time? Doesn't the GPU only send an image to the primary frame buffer once it's been fully rendered?




The monitor is dumb. It outputs what it receives. Ideally, your game should run at 60 fps and your monitor at a 60Hz refresh rate, in sync.

But that is not the reality of things in the PC space. PC games aren't frame-locked (frame locking means the game only tells the graphics card to draw 60 fps even if the hardware technically allows more; a form of V-Sync done by the game instead of the GPU). They render the game as fast as possible, so your game might run at 100 fps. That means the monitor, which can only handle 60 fps max (60Hz), will receive, in this example, 100 frames per second. So what happens?

Imagine you are an artist who loves to draw but is stuck in a dreary job drawing whatever you are given (the monitor). You are very fast, and you can draw whatever task you are handed with immense detail (assume each piece of paper is one task). Assume you can complete one task, no matter how complicated or simple, in one minute. Normally, the company you work for gives you one task every minute. Great, no problem (that is the game running at 60 fps and the monitor at 60Hz: a 1:1 ratio). Now, due to poor management, you are getting two tasks per minute, and you quickly notice papers piling up on your desk and spilling onto the floor, because you can't draw that fast.

To keep the situation under control, and because the company doesn't care about the end result, you draw half the picture from one task and the other half from the next, processing two tasks per minute; but each finished drawing is half the old task and half the new one. This way you stay up to date and no paper spills everywhere, and, well, the company you work for doesn't care about the result.

This is where G-Sync and FreeSync come in: an improved manager that listens to its employee's needs and syncs things so that you do a proper job, every time.
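
To put rough numbers on the 100 fps example (purely illustrative):

```python
# Without any sync, buffer flips land in the middle of refreshes.
fps = 100  # frames the game renders per second
hz = 60    # refreshes the monitor performs per second

flips_per_refresh = fps / hz
print(f"~{flips_per_refresh:.2f} buffer flips per refresh")
# ~1.67: on average more than one new frame arrives during every
# refresh, so nearly every refresh contains at least one tear line.
```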



But is the tearing a result of the frame buffers switching halfway through the monitor refresh? (Assuming 120 fps @ 60Hz)

@Glenwing




 

Frames aren't sent as a package; they're sent to the monitor one pixel or a few pixels at a time. If the GPU suddenly starts sending pixels from a different frame, there's nothing the monitor can do about it. It doesn't have the rest of the previous image.
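
Here's a toy simulation of that, if it helps (the row count and flip point are made up):

```python
# The "monitor" reads one row at a time, top to bottom. If the GPU
# flips to a new frame mid-readout, the displayed image contains rows
# from both frames -- that boundary is the visible tear.
ROWS = 1080
FRAME_A = ["old"] * ROWS  # rows of the previous frame
FRAME_B = ["new"] * ROWS  # rows of the frame that replaced it mid-scan

def scan_out(flip_at_row: int) -> list:
    output = []
    for row in range(ROWS):
        source = FRAME_A if row < flip_at_row else FRAME_B
        output.append(source[row])
    return output

screen = scan_out(flip_at_row=400)
print(screen[399], screen[400])  # old new  <- tear line between rows 399/400
```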


  • 5 weeks later...

Very nice thread, Glenwing. It explains so much in plain words.

 

I will definitely link people to this thread in the future, when they think their new 800 Hz TV will look insanely good when they launch CS:GO.

 

Bookmarked for good.



  • 2 months later...

I have a question that I don't think is covered. What is the difference between OLED and LED? From what I understand, the O stands for Organic. So I had the perception that OLED either used some kind of bioluminescence or just had a generally more environmentally friendly manufacturing process and lower power consumption. Basically, a lot better for the environment. But I might be embarrassingly wrong about this?


5 hours ago, AlveKatt said:

I have a question that I don't think is covered. What is the difference between OLED and LED? [...]

Not at all :) When you're talking about displays, they're totally different; they just both happen to involve light-emitting diodes. How the diodes are used in each type of display is completely different; it's not like OLED displays are the same thing as "LED" displays with the LEDs swapped out for a different type.

 

LED displays do not actually exist, it's just a variation of LCD technology. An LCD panel is like a transparent window, divided into tiny red/green/blue segments, with liquid crystals for each segment that control whether light passes through that segment or not. By controlling the amount of light passing through each colored segment you can control the "brightness" of it, and thus combine red green and blue at different brightnesses to create any color. But like I said, LCD panels are just basically transparent glass, they don't produce light. They can only block or allow light from some other source to pass through. So, LCD displays use an LCD panel in combination with a light source behind the panel. Originally they used fluorescent lamps, modern LCDs use LED strips with a few dozen LEDs on them to generate light. "LED" displays only use LEDs as a light source, not to form the image.

 

With OLEDs, you are correct that they're "just" another type of light-emitting diode, so they're fairly similar in effect to inorganic LEDs, though they have some properties that make them more suitable for different applications. OLED displays are not like "LED" displays (LED-backlit LCD displays), though. In OLED displays, the LEDs are not just used to provide light, but to form the image as well. Every red, green, and blue subpixel is an individual LED, so each pixel lights up directly and generates its own light. The image is formed by the LEDs themselves.

 

This design has a lot of advantages. In an LED-backlit LCD display, a central light source provides light for every pixel on the display, so the light cannot be lit for some pixels and darkened for others; the light is on all the time, and it's up to the LCD panel to block it from passing through any pixel that is supposed to be black. However, LCD panels cannot completely block the light, so even on a pure black screen some of it leaks through, which is why on a black screen in a dark room you can still see a glowing rectangle where the screen is. Since LCDs can't display true black, they have inherently poor contrast, and there's the possibility of manufacturing defects causing uneven backlight bleed as well.
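
Back-of-the-envelope version of why that leakage caps contrast (the luminance numbers are illustrative assumptions, not measurements of any real panel):

```python
white_nits = 250.0      # full-white luminance of a typical LCD
black_leak_nits = 0.25  # light leaking through "black" pixels

print(f"LCD static contrast ~ {white_nits / black_leak_nits:.0f}:1")  # ~1000:1

# An OLED black pixel is simply off, so the denominator is zero and
# the static contrast ratio is effectively infinite.
oled_black_nits = 0.0
print("OLED static contrast: infinite" if oled_black_nits == 0.0
      else f"{white_nits / oled_black_nits:.0f}:1")
```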

 

On OLED displays, each pixel provides its own light, so if a pixel is supposed to be black it can simply turn off. An OLED display showing a black image in a dark room would be invisible; it's indistinguishable from the display being powered off. OLED displays have inherently infinite contrast ratio and perfect uniformity. They also have very low motion blur, since LEDs can change brightness instantly, whereas on an LCD it takes some time for the liquid crystals to shift position when a pixel needs to change from blocking light to letting it pass.

 

In terms of the difference between LEDs and OLEDs themselves, OLEDs have a simpler design, which makes them feasible to scale down to smaller sizes; the density required to have an LED for every subpixel is not really possible with inorganic LEDs. OLEDs also produce a different quality of light: they are a surface light source, whereas inorganic LEDs are a point light source. This means the light emitted by OLEDs is softer and more diffuse, and you can look directly at an OLED light without being dazzled.


On 4/11/2016 at 11:27 PM, Glenwing said:

Not at all :) When you're talking about displays they're totally different, they just both happen to involve light-emitting diodes, but how the diodes are used in each type of display is totally different, it's not like OLED displays are just the same thing as "LED" displays, except swapping out the LEDs for a different type. [...]


Thank you for that awesome answer.
 

Where are all the OLED displays though? The only ones I can find are either huuuge TVs I wouldn't want on my desk as a computer screen, or extremely expensive special-purpose displays with pro features that are probably what make them so expensive.

 

I remember OLED TVs were announced at pretty much the same time as LED TVs; I even remember seeing a working demo in a news broadcast reporting on this new technology.

 

What happened? Have they been holding back OLED just to sell extra units of the inferior LED-based technology? Everywhere I look they say it is cheaper to manufacture, has better and sharper images, has faster colour-to-colour changes than LCD can ever have, and, to boot, draws less power. I mean, either there have been some serious unmentioned problems with (or exaggerations about) this technology, or there is some fishy marketing scam going on: make us buy LCD first, then LED LCD, and then, once everyone already owns LED TVs and monitors, introduce OLED.

 

I have a nine-year-old 24-inch ASUS LCD that is still really good. It feels like I should stick with it until OLED displays become available, but I would love your thoughts on that. I am really drooling over those curved screens with a 21:9 aspect ratio. I think a 29-inch would have about the same height as my 24-inch 16:9, at least according to the measurements listed on the webstores. (Any taller and I can't just raise my eyes and look at the trees outside my window; I kind of need that for my sanity.)
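
Quick sanity check of that height comparison, treating the panels as flat and ignoring bezels:

```python
# Height from diagonal and aspect ratio, via Pythagoras.
from math import hypot

def panel_height(diagonal_in: float, ar_w: float, ar_h: float) -> float:
    return diagonal_in * ar_h / hypot(ar_w, ar_h)

print(panel_height(24, 16, 9))  # ~11.8 in (my current 24" 16:9)
print(panel_height(29, 21, 9))  # ~11.4 in -- a touch shorter, close enough
```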

 

So should I wait for a 29-inch 21:9 OLED to become available, regardless of the time it takes? Or is there a problem that makes it smarter to go with an LED screen and skip the first generation of OLED displays?


On 12/29/2014 at 2:26 PM, GrimNeo said:

@Glenwing Can you post a topic about viewing distance for monitors and if people will really notice a difference for 4k. Like this article http://carltonbale.com/1080p-does-matter/ Most people don't have 20/20. Great article and thanks.


Great post dude. Fantastic read <3

 


3 hours ago, AlveKatt said:

Where are all the OLED displays though? The only ones I can find are either huuuge TVs I wouldn't want on my desk as a computer screen, or extremely expensive special-purpose displays... [...]

Right now OLED has longevity problems; screens degrade the more they are used. You can see it on store demo units, if you can find Samsung tablets or phones with OLED screens that have been on display for a while.

