Glenwing

Display Technology FAQ / Mythbuster

Recommended Posts

Excellent FAQ, very useful.

 

I would like to ask you something about the "1080p scaling on a 4K monitor" part of the FAQ. Since this FAQ was written a few years ago, I'm wondering whether more modern 4K monitors handle 1080p better (maybe without using interpolation, or something).

 

I'm about to buy a 4K monitor, but since I game a lot and I doubt I'll be able to play all games in 4K at very high settings with acceptable framerates (i7 4770K + GTX 1080), I wondered how a 4K monitor would handle 1080p. I remember how disappointed I was years ago when I bought a 1080p TV in a country where most TV broadcasts were in 480p (by the way, most channels are 480p and 720p now; only some premium channels are in 1080p, and public TV only uses 1080p for very few events, like sports). It looked like someone had placed a very thin white veil in front of the screen.

 

So, does the FAQ's answer still hold true for all 4K monitors, or are there newer 4K monitors that handle 1080p better? If there are, would you recommend a good 4K monitor, please?

On 26/10/2017 at 10:04 PM, Over said:

Excellent FAQ, very useful.

I would like to ask you something about the "1080p scaling on a 4K monitor" part of the FAQ. Since this FAQ was written a few years ago, I'm wondering whether more modern 4K monitors handle 1080p better (maybe without using interpolation, or something).

I'm about to buy a 4K monitor, but since I game a lot and I doubt I'll be able to play all games in 4K at very high settings with acceptable framerates (i7 4770K + GTX 1080), I wondered how a 4K monitor would handle 1080p. I remember how disappointed I was years ago when I bought a 1080p TV in a country where most TV broadcasts were in 480p (by the way, most channels are 480p and 720p now; only some premium channels are in 1080p, and public TV only uses 1080p for very few events, like sports). It looked like someone had placed a very thin white veil in front of the screen.

So, does the FAQ's answer still hold true for all 4K monitors, or are there newer 4K monitors that handle 1080p better? If there are, would you recommend a good 4K monitor, please?

I was actually about to ask the same questions. Frankly, I'm skeptical that simple 4:1 scaling still wouldn't work; I don't see why it wouldn't have been fixed by now. Companies make mistakes, but they aren't stupid, and their engineers are 100x more skilled than Linus, the people on these forums, and PC monitor reviewers who don't even own the monitors they review and usually know less than the people using them.

 

Anyway, it's still interesting, since it's a pinned post on a geek forum (lol).

 

I think it's probably fine, because if it weren't, everyone who owns a 4K monitor would have complained on forums about games looking bad at lower resolutions.

 

As for advice on a 4K monitor: the list of 4K gaming monitors is pretty small, and they are all premium products, so I don't think there's much advice to give here.

 

 

 

On 12/27/2014 at 6:47 PM, Glenwing said:

"Does 1920×1080 scale perfectly on 3840×2160 (4K UHD) monitors? What exactly happens when you run a non-native resolution on a display?"


The notion that 1080p content viewed on a 4K UHD display will always look just as sharp as if it were a native 1080p monitor is actually a misconception, admittedly one that I also held before 4K displays were really widely available and tested. However now we know better, as actual tests have shown this to be a myth. Theoretically it can be true; 1080p is an exact fraction of 4K and so could potentially be displayed natively at fullscreen. Each pixel could be represented perfectly by a block of 4 pixels on the 4K screen, so the input image could be displayed exactly as it's received without any kind of modification required. In reality though, this is usually not how these images are handled. It will depend on the display, and most 4K monitors won't scale 1080p perfectly.

Normally, when a display is fed an image at a resolution smaller than its physical/max resolution, it has to perform a process called interpolation to upscale the image to its max resolution, which basically means it has to approximate what the image would look like if it were at that higher resolution. If you had for example a 7×7 image and you wanted to show it on a 10×10 display fullscreen, the image would have to be modified in order to be displayed. The pixels in the 7×7 image aren’t going to line up evenly with the 10×10 grid on your display:

ColorInputGrid.png

Since each pixel on your 10×10 display can only be a single color, it’s impossible to display the image exactly like it appears above. In order to display anything you need to calculate a new 10×10 pattern that approximates the original as closely as possible:

ColorOutputGrid.png

As you can see, the approximation results in some pretty obvious blurriness, although if you stand far enough back you can still sort of make out a hint of the original pattern. Sort of. The negative effects of interpolation are especially noticeable with text:

EInterpolation.png
AInterpolation.png
TInterpolation.png

These examples use a very simplistic interpolation technique: overlaying the desired resolution on the image and calculating each new pixel using the “average color” that lies within each boundary. This is just to help visualize the concept of interpolation; monitors use more complex approaches that give better, sharper approximations in most situations. Unfortunately, these approaches also affect resolutions which are exact fractions of the physical resolution.
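For the curious, that simple “average color” method can be sketched in a few lines of NumPy (the function and helper names here are made up for illustration): each output pixel takes the area-weighted average of the source pixels that fall under it.

```python
import numpy as np

def overlap(a0, a1, b0, b1):
    """Length of the overlap between intervals [a0, a1) and [b0, b1)."""
    return max(0.0, min(a1, b1) - max(a0, b0))

def box_average_resize(img, out_h, out_w):
    """Resize a grayscale image by giving each output pixel the
    'average color' of the source area that lies under it."""
    in_h, in_w = img.shape
    out = np.zeros((out_h, out_w))
    for y in range(out_h):
        # Source-space interval covered by output row y
        y0, y1 = y * in_h / out_h, (y + 1) * in_h / out_h
        for x in range(out_w):
            x0, x1 = x * in_w / out_w, (x + 1) * in_w / out_w
            acc = wsum = 0.0
            for sy in range(int(y0), min(in_h, int(np.ceil(y1)))):
                wy = overlap(sy, sy + 1, y0, y1)
                for sx in range(int(x0), min(in_w, int(np.ceil(x1)))):
                    wx = overlap(sx, sx + 1, x0, x1)
                    acc += img[sy, sx] * wy * wx
                    wsum += wy * wx
            out[y, x] = acc / wsum
    return out
```

Scaling a 7×7 pattern to 10×10 with this produces exactly the kind of mixed-color pixels shown in the images above, while at an exact 2× factor each output pixel falls entirely inside a single source pixel, so the image comes out as clean solid 2×2 blocks.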

If monitors interpolated using the simple averaging method I used above, exact fractions like 1080p on 2160p actually would display natively, since each pixel in the original image would line up exactly with a 2×2 block at the new resolution, so every block would be a solid color and averaging it wouldn’t change anything. But most monitors use interpolation methods that do affect exact-fraction resolutions, and you can easily prove this to yourself by changing your desktop to such a resolution. If you’re using a 1920×1080 monitor, change to 960×540; if you’re on 1440p, change it to 1280×720, and so on. If it scales perfectly without interpolation, it should look quite blocky, especially with text, but you’ll most likely find that it looks rather fuzzy instead.
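A quick way to see why exact fractions can still come out soft: even in one dimension, linear (bilinear-style) interpolation samples between source pixels when upscaling by exactly 2×, producing in-between values where nearest-neighbor duplication would keep a hard edge. This sketch uses pixel-center sample positions, one common convention; real scalers vary.

```python
import numpy as np

# A 4-"pixel" row with a hard black/white edge
row = np.array([0.0, 0.0, 1.0, 1.0])

# Nearest-neighbor 2x: just duplicate each pixel; the edge stays hard
nearest = np.repeat(row, 2)

# Linear interpolation 2x using pixel-center sample positions
positions = (np.arange(8) + 0.5) * len(row) / 8 - 0.5
linear = np.interp(positions, np.arange(len(row)), row)

print(nearest)  # 0 0 0 0 1 1 1 1  -- hard edge preserved
print(linear)   # 0 0 0 0.25 0.75 1 1 1  -- in-between values: blurred edge
```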

Long story short: no, most 4K monitors do not scale 1080p with simple 4:1 pixel mapping (no interpolation), despite the fact that it’s an exact fraction of 3840×2160. Most monitors will still interpolate the image. It’s certainly possible for a monitor to display such resolutions natively, but it would have to be purposely designed not to interpolate them. They won’t be displayed natively simply by virtue of being an exact fraction.

If you still don't believe me you can read professional reviews of actual 4K monitors and verify that 1080p does not scale perfectly, but is in fact interpolated.

 

 

 

This reminded me of something I was wondering that's kinda the opposite of this. I have a monitor with a resolution of 1440x900 - it's a 16:10 monitor, so 16:9 video would play at a resolution of 1440x810 (1440/16*9=810). But when I play 1080p video on YouTube (1920x1080), it looks noticeably sharper than 720p video.

 

Why is this? 1440x900 is less than a quarter of the area of 1920x1080, and 1280x720 is much closer.

Posted · Original Poster
1 minute ago, xn--cr8h said:

This reminded me of something I was wondering that's kinda the opposite of this. I have a monitor with a resolution of 1440x900 - it's a 16:10 monitor, so 16:9 video would play at a resolution of 1440x810 (1440/16*9=810). But when I play 1080p video on YouTube (1920x1080), it looks noticeably sharper than 720p video.

 

Why is this? 1440x900 is less than a quarter of the area of 1920x1080, and 1280x720 is much closer.

A 1920×1080 source image has more detail, so a more accurate approximation can be made. 1280×720 being closer in size doesn't really give it a special advantage; maybe 720p scaled up to 1440×810 does look more like the original 720p image than the scaled 1080p looks like the original 1080p image, but looking closer to the original doesn't necessarily mean looking better when one of the originals is much worse than the other.

 

Let's put it this way, picking some random numbers: let's say the 720p image scaled up to 1440×810 looks 90% as good as the original 720p image, and the 1080p image scaled down to 1440×810 only looks 80% as good as the original 1080p image. If the original 720p only looks half as good as the original 1080p, then the scaled 1080p is still going to look better than the scaled 720p.
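Multiplying those made-up numbers out (these are the illustrative figures from the previous paragraph, not measurements):

```python
# Scaled quality = how good the source looks x how well it survives scaling
source_quality   = {"720p": 0.5, "1080p": 1.0}  # 720p looks half as good
scaling_fidelity = {"720p": 0.9, "1080p": 0.8}  # survival of scaling to 1440x810

for res in ("720p", "1080p"):
    print(res, source_quality[res] * scaling_fidelity[res])
# 720p -> 0.45, 1080p -> 0.8: the scaled 1080p still wins
```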

 

Also, playing YouTube videos at higher resolutions gives you a higher quality version of the video because less compression is used on the higher ones, so there are quality differences besides just the resolution change when watching different resolution versions of a video on YouTube.


My colleague keeps telling me that his apparently 9-year-old, 1080p, LED, 240 Hz "refresh rate" TV is better than any tech now. He claims that his is true 240 Hz with no software to fake the refresh rate, and that OLED displays look like shit compared to what he has. I would LOVE to send him this post, but I think it's too much information and will probably just get ignored.

 

Is there a very concise response that I can give him, that will be easily understandable, to explain why he's wrong? Or am I in the wrong here!?

Posted · Original Poster
51 minutes ago, inShaneity said:

My colleague keeps telling me that his apparently 9-year-old, 1080p, LED, 240 Hz "refresh rate" TV is better than any tech now. He claims that his is true 240 Hz with no software to fake the refresh rate, and that OLED displays look like shit compared to what he has. I would LOVE to send him this post, but I think it's too much information and will probably just get ignored.

 

Is there a very concise response that I can give him, that will be easily understandable, to explain why he's wrong? Or am I in the wrong here!?

9 years ago the newest version of HDMI was version 1.4. The data rate required for 1080p 240 Hz (which is around 14 Gbit/s) is almost twice as much as the maximum allowed by HDMI 1.4, which is 8.16 Gbit/s. It is not possible to transmit 240 frames of 1080p per second over HDMI 1.4. "240 Hz" is most likely backlight strobing, but even if the TV does show 240 frames then the extras are being generated by the TV (motion interpolation software), not from the original source.
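Roughly how that data rate works out, as a back-of-the-envelope sketch. The blanking totals below are assumed CVT-RB-style values, not the exact figures from the calculator links, so the result lands near the post's ~14 Gbit/s estimate rather than matching it exactly.

```python
# Data rate = total pixels per frame (incl. blanking) x refresh rate x bits/pixel
H_TOTAL = 2080   # assumed total horizontal pixels for 1920 active (CVT-RB-style)
V_TOTAL = 1111   # assumed total vertical lines for 1080 active
REFRESH = 240    # Hz
BPP     = 24     # 8 bits per channel, RGB

data_rate = H_TOTAL * V_TOTAL * REFRESH * BPP   # bits per second
hdmi_1_4_max = 8.16e9                            # HDMI 1.4 max data rate (bit/s)

print(f"{data_rate / 1e9:.1f} Gbit/s")           # ~13.3 Gbit/s with these timings
print(data_rate > hdmi_1_4_max)                  # True: far beyond HDMI 1.4
```

Whatever blanking figures you assume, the required rate comes out well above 8.16 Gbit/s, which is the point of the calculation.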

 

https://linustechtips.com/main/topic/729232-guide-to-display-cables-adapters-v2/?section=calc&H=1920&V=1080&F=240&format=ycbcr444&calculations=show&formulas=show

 

https://linustechtips.com/main/topic/729232-guide-to-display-cables-adapters-v2/?section=calc&mode=maxfreq&H=1920&V=1080&F=240&format=ycbcr444&calculations=show&formulas=show

 

On 1/25/2018 at 2:18 PM, Glenwing said:

9 years ago the newest version of HDMI was version 1.4. The data rate required for 1080p 240 Hz (which is around 14 Gbit/s) is almost twice as much as the maximum allowed by HDMI 1.4, which is 8.16 Gbit/s. It is not possible to transmit 240 frames of 1080p per second over HDMI 1.4. "240 Hz" is most likely backlight strobing, but even if the TV does show 240 frames then the extras are being generated by the TV (motion interpolation software), not from the original source.

 

https://linustechtips.com/main/topic/729232-guide-to-display-cables-adapters-v2/?section=calc&H=1920&V=1080&F=240&format=ycbcr444&calculations=show&formulas=show

 

https://linustechtips.com/main/topic/729232-guide-to-display-cables-adapters-v2/?section=calc&mode=maxfreq&H=1920&V=1080&F=240&format=ycbcr444&calculations=show&formulas=show

 

Thank you very much for this response! Will pass it along.


@Glenwing Would you mind if I copy/pasta your post from here to reddit? Giving you credit, of course, and also posting the link to this original post.


New PC: Ryzen 7 1700 @ 4Ghz - Corsair H60i- 16GB DDR4 Corsair LPX Vengeance @2933Mhz - EVGA GeForce GTX 980ti Classifed - 120 GB M.2 SSD (boot) - 5 TB Raid HDD's 

I run a sub-reddit for I.R.C. Section 1031 Tax Free Exchanges, save taxes on your next real estate exchange; /r/1031TaxExchange/

Always gaming, Always mining. 

 

Posted · Original Poster
1 hour ago, Xerora said:

@Glenwing Would you mind if I copy/pasta your post from here to reddit? Giving you credit, of course, and also posting the link to this original post.

This is a somewhat outdated post, I'm working on an updated version, so it might be better to wait for that :)

2 minutes ago, Glenwing said:

This is a somewhat outdated post, I'm working on an updated version, so it might be better to wait for that :)

Then might I suggest reposting the updated one on reddit as well? Too many flame wars get started because no one knows about the things you've highlighted here, and I love it when people realize they're wrong. xD :P



