Why Do Monitor Manufacturers Not Allow Native 1080p On 4K Displays?

Hiya Guys,

 

I was wondering if anyone could explain to me why monitor manufacturers do not allow native 1080p viewing on 4K displays? To me this seems like an obvious path forward when it comes to the migration process. I, like many other people here, play video games regularly and watch a lot of HD media. Since a 4K panel is basically a 2x2 grid of pixels for every single 1080p pixel, it would make perfect sense to let the display act like a 1080p display by treating each 2x2 pixel grid as a single pixel. This would allow a perfectly sharp 1080p image to be displayed. The reality is that watching 1080p media or playing games at 1080p on a 4K display is simply worse than doing so on a native 1080p display. I wish I could have the best of both worlds...

 

The reason I bring this up now is that we know Asus and Acer will soon be releasing 27-inch 4K 144 Hz HDR displays.

I personally would love a responsive 4K desktop to work on and to watch 4K movies on.

I just don't believe I should have to sacrifice image quality on 1080p media, or have to choose between framerate and image quality when deciding whether to game at 1080p or 4K.

 

It seems like such an obvious step moving forward that I wonder if I am missing something that prevents this from happening?

I see the future of display tech not being about "native" resolutions.

8K monitors could allow for 720p, 1080p, 1440p and 4k resolutions without blurring an image.
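Just to show the arithmetic works out, here's a quick Python sketch (purely illustrative, using only the standard resolution figures) checking the per-axis scale factor for each resolution against a 7680x4320 panel:

# Illustrative only: integer scale factors from common resolutions to an 8K (7680x4320) panel.
# If the factor is a whole number, each source pixel can map to an exact NxN block of panel pixels.
SOURCES = {
    "720p":  (1280, 720),
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}
PANEL_8K = (7680, 4320)

for name, (w, h) in SOURCES.items():
    fx, fy = PANEL_8K[0] / w, PANEL_8K[1] / h
    exact = fx.is_integer() and fy.is_integer() and fx == fy
    print(f"{name}: {fx:.0f}x{fy:.0f} panel pixels per source pixel" + (" (exact)" if exact else " (not exact)"))

720p comes out at 6x, 1080p at 4x, 1440p at 3x and 4K at 2x per axis, so all of them divide cleanly into 8K.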

Yet we still live in a world of compromise...

 

  


Can't you just make your games run in 1080p? If I understand you right, that would work just as well with minimal effort.


1 minute ago, EminentSun said:

3840/1080=3.5555555555555555555555555 =/= 2

Doesn't work like that.

4K is 3840x2160.

1080p is 1920x1080.

Each dimension is doubled: 1920x2 = 3840 and 1080x2 = 2160.

Therefore, 4K has four times the pixels of 1080p.



21 minutes ago, PixelTwitch said:

Hiya Guys,

 

I was wondering if anyone could explain to me why monitor manufacturers do not allow native 1080p viewing on 4K displays? To me this seems like an obvious path forward when it comes to the migration process. I, like many other people here, play video games regularly and watch a lot of HD media. Since a 4K panel is basically a 2x2 grid of pixels for every single 1080p pixel, it would make perfect sense to let the display act like a 1080p display by treating each 2x2 pixel grid as a single pixel. This would allow a perfectly sharp 1080p image to be displayed. The reality is that watching 1080p media or playing games at 1080p on a 4K display is simply worse than doing so on a native 1080p display. I wish I could have the best of both worlds...

The reason I bring this up now is that we know Asus and Acer will soon be releasing 27-inch 4K 144 Hz HDR displays.

I personally would love a responsive 4K desktop to work on and to watch 4K movies on.

I just don't believe I should have to sacrifice image quality on 1080p media, or have to choose between framerate and image quality when deciding whether to game at 1080p or 4K.

 

It seems like such an obvious step moving forward that I wonder if I am missing something that prevents this from happening?

I see the future of display tech not being about "native" resolutions.

8K monitors could allow for 720p, 1080p, 1440p and 4k resolutions without blurring an image.

Yet we still live in a world of compromise...

 

  

Since most of the time scaling is done by the GPU, it's honestly not that important for monitors to "support" it; it's something NVIDIA and AMD could add in a driver update.

 

What you are looking for is nearest-neighbor upsampling, which gives native-quality results for lower resolutions when the native resolution is an exact multiple of them, but doesn't look so great at any other resolution. Monitors use bilinear or bicubic scaling, which works reasonably well at any resolution. What you would want is a special exception: when certain resolutions are used, switch to a different scaling algorithm. Likely this just isn't something monitor/TV manufacturers have spent much time looking into. There isn't much consumer awareness of details like this, so I imagine there isn't much discussion on the topic. If it starts getting more attention, maybe they'll look into it, but I think it's more likely NVIDIA/AMD will implement it first.
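To make the contrast concrete, here is a minimal NumPy sketch of the nearest-neighbor case (purely illustrative; the function name is made up and this is not what any particular monitor or driver actually runs). At an exact 2x ratio, each 1080p pixel simply becomes a 2x2 block of identical 4K pixels, so no colors are blended and nothing gets blurred:

import numpy as np

def upscale_nearest_2x(frame: np.ndarray) -> np.ndarray:
    # Replicate every pixel into a 2x2 block (1920x1080 -> 3840x2160).
    # frame is (height, width, channels); every output pixel is an exact copy
    # of a source pixel, so a 1080p image stays perfectly sharp.
    return np.repeat(np.repeat(frame, 2, axis=0), 2, axis=1)

# Example: a 1080p RGB frame becomes a 4K frame with identical content.
frame_1080p = np.random.randint(0, 256, size=(1080, 1920, 3), dtype=np.uint8)
frame_4k = upscale_nearest_2x(frame_1080p)
assert frame_4k.shape == (2160, 3840, 3)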


4 minutes ago, SCGazelle said:

Can't you just make your games run in 1080p? If I understand you right, that would work just as well with minimal effort.

You can run a game at 1080p, but the monitor/GPU still scales the image up to fill the 4K panel. This means that 1080p on a 4K display has a horrible blurred effect, much like the effect you get when running 720p on a 1080p monitor.

 

2 minutes ago, EminentSun said:

3840/1080=3.5555555555555555555555555 =/= 2

Is this a joke?

3840 = 1920 x 2

2160 = 1080 x 2

so

3840x2160 = 1920x1080 doubled in each dimension

2x2

You are confusing width with height.

 


Part of the reason is that we really haven't moved on from the way data was transmitted to analog displays. DVI is essentially a digital form of VGA, and the basic form of VGA included the following signals:

  • A vertical sync to tell the display to return to the top.
  • A horizontal sync to tell the display to return to the left
  • And a stream of data

There's no indication here of what the resolution of the image really is. Also, the rate at which data is sent isn't constant; for example, over DVI, 1920x1080 is sent at a different clock speed than 1440x900. So all the display controller can do in this case is stretch the signal out to the native resolution, regardless of whether the incoming resolution divides cleanly into the native one. And doing anything more may introduce significant input lag.
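To put numbers on the "different clock speed" point, here is a minimal sketch. The 1080p60 totals below are the standard CEA-861 timing; totals for any other mode (such as 1440x900) would come from the VESA DMT/CVT tables, which is exactly why every mode ends up with its own pixel clock rather than a resolution label in the signal:

# Illustrative only: the link carries pixels plus hsync/vsync; the "resolution"
# is implied by the timing totals (active + blanking), not signalled as metadata.
def pixel_clock_hz(h_total: int, v_total: int, refresh_hz: float) -> float:
    # Pixel clock = total pixels per line x total lines per frame x refresh rate.
    return h_total * v_total * refresh_hz

# 1920x1080 @ 60 Hz (CEA-861 timing): 2200 x 1125 totals -> 148.5 MHz
print(pixel_clock_hz(2200, 1125, 60) / 1e6)   # 148.5

# A mode like 1440x900 has its own blanking totals from the VESA tables,
# so it arrives at the display at a completely different clock rate.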

 

I'm not privy to how DisplayPort does this, because it moves away from the constant data stream idea.


4 minutes ago, Glenwing said:

Since most of the time scaling is done by the GPU, it's honestly not that important for monitors to "support" it; it's something NVIDIA and AMD could add in a driver update.

 

What you are looking for is nearest-neighbor upsampling, which gives native-quality results for lower resolutions when the native resolution is an exact multiple of them, but doesn't look so great at any other resolution. Monitors use bilinear or bicubic scaling, which works reasonably well at any resolution. What you would want is a special exception: when certain resolutions are used, switch to a different scaling algorithm. Likely this just isn't something monitor/TV manufacturers have spent much time looking into. There isn't much consumer awareness of details like this, so I imagine there isn't much discussion on the topic. If it starts getting more attention, maybe they'll look into it, but I think it's more likely NVIDIA/AMD will implement it first.

While not a perfect solution, I almost imagined a button on the monitor to switch between 1080p and 4K. While this may seem stupid on the surface, we already have something like it with G-Sync displays that let the monitor change its refresh rate on the fly. That's why I was surprised it hasn't already been done when the tech seems to already be there. Obviously it would require the monitor manufacturers to have a way to make Windows treat it as a native 1080p display, as well as the monitor treating itself as one. It just seems like it would be the best feature ever added to a 4K monitor :D especially with 4K 144 Hz knocking.


This would be amazing. I hate how some applications don't scale properly with 4K, so a switch to make the 4K panel basically 1080p native would be fantastic. 

 

Also I can now play watch dogs 2 on ultra at 1080p/55fps woo



I call 1080p 2K and 1440p 2.5K. Because 1080p x2 = UHD



9 minutes ago, JDE said:

I call 1080p 2K and 1440p 2.5K.

That's what they are.  1920 is roughly 2K pixels, and 2560 is roughly 2.5K pixels.



4 minutes ago, JDE said:

I call 1080p 2K and 1440p 2.5K. Because 1080p x2 = UHD

What a maverick making up them rules as you go along... I LIKE IT! 

 

That's what they are.  1920 is roughly 2K pixels, and 2560 is roughly 2.5K pixels.

 

I thought 2560 is closer to 3k


2 minutes ago, PixelTwitch said:

What a maverick making up them rules as you go along... I LIKE IT! 

 

 

I thought 2560 is closer to 3k

2560 is closer to 2500 than 3000



1 minute ago, PixelTwitch said:

What a maverick making up them rules as you go along... I LIKE IT! 

 

 

I thought 2560 is closer to 3k

https://linustechtips.com/main/topic/691408-2k-does-not-mean-2560×1440/

You should read that



Aha, now I figured out something: the transistors in TFT panels act like capacitors to maintain voltage. In other words, they drain. So the controller must go from top to bottom, left to right, like a CRT display. So it's likely the interpolation a digital display panel does is a side effect of still trying to act like a CRT.

However, riddle me this: why does the monitor have to do the work? The GPU is perfectly capable of sending a native-resolution image containing a scaled-up version of its output. But GPUs still interpolate the image.


18 minutes ago, M.Yurizaki said:

Aha, now I figured out something: the transistors in TFT panels act like capacitors to maintain voltage. In other words, they drain. So the controller must go from top to bottom, left to right, like a CRT display. So it's likely the interpolation a digital display panel does is a side effect of still trying to act like a CRT.

However, riddle me this: why does the monitor have to do the work? The GPU is perfectly capable of sending a native-resolution image containing a scaled-up version of its output. But GPUs still interpolate the image.

On a TFT-controlled display panel like LCD or OLED, all the pixels in one row are refreshed simultaneously, although it does go from top to bottom one row at a time. This doesn't really have anything to do with interpolation though, because interpolation doesn't factor into the physical process. The controller sets the color of each pixel, and the physical pixels being addressed never change. The color each pixel needs to be is determined in processing after the controller receives the image data. From the source image, it determines which color each pixel on the display needs to be, through various interpolation methods if the resolutions don't match. The interpolation method used isn't related to how the physical pixels are addressed though.

 

Generally it's good for monitors to have scaling capabilities of their own as a backup to the GPU's scaling. If a device such as a console, Blu-ray player, or streaming box doesn't support scaling the source image itself, it's good for the monitor to be able to handle the signal anyway.
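For anyone curious what that blend-based scaling looks like, here is a minimal NumPy sketch of generic bilinear resampling (the function name is made up and real monitor scalers use their own, usually fancier, filters; this is only to show why a blended result goes soft while nearest-neighbor stays sharp at integer ratios):

import numpy as np

def resize_bilinear(src: np.ndarray, out_h: int, out_w: int) -> np.ndarray:
    # Generic bilinear resize: every output pixel is a weighted blend of up to
    # four neighbouring source pixels.
    in_h, in_w = src.shape[:2]
    # Map output pixel centres back into source coordinates.
    ys = (np.arange(out_h) + 0.5) * in_h / out_h - 0.5
    xs = (np.arange(out_w) + 0.5) * in_w / out_w - 0.5
    y0 = np.clip(np.floor(ys).astype(int), 0, in_h - 1)
    x0 = np.clip(np.floor(xs).astype(int), 0, in_w - 1)
    y1 = np.clip(y0 + 1, 0, in_h - 1)
    x1 = np.clip(x0 + 1, 0, in_w - 1)
    wy = np.clip(ys - y0, 0.0, 1.0)[:, None, None]
    wx = np.clip(xs - x0, 0.0, 1.0)[None, :, None]
    top = src[y0][:, x0] * (1 - wx) + src[y0][:, x1] * wx
    bot = src[y1][:, x0] * (1 - wx) + src[y1][:, x1] * wx
    return top * (1 - wy) + bot * wy

# A single bright pixel stays one crisp 2x2 block under nearest-neighbour doubling,
# but bilinear smears it across the surrounding output pixels.
src = np.zeros((4, 4, 3), dtype=np.float32)
src[1, 1] = 255.0
out = resize_bilinear(src, 8, 8)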


29 minutes ago, Glenwing said:

On a TFT-controlled display panel like LCD or OLED, all the pixels in one row are refreshed simultaneously, although it does go from top to bottom one row at a time. This doesn't really have anything to do with interpolation though, because interpolation doesn't factor into the physical process. The controller sets the color of each pixel, and the physical pixels being addressed never change. The color each pixel needs to be is determined in processing after the controller receives the image data. From the source image, it determines which color each pixel on the display needs to be, through various interpolation methods if the resolutions don't match. The interpolation method used isn't related to how the physical pixels are addressed though.

The way TFTs are addressed, though, is by setting a row line and a column line, so only one pixel can be addressed at a time. This is to reduce complexity: if you want to address any pixel individually, you need m x n lines rather than m + n lines (and having m x n lines also creates issues with crosstalk). The distinction between passive and active TFTs is whether or not they need a capacitor to maintain state; passive ones don't need a capacitor but active ones do. And since the behavior of a discharging capacitor is very similar to that of a fading phosphor, it's likely TFT LCDs are driven similarly to CRTs, and it's likely OLEDs are driven this way too.
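For scale, with a 4K pixel grid that difference looks roughly like this (illustrative arithmetic only, counting pixels rather than individual subpixels):

# m + n drive lines with row/column multiplexing vs m x n with one line per pixel
m, n = 3840, 2160
print(m + n)  # 6000 shared row/column lines
print(m * n)  # 8294400 individual lines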

 

Either way, the signaling used doesn't really say what the resolution is; for all the display controller knows, it's just getting a stream of data. Plus, it would be easier/faster (i.e., less input lag) to use one method of interpolation for any arbitrary resolution rather than try to detect whether a signal is a particular resolution and scale it accordingly.

 


1 hour ago, M.Yurizaki said:

The way TFTs are addressed, though, is by setting a row line and a column line, so only one pixel can be addressed at a time. This is to reduce complexity: if you want to address any pixel individually, you need m x n lines rather than m + n lines (and having m x n lines also creates issues with crosstalk). The distinction between passive and active TFTs is whether or not they need a capacitor to maintain state; passive ones don't need a capacitor but active ones do. And since the behavior of a discharging capacitor is very similar to that of a fading phosphor, it's likely TFT LCDs are driven similarly to CRTs, and it's likely OLEDs are driven this way too.

Either way, the signaling used doesn't really say what the resolution is; for all the display controller knows, it's just getting a stream of data. Plus, it would be easier/faster (i.e., less input lag) to use one method of interpolation for any arbitrary resolution rather than try to detect whether a signal is a particular resolution and scale it accordingly.

 

TFTs are addressed one row at a time; the controller drives all columns of subpixels simultaneously and just selects which row to activate.


The way things currently work isn't really what determines whether we have the ability to send a native 1080p frame to a display.

 

Provided that the GPU renders the image at 1080p and sends that image to the monitor, the monitor, through dedicated hardware, should in theory be able to 2x those pixels onto its display.

 

OR

 

The GPU, through its drivers, should be able to 2x a 1080p frame without any fancy interpolation and send it to a 4K monitor.

 

Regardless of how things work, I believe it's safe to say that the only thing limiting the capability I am asking for is either AMD/NVIDIA's or the display manufacturers' willingness to tackle the problem.

 

That said, for gaming especially, there is nothing stopping DirectX, Vulkan, or game developers from offering interpolation-free resolution scaling in their products.


6 hours ago, PixelTwitch said:

The way things currently work isn't really what determines whether we have the ability to send a native 1080p frame to a display.

Provided that the GPU renders the image at 1080p and sends that image to the monitor, the monitor, through dedicated hardware, should in theory be able to 2x those pixels onto its display.

But again, the signal itself usually has no real indication of what its resolution should be. All the display controller knows is that it's getting some data that it should throw on the screen. Anything you add on top of that can cause significant input lag (significant being 16+ ms), which is something people seem to have found when running non-native resolutions on a monitor.

 

So I think this is a problem that display manufacturers don't need to address, or rather, they shouldn't address.


Maybe I'm missing something, but AFAIK when the display resolution is an exact multiple of the rendered resolution, the blurriness should be quite modest (i.e. it should look a lot better than when it isn't).

 

Also, the NVIDIA driver (on Linux) already offers the option to do the scaling on the GPU instead of the display. I have to admit I don't currently own a 4K display, so I haven't tried it in practice at that resolution... (I'm not sure if it allows "just multiply" scaling, which I believe is what the OP is after here.)

 

So I believe there should already be enough support to get decent 1080p on a 4K display? Or is there some other reason a 4K display would cause bad quality? Or are there displays that don't accept 1080p at all?


7 hours ago, M.Yurizaki said:

But again, the signal itself usually has no real indication of what its resolution should be. All the display controller knows is that it's getting some data that it should throw on the screen. Anything you add on top of that can cause significant input lag (significant being 16+ ms), which is something people seem to have found when running non-native resolutions on a monitor.

 

So I think this is a problem that display manufacturers don't need to address, or rather, they shouldn't address.

That makes no sense. If that were the case, technologies like G-Sync/FreeSync simply would not work. Same with active 3D over HDMI, high-refresh-rate interpolation, or HDR.

Also, when it comes to input lag, that is normally due to additional processing on either the GPU or the display.

This is why I am suggesting NATIVE 1080p on a 4K display and not simply scaling. 

 

 

