1080p on a 1440p monitor...

Paul Rudd

Why do people claim that it looks awful or bad or terrible or horrible? They claim it's distorted and has lots of blur. I sometimes game in 1080p on my 1440p monitor and have yet to notice anything wrong with it. It looks just as good as gaming in 1080p on my 1080p monitor.

 

Do different monitors have issues with resolution scaling or something?

 

I know the whole math with the pixels and everything but I'm telling you that my 1440p monitor has zero issues gaming in 1080p. Does anyone else have a crystal clear gaming experience in 1080p on their 1440p? Thoughts on this?

Gaming is different from most other things. There's motion, usually not a lot of text, and so on. Doing pretty much anything else does look pretty rough when running 1080p on a 1440p panel.

Phobos: AMD Ryzen 7 2700, 16GB 3000MHz DDR4, ASRock B450 Steel Legend, 8GB Nvidia GeForce RTX 2070, 2GB Nvidia GeForce GT 1030, 1TB Samsung SSD 980, 450W Corsair CXM, Corsair Carbide 175R, Windows 10 Pro

 

Polaris: Intel Xeon E5-2697 v2, 32GB 1600MHz DDR3, ASRock X79 Extreme6, 12GB Nvidia GeForce RTX 3080, 6GB Nvidia GeForce GTX 1660 Ti, 1TB Crucial MX500, 750W Corsair RM750, Antec SX635, Windows 10 Pro

 

Pluto: Intel Core i7-2600, 32GB 1600MHz DDR3, ASUS P8Z68-V, 4GB XFX AMD Radeon RX 570, 8GB ASUS AMD Radeon RX 570, 1TB Samsung 860 EVO, 3TB Seagate BarraCuda, 750W EVGA BQ, Fractal Design Focus G, Windows 10 Pro for Workstations

 

York (NAS): Intel Core i5-2400, 16GB 1600MHz DDR3, HP Compaq OEM, 240GB Kingston V300 (boot), 3x2TB Seagate BarraCuda, 320W HP PSU, HP Compaq 6200 Pro, TrueNAS CORE (12.0)

LCD displays have a native resolution, which is literally the physical pixel layout of the panel. Unlike something like a CRT, you can't really display any arbitrary resolution. The way LCDs display a non-native resolution is to interpolate it, which is to say they approximate what pixels can/should be displayed and map that onto the native resolution.

 

If you have a display with a 1440p native resolution and you display a 1080p source, it's still using the 1440p resolution, because it has to, but then it upsamples the 1080p pixels to fit. It's just bog-standard bilinear upscaling, though, so it's basically stuffing interpolated pixels into the image, which leads to artifacting.
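To make that concrete, here's a rough sketch of the idea in Python. It's an illustration only (the exact filter and edge handling vary by monitor and scaler), but it shows why the two pixel grids can't line up:

# Sketch of bilinear upscaling along one axis: a 1080-row source shown on a 1440-row panel.
SRC_ROWS = 1080
DST_ROWS = 1440

def source_blend(dst_row):
    """Which two source rows does a display row sample, and with what weights?"""
    pos = dst_row * (SRC_ROWS - 1) / (DST_ROWS - 1)  # fractional position in the source
    low = int(pos)
    frac = pos - low
    return (low, round(1 - frac, 3)), (low + 1, round(frac, 3))

for r in range(5):
    print("display row", r, "<-", source_blend(r))

# Because 1080/1440 = 3/4, most display rows land between two source rows and get a
# weighted mix of both; that blending is the softness/artifacting described above.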

 

If you're not paying that close attention or there's a lot of movement or fast paced action, you might not notice it, but if you pause on a frame and look at it, it will look pixelated and the edges will be off, etc. This effect bothers some people more than others, so it's still a YMMV thing. It definitely is a degraded image; it just depends on how much that personally bothers you.

CPU: AMD Ryzen 9 5900X · Cooler: Artic Liquid Freezer II 280 · Motherboard: MSI MEG X570 Unify · RAM: G.skill Ripjaws V 2x16GB 3600MHz CL16 (2Rx8) · Graphics Card: ASUS GeForce RTX 3060 Ti TUF Gaming · Boot Drive: 500GB WD Black SN750 M.2 NVMe SSD · Game Drive: 2TB Crucial MX500 SATA SSD · PSU: Corsair White RM850x 850W 80+ Gold · Case: Corsair 4000D Airflow · Monitor: MSI Optix MAG342CQR 34” UWQHD 3440x1440 144Hz · Keyboard: Corsair K100 RGB Optical-Mechanical Gaming Keyboard (OPX Switch) · Mouse: Corsair Ironclaw RGB Wireless Gaming Mouse

I'd say 1080p gaming is passable on a 1440p panel - it's not unplayable but it's hardly crystal clear.

 

Probably depends on your game though. If you're playing a strategy game or something with lots of static elements, it's probably far more noticeable than in a fast-paced FPS.

CPU: i7 4790k, RAM: 16GB DDR3, GPU: GTX 1060 6GB

i'd rather play windowed than non-native full screen, no matter the monitor.

 

 

5950x 1.33v 5.05 4.5 88C 195w ll R20 12k ll drp4 ll x570 dark hero ll gskill 4x8gb 3666 14-14-14-32-320-24-2T (zen trfc)  1.45v 45C 1.15v soc ll 6950xt gaming x trio 325w 60C ll samsung 970 500gb nvme os ll sandisk 4tb ssd ll 6x nf12/14 ippc fans ll tt gt10 case ll evga g2 1300w ll w10 pro ll 34GN850B ll AW3423DW

 

9900k 1.36v 5.1avx 4.9ring 85C 195w (daily) 1.02v 4.3ghz 80w 50C R20 temps score=5500 ll D15 ll Z390 taichi ult 1.60 bios ll gskill 4x8gb 14-14-14-30-280-20 ddr3666bdie 1.45v 45C 1.22sa/1.18 io  ll EVGA 30 non90 tie ftw3 1920//10000 0.85v 300w 71C ll  6x nf14 ippc 2000rpm ll 500gb nvme 970 evo ll l sandisk 4tb sata ssd +4tb exssd backup ll 2x 500gb samsung 970 evo raid 0 llCorsair graphite 780T ll EVGA P2 1200w ll w10p ll NEC PA241w ll pa32ucg-k

 

prebuilt 5800 stock ll 2x8gb ddr4 cl17 3466 ll oem 3080 0.85v 1890//10000 290w 74C ll 27gl850b ll pa272w ll w11

 

38 minutes ago, xg32 said:

i'd rather play windowed than non-native full screen, no matter the monitor.

100% agree. I think it looks terrible and notice it instantly. 

2 hours ago, BondiBlue said:

Gaming is different from most other things.

 

2 hours ago, Chris Pratt said:

It definitely is a degraded image

 

2 hours ago, tim0901 said:

it's not unplayable but it's hardly crystal clear.

 

2 hours ago, xg32 said:

i'd rather play windowed than non-native full screen

 

1 hour ago, rickeo said:

I think it looks terrible and notice it instantly.

Here's what I request all of you to do. Name 3-5 good games for me to run in 1080p on my 1440p monitor. And then tell me to go to a location in said game and literally look at what you would label as "not crystal clear or terrible or awful or bad or anything I can notice with my own eyes".

 

The reason I say 3-5 games is so that I'll possibly have at least one of the games you speak of.

 

Because I'm just not seeing it here, Lloyd. Every single game I have run in 1080p on a 1440p monitor is crystal clear.

5 minutes ago, Paul Rudd said:

Because I'm just not seeing it here, Lloyd. Every single game I have run in 1080p on a 1440p monitor is crystal clear.

We're all human. We all have different eyes. Anything on screen when I'm running my 1440p displays at 1080p looks blurry in some way, and I don't even have the best eyesight. Maybe 1080p on a 1440p display is perfectly clear to you, but it sure isn't to me. 

21 minutes ago, Paul Rudd said:

Here's what I request all of you to do. Name 3-5 good games for me to run in 1080p on my 1440p monitor. And then tell me to go to a location in said game and literally look at what you would label as "not crystal clear or terrible or awful or bad or anything I can notice with my own eyes".

 

The reason I say 3-5 games is so that I'll possibly have at least one of the games you speak of.

 

Because I'm just not seeing it here, Lloyd. Every single game I have run in 1080p on a 1440p monitor is crystal clear.

Again, it's not "crystal clear". It literally can't be, because of the nature of the display. It might be true that your eyes are not sensitive enough to notice a difference, but the difference is there nonetheless. There's no need to prove anything. You can't take a 1080p image and stretch it to 1440p without distorting the image and introducing artifacts. Period.

Just do any shooter at high refresh at non-native fullscreen and you'll see the "slight" blur... it might be OK for sightseeing stuff and anime games like Witcher and Tales of Arise.

11 hours ago, xg32 said:

i'd rather play windowed than non-native full screen, no matter the monitor.

Go fullscreen and set "integer scaling" inside nvidia control panel. That way you automatically get the biggest possible window that is scaled "pixel perfect" and the rest of your screen is black.
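For reference, here's a quick sketch of the arithmetic behind that (illustrative Python, not anything pulled from the driver). Integer scaling picks the largest whole-number multiple of the source resolution that still fits the panel:

# Largest pixel-perfect window an integer scaler can give you on a panel.
def integer_scaled(src_w, src_h, panel_w, panel_h):
    scale = min(panel_w // src_w, panel_h // src_h)  # biggest whole-number factor that fits
    return scale, (src_w * scale, src_h * scale)

print(integer_scaled(1920, 1080, 2560, 1440))  # (1, (1920, 1080)): a 1:1 1080p window with black borders
print(integer_scaled(1920, 1080, 3840, 2160))  # (2, (3840, 2160)): on a 4K panel, 1080p scales cleanly to full screen

Which is also why 1080p can never be pixel-perfect at full screen on a 1440p panel: the next step up, 2x, would need a 3840x2160 display.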

If someone did not use reason to reach their conclusion in the first place, you cannot use reason to convince them otherwise.

10 hours ago, BondiBlue said:

Maybe 1080p on a 1440p display is perfectly clear to you, but it sure isn't to me. 

Gaming in 1080p on my 1440p monitor looks exactly the same as gaming in 1080p on the 1080p monitor to my left. Since you did not list any games for me to see what your eyes see, I'll take that as you having no games in mind that show me what your eyes see.

10 hours ago, Chris Pratt said:

Again, it's not "crystal clear". It literally can't be, because of the nature of the display. It might be true that your eyes are not sensitive enough to notice a difference, but the difference is there nonetheless. There's no need to prove anything. You can't take a 1080p image and stretch it to 1440p without distorting the image and introducing artifacts. Period.

Gaming in 1080p on my 1440p monitor looks as crystal clear as gaming in 1080p on the 1080p monitor to my left. Until you inform me of any games where your eyes notice a difference, I'll never see what your eyes see. And I have never seen any distortion and I have never seen any artifacts in any games in 1080p on my 1440p monitor. Period.

8 hours ago, xg32 said:

Just do any shooter at high refresh at non-native fullscreen and you'll see the "slight" blur... it might be OK for sightseeing stuff and anime games like Witcher and Tales of Arise.

Which particular shooter? Which particular location in said shooter? And tell me exactly what your eyes see as terrible or awful. Tell me exactly what to put my eyes on to see what your eyes see.

1 hour ago, Stahlmann said:

Go fullscreen and set "integer scaling" inside nvidia control panel. That way you automatically get the biggest possible window that is scaled "pixel perfect" and the rest of your screen is black.

Why?

9 minutes ago, Paul Rudd said:

Gaming in 1080p on my 1440p monitor looks as crystal clear as gaming in 1080p on the 1080p monitor to my left. Until you inform me of any games where your eyes notice a difference, I'll never see what your eyes see. And I have never seen any distortion and I have never seen any artifacts in any games in 1080p on my 1440p monitor. Period.

So what the hell was the point of this topic in the first place? Multiple people have told you the situation, and it's indisputable. The 1080p image is upscaled, and any form of upscaling degrades image quality. These are simple facts. That you don't notice a difference speaks only to the quality of your vision; it doesn't change reality.

6 minutes ago, Chris Pratt said:

So what the hell was the point of this topic in the first place?

To see what you're describing as what you're describing. I have yet to see it.

6 minutes ago, Chris Pratt said:

Multiple people have told you the situation, and it's indisputable.

But no one has yet provided a single game for me to see what your eyes see.

6 minutes ago, Chris Pratt said:

any form of upscaling degrades image quality.

Which I have yet to see. My eyes have seen no degradation that you speak of.

6 minutes ago, Chris Pratt said:

These are simple facts.

No, they are not.

6 minutes ago, Chris Pratt said:

That you don't notice a difference speaks only to the quality of your vision; it doesn't change reality.

You see, that's what I also don't understand. People start to question my eyesight instead of providing me something their eyes see as awful or terrible.

 

I have a 1440p monitor right in front of me. Tell me to lay my eyes on something in 1080p on this 1440p monitor that your eyes see as distorted.

Just now, Paul Rudd said:

But no one has yet provided a single game for me to see what your eyes see.

We all have different eyes. How the heck can you see exactly what we see just based on looking at a game?

1 minute ago, Paul Rudd said:

Which I have yet to see. My eyes have seen no degradation that you speak of.

Your eyes may not see a difference.

1 minute ago, Paul Rudd said:

No, they are not.

Yes they are. You cannot take a 1920x1080 image or video and scale it up to 2560x1440 without some sort of effect on the image or video.

2 minutes ago, Paul Rudd said:

You see, that's what I also don't understand. People start to question my eyesight instead of providing me something their eyes see as awful or terrible.

Because there's no way you can see through our eyes. 

 

Personally I have pretty bad eyesight. My left eye is terrible. I can't read with that eye past a distance of a couple feet. But still, I can clearly see a difference between native 1440p content and upscaled 1080p content on a 1440p display. Maybe you can't, and that's fine, but it does call into question your eyesight. 

3 minutes ago, BondiBlue said:

We all have different eyes. How the heck can you see exactly what we see just based on looking at a game?

By laying my eyes on what your eyes describe as distorted.

4 minutes ago, BondiBlue said:

Your eyes may not see a difference.

My eyes also may see a difference. That's what I'm doing my best to achieve here.

5 minutes ago, BondiBlue said:

Yes they are. You cannot take a 1920x1080 image or video and scale it up to 2560x1440 without some sort of effect on the image or video.

So show me this effect. Tell me something, anything, to lay my eyes on that your eyes see as being affected. I'm going to be honest about what I see when you tell me to look at it. I'm not going to make anything up. If you have a 1440p monitor, put it in 1080p, find something distorted, then tell me to look at that exact something.

7 minutes ago, BondiBlue said:

Because there's no way you can see through our eyes.

I can try.

7 minutes ago, BondiBlue said:

I can clearly see a difference between native 1440p content and upscaled 1080p content on a 1440p display.

What content? Give me this specific content, describe it to me. And I'll look at it.

8 minutes ago, BondiBlue said:

it does call into question your eyesight.

So put it to the test. Provide this distortion to me. I'd love to see it.

17 minutes ago, Paul Rudd said:

.

I was just playing Borderlands 3 at the wrong resolution after a reinstall; the textures are blurry and the text is even worse. It's not as noticeable in Cyberpunk, but it's there. Generally it's more noticeable in high-fps shooters; it's just blurry compared to native. But if you don't notice it, then there isn't a problem.

1 hour ago, Paul Rudd said:

To see what you're describing as what you're describing. I have yet to see it.

But no one has yet provided a single game for me to see what your eyes see.

Which I have yet to see. My eyes have seen no degradation that you speak of.

No, they are not.

You see, that's what I also don't understand. People start to question my eyesight instead of providing me something their eyes see as awful or terrible.

 

I have a 1440p monitor right in front of me. Tell me to lay my eyes on something in 1080p on this 1440p monitor that your eyes see as distorted.

You're missing the point entirely. Everyone's eyes are different. The incontrovertible part is that there is image degradation. It's entirely possible *you* can't see any difference, but that doesn't mean there isn't one, or that other people wouldn't notice. The reason I said this is a fact is that it's a function of LCD display technology. A 1440p LCD display very literally cannot display 1080p. It just can't. Instead, it upscales 1080p to 1440p to display at its native resolution: 1440p. Again, incontrovertible facts.

 

Giving you examples is an exercise in futility, because 1) literally everything is an example and 2) you obviously are incapable of perceiving the difference anyways, or you wouldn't be arguing so belligerently. None of that changes simple reality though: 1080p upscaled to 1440p is degraded.

 

At this point, any further discussion is moot. So, believe what you want and go on your merry way, or kindly direct further arguments to the nearest wall, because it will do you as much good.

1 minute ago, Chris Pratt said:

you obviously are incapable of perceiving the difference anyways, or you wouldn't be arguing so belligerently.

You don't know that. Especially if right now, you own a 1440p monitor. And you put that 1440p monitor in 1080p resolution. And then you look at something, whatever you want, and you see it as distorted. Then all you have to do is provide exactly what you see as distorted to me, so that I can then put it in 1080p on my 1440p monitor and see what your eyes are seeing.

5 minutes ago, Chris Pratt said:

At this point, any further discussion is moot. So, believe what you want and go on your merry way, or kindly direct further arguments to the nearest wall, because it will do you as much good.

My point exactly. You apparently cannot provide anything for me to see what you describe. It's that simple. Why can't you provide this? My guess is that you think, in your mind, that when I see what you're describing, I'm going to just say, "there's nothing wrong with that" or "I see no degradation". But in reality, if what you're claiming is indeed true, then what I would actually see and then say to you would be, "ohhhhhh, I see now, I see what you're talking about".

 

I would like to do that. And the fastest way for me to do that is for you to simply provide anything for me to see that shows what you're claiming. NOT a bunch of words saying what I'd see. I'd like to see what you're claiming as degraded. What is it? Is it everything in 1080p on my 1440p monitor? Does this mean I can simply load up any game in 1080p on my 1440p monitor, look at it, and then relay to you what I see? Is that what you're saying? Because I can do that. I can load up any game in 1080p on my 1440p monitor and describe to you what I see.

1 hour ago, xg32 said:

Borderlands 3, Cyberpunk

I don't own either of these games. Would Borderlands 2 work?

1 hour ago, xg32 said:

if you don't notice it, then there isn't a problem.

While there isn't a problem, I still want to see why people describe 1080p on a 1440p monitor as bad. Or terrible. Or awful. I want to see that.

16 minutes ago, Paul Rudd said:

You don't know that. Especially if right now, you own a 1440p monitor. And you put that 1440p monitor in 1080p resolution. And then you look at something, whatever you want, and you see it as distorted. Then all you have to do is provide exactly what you see as distorted to me, so that I can then put it in 1080p on my 1440p monitor and see what your eyes are seeing.

My point exactly. You apparently cannot provide anything for me to see what you describe. It's that simple. Why can't you provide this? My guess is that you think, in your mind, that when I see what you're describing, I'm going to just say, "there's nothing wrong with that" or "I see no degradation". But in reality, if what you're claiming is indeed true, then what I would actually see and then say to you would be, "ohhhhhh, I see now, I see what you're talking about".

 

I would like to do that. And the fastest way for me to do that is for you to simply provide anything for me to see that shows what you're claiming. NOT a bunch of words saying what I'd see. I'd like to see what you're claiming as degraded. What is it? Is it everything in 1080p on my 1440p monitor? Does this mean I can simply load up any game in 1080p on my 1440p monitor, look at it, and then relay to you what I see? Is that what you're saying? Because I can do that. I can load up any game in 1080p on my 1440p monitor and describe to you what I see.

Dude. Perhaps go back and re-read my last post, because you apparently didn't comprehend a thing from it. You're talking about your own personal perception, and I'm talking about objective reality. Those two don't necessarily mesh. If you see no difference, great. Good for you. Enjoy your 1080p content stretched to 1440p to your heart's content and drop it.

 

The reason no one, including myself, has provided "examples" is that *any* 1080p content is an example. There's no "well, this particular thing is degraded when upscaled, but this other thing isn't." If you want an example, go into any image editing program and arbitrarily resize some image to 1.33x its size (the 1080p-to-1440p ratio). That's it right there. That's the very same thing the display is doing.
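If you'd rather script that experiment than click around an image editor, here's a minimal sketch using the Pillow library. The filename is a placeholder, and 4/3 is the 1080p-to-1440p ratio; bilinear resampling is roughly what a monitor's scaler does, though the exact filter varies:

# Resize a screenshot by 4/3 (the 1080p -> 1440p ratio) and compare it to the original.
from PIL import Image

src = Image.open("screenshot_1080p.png")  # placeholder name: any 1920x1080 screenshot
w, h = src.size
upscaled = src.resize((w * 4 // 3, h * 4 // 3), Image.BILINEAR)

src.show()       # the original
upscaled.show()  # the stretched copy

The difference shows up most clearly on text and hard edges.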

 

Just because you can't see the difference, apparently, doesn't mean it's not there.

4 minutes ago, Chris Pratt said:

go into any image editing program and arbitrarily resize some image to 1.33x its size (the 1080p-to-1440p ratio). That's it right there. That's the very same thing the display is doing.

Okay, so here is a random 2560 x 1440 picture of strawberries. I just randomly found it on Google and posted it to imgur. I then right-clicked on the picture on imgur and resaved it to my PC. I now have it open, and at the top of the picture it will allow me to zoom in as high as 100. Tell me how far to zoom in and then describe to me what to look for as degraded. Or allow me to describe what I see, either way.

2 hours ago, Chris Pratt said:

Just because you can't see the difference, apparently, doesn't mean it's not there.

Okay, I have discovered something. Would you like to see it?

I run Cyberpunk at 1080p on a 1440p monitor, as my PC's glory days are far behind it. Sure, it's not optimal, but once you're playing you don't really notice it. How well your monitor handles the upscale is probably a large factor here; they aren't all equal at this. It's far less noticeable than a poor frame rate, so if it's what you have to do to get a decent frame rate, it's worth the trade-off IMO. It's not like 3090s grow on trees, or even 3070s & 6700XTs right now.

 

Something to test it with is a YouTube video. Open the same vid twice, set one to 1080p and one to 1440p, and full screen them. The 1440p one is sharper, but the 1080p is far from unwatchable. And the 1080p video on a 1080p monitor would look less sharp than the 1440p video would on a 1440p monitor of the same size anyway, so how much of the loss in quality can be blamed on the non-native resolution is somewhat debatable.

 

I feel an idea for an LMG video coming on: can the LMG staff tell the difference between 1080p content on a 1080p and a 1440p monitor? And I guess 1440p content on a 1440p and a 4K monitor. Anthony probably could, but I wouldn't be shocked if over 50% couldn't.
