
Visualize 1920x1080 vs 2560x1440?

zindan

Hello, is there some way I can visually see the difference between two 27-inch monitors with different resolutions? They must be the same size, that's very important:


27" 2560x1440 vs 27" 1920x1080

thank you. 


Unfortunately, the difference between two monitors of the same size but different resolution comes down to the number of pixels per inch, which is impossible to represent accurately, because any comparison image would be scaled to the resolution of your own monitor. However, I will try to explain the difference through a small wall of text below.

 

The difference between the two monitors you describe is simple: the number of pixels. But what does this mean? It means that for every square inch of monitor there are more individual points of color, which leads to a sharper image. Essentially, the 1440p monitor will let you sit closer to it before you start to notice the "blocky" pixel artifacts that make up the images on the screen. So, if you sit 3 feet away from the 27-inch monitor, you might not see a major improvement from the 1440p monitor if your vision isn't great, i.e., worse than 20/20. However, as someone with really good eyes, I personally notice this "blocky" effect a little when I sit 3 feet away from my monitor, so your mileage may vary. Essentially, go to a store and look at a 1080p and a 1440p monitor one after the other at roughly the distance you will sit at. If you see a significant difference in the sharpness of text or of game visuals, I would spring for the 1440p panel if you can afford it. If not, get the 1080p one.
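If you want actual numbers, pixel density is easy to work out yourself: pixels along the diagonal divided by the diagonal size in inches. A quick Python sketch, just illustrative, assuming standard 16:9 panels:

from math import hypot

def ppi(width_px, height_px, diagonal_in):
    # pixels along the diagonal divided by the diagonal length in inches
    return hypot(width_px, height_px) / diagonal_in

print(ppi(1920, 1080, 27))   # ~81.6 PPI
print(ppi(2560, 1440, 27))   # ~108.8 PPI

So the 1440p panel packs roughly a third more pixels into every inch, which is exactly where the extra sharpness comes from.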

 

TLDR: 1440p means that the image is sharper, not more colorful or responsive. If you can see a difference at the appropriate viewing distance in a store, then think about getting the 1440p monitor.

 

Sorry for the text wall, but I hope this helped!



Optimal viewing distance for both is very similar, within a few inches at that size and distance.

 

You can measure how far your eyes are from the screen and then do the math (or Google it; there are many sites with charts on this, since knowing when an HD vs. a 4K TV makes sense for a given room is a common question).
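If you want to do that math yourself, the usual rule of thumb is that a 20/20 eye resolves detail down to about one arcminute, so individual pixels stop being distinguishable at roughly the pixel pitch divided by tan(1 arcminute). A rough Python sketch; the exact distances shift quite a bit with the acuity figure you assume:

from math import hypot, tan, radians

def max_sharp_distance_in(width_px, height_px, diagonal_in):
    # distance (inches) beyond which a 20/20 eye can no longer pick out single pixels
    pixel_pitch_in = diagonal_in / hypot(width_px, height_px)
    return pixel_pitch_in / tan(radians(1 / 60))   # 1 arcminute

print(max_sharp_distance_in(1920, 1080, 27))   # ~42 inches
print(max_sharp_distance_in(2560, 1440, 27))   # ~32 inches

In other words, closer than about 32 inches the extra pixels of the 1440p panel are visible; beyond about 42 inches you can no longer make out individual pixels even on the 1080p panel.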


Well, to keep things simple:

 

They're both 27" so in theory, the width and height of the rectangle with pixels is the same.

This means that the 2560x1440 squeezes more pixels in the same area, which means the pixels are smaller.

Smaller pixels are better for images, slightly less good for small text, and make no practical difference when it comes to playing movies or games.

Basically, small text may look a bit worse, or you may have to move a tiny bit closer to the screen to read it comfortably when you're going through a big chunk of text in a small font.

The added resolution is nice when you're editing a document, for example (you can have two pages side by side much more easily than at 1080p), and extra vertical height also helps - I often find myself favoring the 1920x1200 panel I have because of those extra hundred-or-so pixels (I run a double-height taskbar on that panel, so each app isn't shrunk to an icon and I can switch between apps much faster).

For games, the added resolution can improve visual quality, but it depends on the video card - it's a balance between the extra visual quality from the resolution and the lower framerate, because the extra pixels give the video card more work to do. You may actually have to go into the game settings and lower the visual quality a little just to keep framerates at a decent level, which takes back most of the visual improvement gained from the higher resolution. So it's one or two steps forward and half a step to a step back...
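To put a rough number on that trade-off (just the raw pixel counts; real framerates don't scale perfectly linearly with them):

pixels_1080p = 1920 * 1080           # 2,073,600 pixels per frame
pixels_1440p = 2560 * 1440           # 3,686,400 pixels per frame
print(pixels_1440p / pixels_1080p)   # ~1.78x more pixels to render

So, all else being equal, the video card has to shade roughly 78% more pixels per frame at 1440p, which is why the framerate takes a noticeable hit when you step up in resolution.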

 

I would also look into the type of panel. Basically, if one is true 10-bit and the other is 8-bit+FRC, or if it's 6-bit+FRC vs. true 8-bit, I personally would always choose the "true" panel. Some panels that do 2560x1440 have to resort to 6-bit+FRC or 8-bit+FRC to be able to update the screen 60 times a second or whatever... Most of the time you don't notice the loss of visual quality; these days the tricks that fool your eyes are quite good. If you're not into Photoshop or serious video editing (where you're dealing with color grading and color accuracy matters), you won't care whether the panel is "true" 8-bit or 10-bit. For games like CS or Overwatch, or competitive games in particular, it truly doesn't matter.
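For context on those bit depths, the raw arithmetic looks like this (FRC dithering blurs the line in practice, which is why the trick works as well as it does):

for bits in (6, 8, 10):
    levels = 2 ** bits        # shades per color channel
    colors = levels ** 3      # total displayable colors (R x G x B)
    print(f"{bits}-bit: {levels} levels per channel, {colors:,} colors")
# 6-bit: 64 levels per channel, 262,144 colors
# 8-bit: 256 levels per channel, 16,777,216 colors
# 10-bit: 1024 levels per channel, 1,073,741,824 colors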

 


31 minutes ago, VegetableStu said:

 

On second thought, this isn't a methodologically correct demonstration ,_, there's no other way than to actually see it IRL


 

[imgur comparison image: jPoXGBj.jpg]

good luck.

(open imgur image in new tab and zoom to 1:1)

(sit in reading distance and see if you can tell (NO CHEATING))

 

Interesting... a few of the trees in the right-hand picture look 'fuzzier' against the sky background, but it is hard to tell.

 

On my monitors at home, though, it's quite easy to see the pixel difference between 1440p and 1080p.

"And I'll be damned if I let myself trip from a lesser man's ledge"


15 minutes ago, VegetableStu said:

that image would only work on a 1440p display (to compare with a theoretical 1080p monitor of the same physical dimensions), lol ,_,

On a 1080p monitor I can see the right one isn't as sharp... if I get right up close.



2 hours ago, VegetableStu said:

In your case, imagine an 810p monitor of your size exists.

 

also SIT BACK

 

2 hours ago, NineEyeRon said:

On a 1080p monitor I can see the right one isn't as sharp... if I get right up close.

 

3 hours ago, Velcade said:

Interesting... a few of the trees in the right-hand picture look 'fuzzier' against the sky background, but it is hard to tell.

 

On my monitors at home, though, it's quite easy to see the pixel difference between 1440p and 1080p.

 

3 hours ago, mariushm said:

Well, to keep things simple: they're both 27", so the 2560x1440 squeezes more pixels into the same area, which means the pixels are smaller...

 

 

4 hours ago, Aelar_Nailo said:

Unfortunately, the difference between two monitors of the same size but different resolution comes down to the number of pixels per inch... TLDR: 1440p means that the image is sharper, not more colorful or responsive.

Thank you very much for your replies. I should have mentioned that I mean this in terms of gaming. I currently have an Acer Predator XB271HU (165 Hz IPS), but I am dying to try the new ASUS monitor, which also has G-Sync and 165 Hz at 27", except its resolution is 1920x1080. So I am wondering how the size of characters in game will differ, how my aim will differ, and so on.
The reason for this additional monitor is that I can't get a stable 165 fps with any card available in stores today, unfortunately. Not even the 2080 Ti can keep up with what I need...


Can’t you just change the resolution if it’s too high?



8 minutes ago, NineEyeRon said:

Can’t you just change the resolution if it’s too high?

No, I can't - nothing fits well if you play at 2560x1440. Changing the resolution looks really, really bad. I don't know how to explain it, but if you run 1080p on a 2560x1440 monitor it just looks totally awful :(


I'll tell you what - the 1440p 144 Hz Acer monitor on my Ryzen rig is like 100% better than my 1080p monitor when it comes to how vibrant and beautiful a game can look. It also comes down to the type of panel.


 


Just now, zindan said:

No, I can't - nothing fits well if you play at 2560x1440. Changing the resolution looks really, really bad. I don't know how to explain it, but if you run 1080p on a 2560x1440 monitor it just looks totally awful :(

Well, that's not exactly true ...

You can configure your video card's driver to run the game at 1080p but keep the resolution it sends to the monitor at 2560x1440, centering the image on the screen.

For example, on AMD video cards, in the Display section, set GPU Scaling to On and then set Scaling Mode to Center.

My first display is 1920x1200 - if I enable GPU scaling, set the scaling mode to Center, and then launch a game at 1680x1050, I get the game image centered in the middle of the screen with black bars around it, while the monitor keeps running at 1920x1200.

This way the monitor's own scaler (which is often poor quality) isn't used, and I get the performance of 1680x1050 in the game while the monitor runs at 1920x1200.

In the same way, you could have the monitor run at 2560x1440 while running the game at 2560x1080 or 2048x1280 (16:10) - you can add custom resolutions in Windows.

GPU scaling set to On with Center as the scaling mode results in no actual scaling, so it's basically free - no GPU performance is used.

You can also set the scaling mode to "Full Panel"; then the video card's scaler is used instead of the monitor's, which costs a tiny bit of performance because the video card resizes the image to the full panel size. But the scalers in video cards are usually very fast when dealing with factors like 1.5x, 1.75x, or 2x, and they have better quality.

So, for example, you could play the game at half resolution (1280x720) or at 0.75x (1920x1080), and scaling to full panel should be fast and good quality.
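To get a feel for how much rendering work each of those options saves relative to native 2560x1440 (a rough sketch; actual framerate gains vary by game):

native = 2560 * 1440
for w, h in ((2560, 1080), (2048, 1280), (1920, 1080), (1280, 720)):
    print(f"{w}x{h}: {w * h / native:.0%} of native pixels")
# 2560x1080: 75% of native pixels
# 2048x1280: 71% of native pixels
# 1920x1080: 56% of native pixels
# 1280x720: 25% of native pixels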

 


 


9 hours ago, mariushm said:

Well, that's not exactly true... You can configure your video card's driver to run the game at 1080p but keep the resolution it sends to the monitor at 2560x1440, centering the image on the screen...

 

Ever since AMD tricked me with that point-system shit, I am not getting AMD. I have Nvidia.


56 minutes ago, zindan said:

Ever since AMD tricked me with that point-system shit, I am not getting AMD. I have Nvidia.

I was just giving an example. Nvidia should have something similar in its settings panel; it may be named differently, but it does the same job.

