4K vs 1440p at 27 inches (new PC)

tiny_disaster

Forget the faster Hz, forget every other difference. Assume two monitors identical in every aspect (IPS, G-Sync, etc.) except one: the first is 4K and the other is 1440p, both 27 inches. Is there a big difference visually?

I'm building a new PC, and every time I bring up this question the topic changes to whether my PC can handle 4K, or whether 144Hz is much better or not. So set all that aside.

I just want to understand how big a difference 4K makes compared to 1440p on a 27-inch monitor. I have seen gaming at 1440p and it looks excellent; how much of a jump are we talking about here?


reapersivan

I see a minor difference from 1440p to 4K. Please don't get a 4K 144Hz monitor. It's hard to reach 60fps maxed out in very demanding games; you'll need a very powerful setup, or you'll have to keep waiting for stronger GPUs. I play at 4K but can barely hit 60fps in some games. I know my rig bottlenecks, but I'll upgrade later.
 


tiny_disaster

6 minutes ago, reapersivan said:

snip

Assume my PC can run it at 60. If the difference is minor, I can get the 1440p and use the extra Hz; if not, I'll stick with 4K, since I can always SLI or upgrade GPUs in the future. Got another question for you: that CX PSU of yours, are you overclocking on it? Is it stable?


reapersivan

7 minutes ago, tiny_disaster said:

snip

No OC at all. Surprised it hasn't exploded yet. I need to switch it out.


Zyndo

17 minutes ago, tiny_disaster said:

I have seen gaming at 1440p and it looks excellent; how much of a jump are we talking about here?

2560x1440 ≈ 3.7 megapixels

3840x2160 ≈ 8.3 megapixels

 

4K has 2.25x the pixel count of 1440p, which at the same 27-inch size works out to 1.5x the linear pixel density (roughly 163 PPI vs 109 PPI), so it should look noticeably sharper. Whether your particular eyes are sensitive enough to pick up on it is anyone's guess; obviously there is a limit to what the human eye can resolve, and that varies a bit from person to person. Side by side, you could probably pick out the 4K with minimal effort. But if all you had in front of you was the 1440p, and all you were used to was 1080p, you would mostly just notice how much sharper it is compared to what you're used to.
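To put numbers on that, here's a quick back-of-the-envelope sketch (my own illustration, nothing from a spec sheet):

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    # Pixels per inch: diagonal pixel count divided by diagonal size.
    return math.hypot(width_px, height_px) / diagonal_in

for name, w, h in [("1440p", 2560, 1440), ("4K", 3840, 2160)]:
    print(f'{name}: {w * h / 1e6:.1f} MP, {ppi(w, h, 27):.0f} PPI at 27"')

# Prints: 1440p: 3.7 MP, 109 PPI / 4K: 8.3 MP, 163 PPI.
# That's 2.25x the pixels, but only 1.5x the linear density, and linear
# density is what tracks perceived sharpness.
```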

 

The other thing to take into account is how much GPU power you're going to need to drive 4K at any reasonable framerate. Even a GTX 1080 isn't enough to hit 60fps at max or near-max settings in modern AAA games. A GTX 1080 should still manage smooth framerates in most games at most settings, so keep that in mind; but if you want 60fps 4K gaming, you're going to need something stronger than a GTX 1080, look into SLI setups, or be prepared to turn some settings down.

 

Lastly, you also have to consider things like UI scaling when it comes to 4K. At the same screen size, 4K has twice the linear resolution of 1080p, so unscaled on-screen elements like text and icons appear half as wide and half as tall, about a quarter of the area they would occupy on a 1080p monitor. A lot of applications, games, and Windows itself have UI scaling options you can use to make them bigger, but not everything offers this, and not everything scales up as cleanly as you might like. You will still run into this at 1440p, but to a much lesser extent.
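To make the quarter-of-the-area point concrete, a tiny sketch (the 24px icon is a made-up example; the PPI values are rounded):

```python
# Physical size of a fixed-pixel-size UI element on different 27" panels.
PANELS = {"1080p": 82, "1440p": 109, "4K": 163}  # approx PPI at 27 inches
ICON_PX = 24  # an icon drawn 24 pixels tall, with no UI scaling applied

for name, panel_ppi in PANELS.items():
    print(f"{name}: {ICON_PX / panel_ppi * 25.4:.1f} mm tall")

# ~7.4 mm at 1080p, ~5.6 mm at 1440p, ~3.7 mm at 4K: on the 4K panel the
# unscaled icon is half the height of the 1080p case, a quarter the area.
```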

 

 

I would recommend gaming at 1440p unless you specifically intend to get GPUs beefy enough to run 4K at a high framerate; 1440p is certainly a lot more affordable, too. Money and power aside, 1440p at high refresh is widely touted as the best gaming experience you can have: higher refresh rates directly reduce input lag, and higher framerates contribute to the look and feel of your game. You may not think 60fps is a big deal, but once people taste the sweet nectar of 100+ fps gaming, they're usually hooked for life. I've never tried it myself, but I'm sure there's a reason for that! =)

 

My recommendation would be: if you're big into gaming, go 1440p. If you already have a 4K monitor (which it sounds like you do), just take the money you were going to spend on a high-quality 1440p monitor and put it toward GPUs good enough to run 4K.


tiny_disaster

3 minutes ago, Zyndo said:

snip

Thank you for the informative reply! I currently have a single EVGA GTX 1080 SC, and I haven't bought a monitor yet. My eyes, fortunately (unfortunately?), are about as good as human eyes get, 20/8, which is considerably above average. Too bad the same can't be said about my hearing, haha; that's what headphone use gets you!

I know a GTX 1080 will struggle with 4K, but the way I thought of it is that I can turn settings down slightly in FPS games, and use G-Sync to keep things smooth in single-player or RPG games when I'm above 50fps. I have seen 144Hz, and while there is a difference, it's not as big as the marketing makes out. So I'm trying to weigh the 4K difference: if it's big, I won't sacrifice it for 144Hz; if it's small, the sacrifice may be worth it. I will upgrade GPUs or SLI in the future, so 4K may be more future-proof, while 1440p is more practical right now. I'm so lost, man, I have no idea what to get!


Zyndo

2 minutes ago, tiny_disaster said:

snip

Well, long term, and if money isn't a HUGE issue for you, you may be better off getting a 4K monitor then, especially if you have keen eyes. The Titan P and 1080 Ti are likely to be the first single GPUs able to hit that magical 60fps at 4K (although that is largely speculation at this point; they haven't even been announced yet, so who knows how good they'll actually be). BUT if you intend to upgrade at some point, you could get the 4K now, settle for high-to-ultra settings, and be totally fine; then, when something really powerful comes out, or if you want to SLI 1080s, you could do that and truly take advantage of your monitor. And since your monitor would be G-Sync, perhaps you won't even need to go that route: if you're into casual gaming, your 1080 should keep framerates smooth enough even at max or near-max settings (although you may only get 30-40fps tops), and G-Sync can handle the rest of the work to make the experience feel seamless.


tiny_disaster

28 minutes ago, Zyndo said:

snip

I don't mind playing single-player games like Dark Souls at a little below 60fps with G-Sync on. Most competitive games aren't demanding anyway, and I'd guess playing those at 60fps with G-Sync is still better than plain 60Hz.

The 1440p 144Hz IPS G-Sync costs as much as the 4K 60Hz IPS G-Sync here! I'm having a hard time justifying that.

I can get a 1440p 144Hz TN (a good TN) without G-Sync for much less, though. Do you think G-Sync makes a difference on such high-refresh monitors?


Zyndo

G-Sync is pretty much DESIGNED with high fps in mind. If you have a 60Hz panel and can keep 60fps pinned all day long no matter what, then G-Sync won't do much of anything for you (except reduce the input lag you'd otherwise get from Vsync). Basically, what G-Sync does (other than slightly reducing Vsync-induced input lag) is adjust the monitor's refresh rate on the fly, so that when you're not at the monitor's fps cap, or when your fps fluctuates heavily, you don't get gameplay stutters.

 

With normal Vsync at 60Hz, if you're NOT at 60fps for any reason, one of two things happens:

1. The monitor keeps refreshing 60 times a second, but when it's time to refresh and no new frame is ready, it simply shows the previous frame again. If you're playing at 45fps, you can see the problem: roughly every third frame is held on screen for two refreshes instead of one, and you get little judders in your game.

2. With stricter (double-buffered) Vsync implementations, dropping below 60fps makes the game immediately cut to half the refresh rate, in this case 30fps. It then draws at a steady 30fps; there won't really be stutters, but the game often looks mildly choppy compared to the 60 you were just playing at.

 

G-Sync largely eliminates these issues, as well as reducing the input lag caused by Vsync, by dynamically adjusting the monitor's refresh rate to match whatever your GPU is pumping out. If you have a 60Hz panel and you're always at 60fps, the technology won't do a whole lot for you. If your monitor is 165Hz and you're playing anywhere between 120 and that, it can be amazing. It's also good if you can't hit a full 60fps, obviously, and there are still perks at a steady framerate (again, that reduced input lag), but it's really designed for fluctuating-fps gaming.
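If it helps, here's a toy frame-pacing sketch of case 1 (my own illustration; the 45fps-on-60Hz figures are just an example):

```python
import math

REFRESH_HZ = 60   # fixed-refresh panel
GAME_FPS = 45     # what the GPU is actually delivering

tick = 1.0 / REFRESH_HZ   # 16.7 ms between refreshes
render = 1.0 / GAME_FPS   # 22.2 ms per rendered frame

# With Vsync, a finished frame can only appear on the next refresh tick.
shown_at = 0.0
for i in range(1, 7):
    ready = i * render                            # frame i finishes here...
    flip = math.ceil(ready / tick - 1e-9) * tick  # ...but waits for a tick
    print(f"frame {i}: on screen {(flip - shown_at) * 1000:.1f} ms")
    shown_at = flip

# Output alternates ~33.3 / ~16.7 / ~16.7 ms: every third frame is held
# for two refreshes, and that uneven cadence is the judder. A G-Sync
# panel flips the moment a frame is ready, so every frame gets an even
# ~22.2 ms.
```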


tiny_disaster

40 minutes ago, Zyndo said:

snip

Alright, I'll stick with G-Sync even if I go with 144Hz, but I'm starting to lean towards 4K. I'll give it some more thought.


  • 1 month later...

Hello guys, I have a situation and could really use some advice. I owned a 24-inch 1440p monitor (122 PPI) and upgraded to a 27-inch 4K (163 PPI). Every game I've tested looks exactly the same to me, except maybe The Witcher 3, which looks perhaps 5% better, and I'm not even sure about that. I'm thinking of selling the 4K monitor and getting a 27-inch 1440p (109 PPI), because the framerate hit is massive, but I'm concerned that 1440p at 27 inches will look worse than 1440p at 24 inches. What do you think? Anyone with similar experience?

