Which is more demanding: 1440p or 144 Hz?


I have a 480 (overclocked to 1412 MHz, so basically an overclocked 580) and I'm wondering which will be easier to achieve.


Well, if you have over 144 fps in the games you play, then you can take advantage of that high refresh rate. You can try setting a custom resolution of 1440p, play on it, and see if the frame rate is good or not.


Just now, tp95112 said:

Well, if you have over 144 fps in the games you play, then you can take advantage of that high refresh rate. You can try setting a custom resolution of 1440p, play on it, and see if the frame rate is good or not.

 

I mean, which will be more taxing and require me to lower the settings more: 1440p 60 fps or 1080p 144 fps?


Just now, huilun02 said:

You need to ask what the point of 1440p60 or 1080p144 is, and which would be more beneficial for what you do with your PC.

As for which option is more forgiving: a 144 Hz screen will still refresh at 144 Hz even if the frame rate doesn't match up.

Chances are you would be deciding between a 1440p60 IPS screen and a 1080p144 TN, so color requirements are also a factor to consider.

 

Yeah, but I mean: which will force me to compromise more in order to fully utilize it?


2 minutes ago, Nickathom said:

Yeah, but I mean: which will force me to compromise more in order to fully utilize it?

Both will be pretty much the same. At 144 Hz, you want to get as many frames as possible. At 1440p, you want to squeeze out every frame you can to get a smooth (60 fps) experience.


10 minutes ago, Nickathom said:

I have a 480 (overclocked to 1412 MHz, so basically a 580) and I'm wondering which will be easier to achieve.

All I can say is that my rig (4770K @ 4.4 GHz, GTX 1060 6GB @ 2.05 GHz) hardly reaches 140 fps in any modern triple-A titles, and since your GPU is slightly slower, your rig won't either. There's absolutely no point in having a 144 Hz monitor then.


In theory, wouldn't 1080p@144 be slightly more taxing than 1440p@60?

 

If simple calculator math sufficed: 1920x1080 = 2,073,600 pixels. Refreshing those 144 times per second means 298,598,400 pixels redrawn per second.

 

2560x1440 = 3,686,400 pixels. Refreshing those 60 times per second means 221,184,000 pixels redrawn per second.

 

In theory, this would suggest that 1080p@144 requires more GPU horsepower, but I don't think it's that simple, and it could depend on the game. Someone more knowledgeable should disprove/confirm what I've said before you believe it. xD
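
As a quick check, the calculator math above can be reproduced with a minimal Python sketch (the helper name `pixels_per_second` is just illustrative, not from any tool mentioned in this thread); it reproduces the figures quoted in this post:

```python
# Quick check of the "calculator math" above: raw pixel throughput for each
# configuration. Pixels per second is only a rough proxy for GPU load.

def pixels_per_second(width: int, height: int, refresh_hz: int) -> int:
    """Pixels redrawn per second at a given resolution and refresh rate."""
    return width * height * refresh_hz

px_1080p144 = pixels_per_second(1920, 1080, 144)  # 298,598,400
px_1440p60 = pixels_per_second(2560, 1440, 60)    # 221,184,000

print(f"1080p @ 144 Hz: {px_1080p144:,} pixels per second")
print(f"1440p @ 60 Hz:  {px_1440p60:,} pixels per second")
print(f"Ratio: {px_1080p144 / px_1440p60:.2f}x")  # ~1.35x the pixel throughput
```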


4 minutes ago, black0utm1rage said:

All I can say is that my rig (4770K @ 4.4 GHz, GTX 1060 6GB @ 2.05 GHz) hardly reaches 140 fps in any modern triple-A titles, and since your GPU is slightly slower, your rig won't either. There's absolutely no point in having a 144 Hz monitor then.

 
 
 

Uh... what? 580s are faster than 1060s.
Also keep in mind that he was using 17.4.4, and with the release of the new 17.5.1 drivers, 480s/580s can expect a +4% boost in performance all around.
Also, 140 fps is high; what games are you playing?

 


5 minutes ago, Emberstone said:

 

In theory, this would suggest that 1080p@144 requires more GPU horsepower, but I don't think it's that simple, and it could depend on the game. Someone more knowledgeable should disprove/confirm what I've said before you believe it. xD

The theory is about right, indeed.

The downside is that some games behave differently with resolution scaling, and other issues may pop up as well that blur the field even further. For example, if you have a slower processor, 144 Hz may be an issue on that side (see the sketch at the end of this post), and if someone insists that anti-aliasing should be on at high resolutions (please... why...), that'd be quite the showstopper at high resolutions.

 

But honestly, different gaming experiences cater to different preferences here. The question shouldn't be which one you *can* run, it should be which one you *want* to run. Slow-paced games like Anno, Civilization, Conan Exiles to a degree, the various simulator games, etc. often benefit more from having more pixels than from a faster refresh rate. These are the so-called "eye candy" titles, where you have time to go looking for the details.

On the flip side, you have the fast-paced stuff like Counter-Strike and other shooters, racing games like Trackmania, etc., which are so fast that it pretty much doesn't even matter if a plant has nice textures, is in super high res, and has some ladybug crawling over it, because honestly it's only in frame for a split second as you're running past to explain to your Russian comrades that the bomb is at B.
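
A minimal sketch of the CPU-side limit mentioned above, using a simplified pipelined model in which the achievable frame rate is bounded by whichever of the CPU or GPU needs more time per frame (the millisecond figures are made up for illustration, not measurements):

```python
# Simplified bottleneck model: each frame needs CPU work (game logic, draw
# calls) and GPU work (rendering). With the two pipelined, the achievable
# frame rate is roughly capped by whichever takes longer per frame.
# The timings below are invented for illustration only.

def achievable_fps(cpu_ms_per_frame: float, gpu_ms_per_frame: float) -> float:
    """Rough upper bound on frame rate given per-frame CPU and GPU times."""
    return 1000.0 / max(cpu_ms_per_frame, gpu_ms_per_frame)

# A CPU that needs 8 ms per frame caps you near 125 fps, short of a 144 Hz
# target, no matter how quickly the GPU finishes its share.
print(achievable_fps(cpu_ms_per_frame=8.0, gpu_ms_per_frame=5.0))  # 125.0
```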


3 minutes ago, Nickathom said:

Uh... what? 580s are faster than 1060s.

Yeah, that's not exactly true and you know it. I've seen that HUB video, and all it shows is that the average 580 is faster than the average 1060 in some scenarios, but most of the time they are tied.

But as for our case: you don't even have a 580, and my particular GTX 1060 is OC'd to over 2 GHz (which is easily possible on most aftermarket 1060s anyway), so it should be pretty obvious which one is faster.


Just now, black0utm1rage said:

Yeah, that's not exactly true and you know it. I've seen that HUB video, and all it shows is that the average 580 is faster than the average 1060 in some scenarios, but most of the time they are tied.

But as for our case: you don't even have a 580, and my particular GTX 1060 is OC'd to over 2 GHz (which is easily possible on most aftermarket 1060s anyway), so it should be pretty obvious which one is faster.

 

My 480 is significantly faster than the reference 580, too....


1 minute ago, black0utm1rage said:

 

But as for our case: you don't even have a 580, and my particular GTX 1060 is OC'd to over 2 GHz (which is easily possible on most aftermarket 1060s anyway), so it should be pretty obvious which one is faster.

Comparing an OC'd card to a non-OC'd card is a pretty useless metric...


1 minute ago, manikyath said:

Comparing an OC'd card to a non-OC'd card is a pretty useless metric...

Not to mention my 480 is still faster than most 580s.


Just now, Nickathom said:

My 480 is significantly faster than the reference 580, too....

True, let's just not argue anymore. ;) It shouldn't make a difference anyway, since both our GPUs are too slow for 144 fps in triple-A games...


Just now, black0utm1rage said:

True, let's just not argue anymore. ;) It shouldn't make a difference anyway, since both our GPUs are too slow for 144 fps in triple-A games...

Yeah. Sorry. :(


3 minutes ago, manikyath said:

Comparing an OC'd card to a non-OC'd card is a pretty useless metric...

Go ahead and explain, please. Are you saying that higher clock speeds on the core and memory do not result in higher performance? Or is it a useless metric because you can't determine how much faster the OC'd card actually is?


Just now, black0utm1rage said:

Go ahead and explain, please. Are you saying that higher clock speeds on the core and memory do not result in higher performance? Or is it a useless metric because you can't determine how much faster the OC'd card actually is?

 

Probably because it's not really fair. It's like comparing an adult's strength to a child's.


1 minute ago, black0utm1rage said:

Go ahead and explain, please. Are you saying that higher clock speeds on the core and memory do not result in higher performance? Or is it a useless metric because you can't determine how much faster the OC'd card actually is?

Because the owner of said non-OC'd card can just OC it, and then the tables are turned again.

 

Also, what your card may OC to may not be possible on someone else's card. If you recommend someone a specific card because *your OC'd card can do something*, that person will feel less than pleased when theirs isn't capable of the same results.


Just now, Nickathom said:

Probably because it's not really fair. It's like comparing an adult's strength to a child's.

Good analogy there. ;)

 

You'll always win a running contest against a kid, but if you waited for the kid to grow up (in theory ;)), the results could be very different.


1 minute ago, manikyath said:

Because the owner of said non-OC'd card can just OC it, and then the tables are turned again.

 

Also, what your card may OC to may not be possible on someone else's card. If you recommend someone a specific card because *your OC'd card can do something*, that person will feel less than pleased when theirs isn't capable of the same results.

 

Eh, 2 GHz isn't that hard to hit on a 1060. You'd have to completely fail the silicon lottery. But yeah, you're right.


5 minutes ago, Nickathom said:

Yeah. Sorry. :(

I mean, the industry screws us all into being competitive about such stupid things... There's just no point in arguing. Also, in terms of DX12 you are most likely right about the AMD cards being significantly faster. Nvidia should really pull their thumbs out and fix their drivers...


Just now, Nickathom said:

Eh, 2 GHz isn't that hard to hit on a 1060. You'd have to completely fail the silicon lottery. But yeah, you're right.

You know that one day someone's gonna lose the lottery, and that guy's gonna feel boned AF. :D


3 minutes ago, manikyath said:

You know that one day someone's gonna lose the lottery, and that guy's gonna feel boned AF. :D

Maybe. :P Though often bad chips are downgraded into lower-tier cards like the 1050 Ti. I don't know if they do this with Pascal (though I think they do), but it did happen with Maxwell for sure.


5 hours ago, Emberstone said:

In theory, wouldn't 1080p@144 be slightly more taxing than 1440p@60?

 

If simple calculator math sufficed: 1920x1080 = 2,073,600 pixels. Refreshing those 144 times per second means 298,598,400 pixels redrawn per second.

 

2560x1440 = 3,686,400 pixels. Refreshing those 60 times per second means 221,184,000 pixels redrawn per second.

 

In theory, this would suggest that 1080p@144 requires more GPU horsepower, but I don't think it's that simple, and it could depend on the game. Someone more knowledgeable should disprove/confirm what I've said before you believe it. xD

It is that simple, for the most part :)

 

However, reaching 144 fps will require more CPU power than reaching 60 fps, so setting GPU requirements aside, you'll need a stronger CPU to hit 1080p 144 fps than you would to hit 1440p 60 fps.

 

GPU demand scales with pixels per second; CPU demand scales with frames per second.
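
As a rough, first-order illustration of that rule of thumb (ballpark ratios only; real games will deviate):

```python
# First-order comparison of the two targets, per the rule of thumb above:
# GPU load tracks pixels per second, CPU load tracks frames per second.

px_1080p144 = 1920 * 1080 * 144   # 298,598,400 pixels per second
px_1440p60 = 2560 * 1440 * 60     # 221,184,000 pixels per second

gpu_ratio = px_1080p144 / px_1440p60  # ~1.35x the pixel throughput
cpu_ratio = 144 / 60                  # 2.4x the frame rate

print(f"GPU-side demand, 1080p144 vs 1440p60: {gpu_ratio:.2f}x")
print(f"CPU-side demand, 1080p144 vs 1440p60: {cpu_ratio:.1f}x")
```

So 1080p144 pushes moderately more pixels per second but needs a much higher frame rate, which is why the CPU matters more for that target.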

