
Does 4K@60Hz or 1440p@144Hz need more GPU power? (#OutOfIdeas)

Gram
Go to solution Solved by EunSoo,


What setup takes more GPU power to run (on average)?

  • 4K with an average FPS of ~60
  • 1440p with an average FPS of ~144

 

Depending on the answer, it may change the screen I choose, since 4K screens are getting cheap now and I could see the benefit of both. I could be mistaken, but I would expect that in slow games (like Civ) FPS is less important, while in faster games (like CS:GO) FPS matters more.

Either way, I think this would be a cool video idea. What does the LTT forum community think?

 

Gram


1 minute ago, Gram said:

What setup takes more GPU power to run (on average)?

  • 4K with an average FPS of ~60
  • 1440p with an average FPS of ~144

In a game like CS:GO, 4K would probably need more GPU power.

Civ... not sure.

 

Either way, if those are the hardest games you'd be playing, a GTX 1080 would easily handle either.



The answer is pretty easy. Let's look at some Rise of the Tomb Raider benchmarks for the GTX 1080 Ti:

1440p: 117 fps

4K: 62 fps

https://arstechnica.com/gadgets/2017/03/nvidia-gtx-1080-ti-review/

Therefore, 4K@60Hz is easier to run than 1440p@144Hz. Just look at reviews for your GPU and the games you play, and the answer is there.
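Those numbers can be turned into a quick sanity check: does the benchmarked average FPS reach the monitor's refresh rate? The helper below is just an illustration using the Rise of the Tomb Raider figures quoted above; the function name is made up.

```python
# Does a GPU's benchmarked average FPS at a given resolution reach the
# monitor's refresh rate? Benchmark numbers are the Rise of the Tomb Raider
# results quoted above; the helper itself is a hypothetical illustration.

def meets_target(benchmark_fps: float, target_hz: float) -> bool:
    """True if the benchmarked average FPS reaches the refresh rate."""
    return benchmark_fps >= target_hz

# GTX 1080 Ti, Rise of the Tomb Raider (Ars Technica review):
print(meets_target(62, 60))    # True: 62 fps clears a 60 Hz 4K target
print(meets_target(117, 144))  # False: 117 fps falls short of 144 Hz at 1440p
```

Keep in mind an average near the target still means dips below it, so some headroom (or adaptive sync) is worth factoring in.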


1440p@144Hz is going to be harder to run, but I'd rather have that than 4K. Then again, I'd take 4K@75Hz over both any day, lol.



1440p@144Hz is much, MUCH more demanding than 4K@60Hz.



It's straight up just more pixels per second at 1440p@144Hz vs. 4K@60Hz.

So yes, 1440p@144Hz is more demanding.
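The "more pixels per second" claim checks out with simple arithmetic (resolution times refresh rate only; this ignores per-frame overhead):

```python
# Pixel throughput for each mode: width * height * refresh rate.
px_4k60 = 3840 * 2160 * 60        # 497,664,000 pixels per second
px_1440p144 = 2560 * 1440 * 144   # 530,841,600 pixels per second

print(px_1440p144 > px_4k60)  # True: 1440p@144Hz pushes more pixels per second
```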



In order to hit 1440p@144Hz you need a fast GPU and a fast CPU: the GPU to handle the resolution, and the CPU to handle the draw call rate.

 

Higher resolutions (4K+) will be limited by the GPU: the CPU will be waiting for the GPU to finish a frame before asking it to render another one.

 

Lower resolutions (1080p) at high refresh rates (144Hz+) will be CPU-bound due to the rate at which the CPU can issue frames: the GPU will be waiting for the CPU to tell it to render another frame.

 

1440p@144Hz is smack dab in the middle, requiring lots of both GPU and CPU power. In this situation a 7700K and a 1080 Ti will both be maxed out: the GPU won't be waiting on the CPU and the CPU won't be waiting on the GPU. Well, not quite; more like they will trade off who is waiting on whom.

 

Now, 4K@144Hz will come in the future when stronger GPUs arrive (Volta and beyond). At that point we may see the CPU limiting the GPU in 4K gaming, maybe hitting somewhere around 120 FPS at 4K in current-gen titles (newer titles may not), IMO.

 

So, to answer your question: 4K and 1440p gaming will both max out a GPU at the moment, but 1440p at a high refresh rate will also max out the CPU.

 

Now let's get to the point: you want to know what kind of monitor to buy, right? Well, it all comes down to what you want to use it for. Do you want to play FPS games? MMORPGs? Strategy games? Etc.

 

If you play lots of both, then (here comes my opinion) I would get the 1440p@144Hz monitor, because you will get a better experience in FPS games and you won't get a bad experience in MMORPGs or any other game that doesn't need a high refresh rate. It scales well and the overall experience will be nice.
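The bottleneck trade-off described above can be sketched with a toy frame-time model: each frame costs a fixed amount of CPU time plus a GPU time proportional to pixel count, and the slower of the two sets the framerate. The per-frame costs below are made-up illustration numbers, not measurements of any real CPU or GPU.

```python
# Toy frame-time model: framerate is limited by whichever of the CPU and GPU
# takes longer per frame. Both cost constants are hypothetical.

CPU_MS_PER_FRAME = 5.0     # assumed fixed CPU cost per frame (ms)
GPU_MS_PER_MPIXEL = 1.5    # assumed GPU cost per megapixel rendered (ms)

def bottleneck(width: int, height: int):
    """Return which component limits the framerate, and the resulting FPS."""
    gpu_ms = GPU_MS_PER_MPIXEL * (width * height) / 1e6
    fps = 1000 / max(CPU_MS_PER_FRAME, gpu_ms)
    limiter = "CPU" if CPU_MS_PER_FRAME >= gpu_ms else "GPU"
    return limiter, round(fps)

print(bottleneck(1920, 1080))  # CPU-bound in this model
print(bottleneck(3840, 2160))  # GPU-bound in this model
print(bottleneck(2560, 1440))  # 1440p sits just past the crossover point
```

With these made-up constants, 1080p is CPU-limited, 4K is clearly GPU-limited, and 1440p lands close to the crossover, which is the "both maxed out" situation described above.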


Purely based on pixels per second, 4K@60 is 497,664,000 and 1440p@144 is 530,841,600, so 4K@60 renders only about 94% of the pixels per second that 1440p@144 has to render. However, you can't just base demand on raw pixels rendered per second; otherwise you would always see 4K framerates in games come out at exactly 1/4 of the 1080p framerates, or 4/9 of the 1440p framerates. In reality there are other factors that contribute to overhead. For example, some processes, like AI, do not usually depend on framerate or resolution. Other calculations, like physics, do not depend on resolution but do depend on framerate.

 

The general rule of thumb is that higher framerates place a higher demand on the CPU and higher resolutions place a higher demand on the GPU, but beyond that it's a game-by-game basis, and there's no good way to be sure without testing each game individually.
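The arithmetic behind the 94%, 1/4, and 4/9 figures above can be verified directly:

```python
# Pixels per second for each mode, and the per-frame resolution ratios.
px_sec_4k60 = 3840 * 2160 * 60        # 497,664,000
px_sec_1440p144 = 2560 * 1440 * 144   # 530,841,600

print(round(px_sec_4k60 / px_sec_1440p144, 2))  # 0.94, i.e. about 94%

# Per-frame pixel counts behind the "1/4" and "4/9" framerate ratios:
print((1920 * 1080) / (3840 * 2160))  # 0.25: 1080p is 1/4 the pixels of 4K
print((2560 * 1440) / (3840 * 2160))  # ~0.444: 1440p is 4/9 the pixels of 4K
```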



Thank you all for your replies; they were all informative. I had thought it would be a close battle, but it seems it is fairly cut and dried.

 

I guess this was common knowledge; I am glad I am up to speed now.

 

4 hours ago, SCGazelle said:

The answer is pretty easy. Let's look at some Rise of the Tomb Raider benchmarks for the GTX 1080 Ti:

1440p: 117 fps

4K: 62 fps

https://arstechnica.com/gadgets/2017/03/nvidia-gtx-1080-ti-review/

Therefore, 4K@60Hz is easier to run than 1440p@144Hz. Just look at reviews for your GPU and the games you play, and the answer is there.

Side note: although maybe not as informative, I like this approach. Using it, I should be able to look at benchmarks of popular games and get an idea from some real-world examples.

 

