
Why does performance increase as resolution is increased?

Jetrise
Solved by Aereldor

Hi All,

 

Quick question for the community: my friend was getting horrific FPS in games such as Squad and Warzone. While watching Task Manager, we noticed the CPU was the bottleneck. After a bit of trial and error, we switched from a 1080p monitor to a 1440p monitor and his FPS almost doubled in these titles, with CPU usage no longer constantly at 100%. Could someone explain how CPU usage decreases when you increase the monitor's resolution? I've done a bit of googling on this topic and I can't seem to understand it.

 

Specs for reference:

CPU: i5 4690K

GPU: GTX 1650 Super

RAM: 16GB



That's not how it works. As I understand it, the CPU sets a rough upper limit on the FPS you can reach at a given set of settings (excluding resolution). When you increase the render resolution, you shouldn't be able to achieve a higher FPS, no matter what. I'd expect your testing methodology to be flawed, sorry.



That doesn't make sense. If you game at 1440p, the FPS should drop, especially with a 1650 Super.

 

As for the CPU load decrease, that is normal: when you have lower FPS (because of the higher resolution), more of the load falls on the GPU.



14 minutes ago, Jetrise said:

Quick question for the community: my friend was getting horrific FPS in games such as Squad and Warzone. [...] We switched from a 1080p monitor to a 1440p monitor and his FPS almost doubled in these titles.
This may sound stupid, but are you sure you weren't originally using a 4K monitor?



Why buy a new GPU when you can just up the resolution!

Genius!

Except that's not how it works, lol, sorry.

 

 



This is what I thought also. I checked whether he was accidentally rendering at a higher resolution while displaying at 1080p.

Nope, he was most certainly on a 1080p monitor before going to 1440p, @Downkey.

I came across this thread while googling, which may explain it: https://www.reddit.com/r/buildapc/comments/9ikrwz/why_does_increasing_resolution_lower_cpu_load/



That's GPU bottlenecking.

Think of PC performance as horses pulling a carriage. At 1080p, the GPU isn't too stressed, so the GPU 'horse' can run at full speed, and nothing holds the CPU horse back either.

At 1440p there are almost twice as many pixels as at 1080p (3,686,400 vs. 2,073,600), so the carriage is nearly twice as heavy and the GPU horse can't go as fast anymore. As a result, no matter how strong the CPU horse is, the carriage can only move at the speed set by the GPU horse.

This can also work the other way around.
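To put rough numbers on the horse analogy, here is a minimal sketch (Python; the 9 ms and 7 ms frame times are invented for illustration, not measurements from any real game) of a pipeline where the slower component sets the frame rate:

```python
# Toy bottleneck model: every frame must be prepared by the CPU and then
# rendered by the GPU, so the slower of the two stages sets the frame rate.
# All frame times below are invented for illustration.

def fps(cpu_ms: float, gpu_ms: float) -> float:
    """Frame rate is limited by whichever stage takes longer per frame."""
    return 1000.0 / max(cpu_ms, gpu_ms)

CPU_MS = 9.0        # hypothetical CPU time per frame (roughly resolution-independent)
GPU_MS_1080P = 7.0  # hypothetical GPU time to render one 1080p frame

# 1440p has ~1.78x the pixels of 1080p (2560*1440 vs 1920*1080), so as a
# crude approximation the GPU's per-frame cost grows in proportion.
GPU_MS_1440P = GPU_MS_1080P * (2560 * 1440) / (1920 * 1080)

print(f"1080p: {fps(CPU_MS, GPU_MS_1080P):.0f} fps (CPU is the slower horse)")
print(f"1440p: {fps(CPU_MS, GPU_MS_1440P):.0f} fps (GPU is the slower horse)")
```

Note that in this simple model a higher resolution can only ever lower the frame rate; it explains why CPU usage drops once the GPU becomes the limit, but not by itself why the OP's FPS went up, which is why several replies above are skeptical.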



14 minutes ago, Jetrise said:

Nope, he was most certainly on a 1080p monitor before going to 1440p. [...] I came across this thread while googling, which may explain it: https://www.reddit.com/r/buildapc/comments/9ikrwz/why_does_increasing_resolution_lower_cpu_load/

 

From your link:

 

Quote

The resolution in and of itself doesn't have anything to do with it. Your CPU gets taxed when the framerates are high. Gaming at a higher resolution puts more work on the video card - making it harder to generate higher framerates. Lower framerates means less work for the CPU.

 

 

12 minutes ago, Aereldor said:

That's GPU bottlenecking. [...] No matter how strong the CPU horse is, the carriage can only move at the speed set by the GPU horse.

That's a good explanation, and it can actually go both ways, yes. But what I don't know is how rare this actually is; it seems it would only happen with games that have a heavy imbalance in some way?

 

 

Edit: for example, if I play MHW with my new and shiny RTX 3070 at 1080p, it'll be like... 50% GPU usage, 25% CPU usage, on average.

 

But if I up the resolution to 1440p (with NVIDIA DSR), I'll get close to 100% GPU usage, and CPU usage will stay roughly the same, although it may even go down slightly, to like 15-20%... (I'd need to check this more thoroughly, though.)

 

So yeah, in other words: how bad does the bottleneck actually have to be for this to occur (higher FPS at higher res)? It seems nearly impossible.
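The quoted Reddit explanation can be sketched the same way: if the CPU pays a roughly fixed cost per frame, its utilization simply tracks however many frames the GPU lets through (the 8 ms figure and the frame rates below are assumptions for illustration, not measurements):

```python
# If the CPU does a fixed amount of work per frame, the share of each
# second it spends on the game scales with the delivered frame rate.
# Numbers are illustrative only.

def cpu_utilization_pct(cpu_ms_per_frame: float, fps: float) -> float:
    """Percent of one second spent on per-frame CPU work, capped at 100."""
    return min(100.0, cpu_ms_per_frame * fps / 1000.0 * 100.0)

CPU_MS = 8.0  # hypothetical per-frame CPU cost, the same at any resolution

print(f"1080p, 120 fps delivered: CPU at {cpu_utilization_pct(CPU_MS, 120.0):.0f}%")
print(f"1440p,  70 fps delivered: CPU at {cpu_utilization_pct(CPU_MS, 70.0):.0f}%")
```

So 'CPU usage' falls at 1440p not because the CPU got slower, but because it is asked for fewer frames per second.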



I am as confused as everyone else. It makes zero sense to me, hence my question. But he is now playing games at 60% CPU usage (1440p) instead of 100% (1080p). GPU usage has also increased: before, he was at around 70% 3D usage; now it is constantly maxed out. In Squad, he was averaging 31 FPS; now he is getting 60. It makes absolutely zero sense to me, and that's why I am here.



13 minutes ago, Jetrise said:

But he is now playing games at 60% CPU usage (1440p) instead of 100% (1080p). [...] In Squad, he was averaging 31 FPS; now he is getting 60.

Basically, I think I figured it out... in practice this *only* works if you're 100% CPU bound; then upping the resolution can actually take stress away from the CPU, because the GPU has to do more work now.

 

I believe this also has something to do with the game engine, and of course the actual imbalance of the PC build overall.

 

Like I've described above, this sort of effect would be impossible with all the hardware combinations I've had so far with MHW, because I *know* for sure the game will instantly drop frames when the GPU hits 100%...

 

 

Of course, this would also happen if the CPU hits 100%, but that just seems extremely unlikely with a modern multi-threaded CPU, as the engine is heavily optimized for multi-threading (MT Framework 3.0*).

 

*MT" stands for "Multi-Thread", "Meta Tools" and "Multi-Target". While initially MT Framework was intended to power 2006's Dead Rising and Lost Planet: Extreme Condition only, Capcom later decided for their internal development divisions to adopt it as their default engine.

 

 

 

 



Thanks for the feedback, all. I'm still not 100% sure why this works, but I will remember it in the future.



1 hour ago, Mark Kaine said:

what I don't know is how rare this actually is

Uh, it's not rare. Literally every single game has a bottleneck, whether that's the CPU, GPU, RAM, or even the engine itself.


