
Downscaling and its effects (or lack thereof) on system performance?

I'm sorry if this has already been covered (I'm a first-time poster, so please let me know if it has), but I'd be interested to see how downscaling a higher-resolution monitor to a lower resolution affects system performance. I've read forum posts and publications suggesting, with no definitive testing, that it could tax the GPU up to 4x as much compared to running native resolution, in either 16:9 or 21:9. The explanation is that even though the resolution may be lower, the pixel density of, say, a 4K monitor at 32 inches running 1920x1080 demands more GPU horsepower to drive the more densely populated screen. I know that with DLSS 2.0 upsampling can improve in-game FPS because the GPU renders at a lower resolution, so to me it wouldn't make sense for downscaling to have any negative impact on the GPU. What are your thoughts?



Downscaling means you're rendering at a higher resolution than the display can show. So if you've got a 1080p display but render at 1440p and downscale to 1080p, the card has to render about 1.78x as many pixels as native, and FPS will drop roughly in proportion. This can be an effective alternative to other AA techniques if you've got the horsepower to spare, but if you're already using your card to the max, it's just going to be a worse experience.
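A quick back-of-envelope sketch of that pixel math (Python; the function names are mine, and the assumption that FPS scales inversely with rendered pixel count only holds when the game is fully GPU-bound):

```python
# Rough estimate only: assumes a fully GPU-bound game where frame time
# scales linearly with the number of pixels rendered.

def rendered_pixels(width, height):
    return width * height

def estimated_fps(native_fps, native_res, render_res):
    """Scale native FPS by the ratio of rendered pixel counts."""
    return native_fps * rendered_pixels(*native_res) / rendered_pixels(*render_res)

# 120 FPS at native 1080p, then downscaling from a 1440p render:
print(estimated_fps(120, (1920, 1080), (2560, 1440)))  # 67.5

# Downscaling from a 4K render (exactly 4x the pixels of 1080p):
print(estimated_fps(120, (1920, 1080), (3840, 2160)))  # 30.0
```

The 4K case is where the "up to 4x" figures in the original question come from: 3840x2160 is exactly four times the pixels of 1920x1080.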


If you're talking about supersampling, that is where you run the game at 1440p or 4K on a 1080p monitor and it 'downscales' the image to fit. You get a sharper image, but it takes more GPU power to run, since you're basically playing the game at 1440p or 4K instead of 1080p.

 

If you're talking about changing your monitor resolution to something lower, like having a 4K monitor but setting your PC to output 1080p (the monitor upscales the image to fit the 4K panel), then you're essentially playing the game at 1080p, so your performance is much higher but the details are all worse.
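To make the contrast between those two scenarios concrete, here's a small sketch (Python; the scenario labels are mine) showing that GPU rendering cost follows the render resolution, not the panel resolution:

```python
# What the GPU actually renders in each scenario; the panel's native
# resolution doesn't change the rendering workload.
scenarios = {
    "supersampling: 4K render on a 1080p monitor": (3840, 2160),
    "monitor upscaling: 1080p render on a 4K monitor": (1920, 1080),
}

for name, (w, h) in scenarios.items():
    print(f"{name} -> GPU renders {w * h:,} pixels")

# The supersampled frame is exactly 4x the rendering work:
assert 3840 * 2160 == 4 * 1920 * 1080
```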

NEW PC build: Blank Heaven   minimalist white and black PC     Old S340 build log "White Heaven"        The "LIGHTCANON" flashlight build log        Project AntiRoll (prototype)        Custom speaker project


Ryzen 3950X | AMD Vega Frontier Edition | ASUS X570 Pro WS | Corsair Vengeance LPX 64GB | NZXT H500 | Seasonic Prime Fanless TX-700 | Custom loop | Coolermaster SK630 White | Logitech MX Master 2S | Samsung 980 Pro 1TB + 970 Pro 512GB | Samsung 58" 4k TV | Scarlett 2i4 | 2x AT2020

 


Or... is he talking about desktop overhead from using Nvidia DSR 4X?
A 1080p desktop with DSR active for 4K games...

vs. a 2160p DSR desktop (zero filtering in DSR) and playing at 4K DSR...

I have DSR activated with zero filtering set for 4K specifically, so my desktop runs at 2160p (faked) and so do my games.
Only when using non-1:1 scaling (1440p) do I use DSR filtering again...

So the question is: would the desktop overhead increase and be more demanding? The answer is yes, but how much I can't say (beyond VRAM usage).

Maximums - Asus Z97-K /w i5 4690 Bclk @106.9Mhz * x39 = 4.17Ghz, 8GB of 2600Mhz DDR3,.. Gigabyte GTX970 G1-Gaming @ 1550Mhz

 


I believe Enderman's second statement best describes my situation: a game running at 1080p being upscaled to fit a 4K monitor. In that case it's upscaling, not downscaling; my mistake. So when a game is upscaled from 1080p to fit a 4K monitor, isn't the GPU still driving the same number of pixels as if it were running the game in 4K? @Skilledrebuilds, that's an interesting point, I appreciate your input! @Chris, thanks for the clarification.

1 minute ago, Sabercbr929 said:

I believe Enderman's second statement best describes my situation: a game running at 1080p being upscaled to fit a 4K monitor. In that case it's upscaling, not downscaling; my mistake. So when a game is upscaled from 1080p to fit a 4K monitor, isn't the GPU still driving the same number of pixels as if it were running the game in 4K? @Skilledrebuilds, that's an interesting point, I appreciate your input! @Chris, thanks for the clarification.

No. The render resolution is the render resolution. Basic upscaling costs essentially nothing. Now, things like DLSS use AI, leaning on the tensor cores of an Nvidia GPU to upscale in a smarter way than just turning 1 pixel into 4. That takes some amount of processing power, but it's pretty minor overall.
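For reference, "just making 1 pixel into 4" is plain nearest-neighbour upscaling. A minimal sketch (pure Python, no GPU; a toy single-channel "image" as nested lists, and the helper name is mine):

```python
# Naive 2x nearest-neighbour upscaling: every source pixel becomes a
# 2x2 block in the output. This is the "dumb" baseline that DLSS's
# learned reconstruction improves on.

def upscale_2x(image):
    """Duplicate each pixel of a 2D list into a 2x2 block."""
    out = []
    for row in image:
        doubled = [p for p in row for _ in range(2)]  # duplicate horizontally
        out.append(doubled)
        out.append(list(doubled))                     # duplicate vertically
    return out

small = [[1, 2],
         [3, 4]]
print(upscale_2x(small))
# [[1, 1, 2, 2], [1, 1, 2, 2], [3, 3, 4, 4], [3, 3, 4, 4]]
```

DLSS instead reconstructs the higher-resolution frame from the low-resolution render plus motion vectors using a trained network, which is why it leans on the tensor cores while this kind of scaling needs essentially nothing.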

1 minute ago, Chris Pratt said:

No. The render resolution is the render resolution. Basic upscaling costs essentially nothing. Now, things like DLSS use AI, leaning on the tensor cores of an Nvidia GPU to upscale in a smarter way than just turning 1 pixel into 4. That takes some amount of processing power, but it's pretty minor overall.

Makes sense, thank you for the comments!

