Is it possible to get the GPU to upscale/output at a higher resolution than the Windows resolution setting?

Hello, I have a somewhat unusual problem, I suppose. My laptop has a 14" 3072 x 1920 120Hz display, which is frankly a lot more pixels than are strictly necessary most of the time. I created a custom resolution of 1536 x 960 for a couple of reasons: power savings, since it only renders a quarter as many pixels per second and the high refresh rate was more important to me than the resolution, and because I get horrible stuttering at the native resolution if I set the dedicated iGPU video memory to anything less than 2GB, so this could actually free up RAM for me. Honestly, at 14 inches 960p is perfectly usable, but it's still noticeably blurry if you look closely. Is there any way to get the GPU (the 6800HS iGPU) to render Windows at my decreased resolution but output to the internal display at its native resolution, with some cheap upscaling/antialiasing method? It doesn't need to be very good; the pixel density is so high I'd hardly notice. Basically, the low render resolution with higher output resolution that you see in programs/games, but for all of Windows instead. I've looked online to see if anyone has a solution, but as best I can tell I'm the first person to even ask this question ¯\_(ツ)_/¯
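For anyone who wants the numbers behind the "quarter as many pixels" bit, here's a rough sketch of the math (assuming the refresh rate stays at 120 Hz in both cases, which is what I'm doing):

```python
# Back-of-the-envelope pixel throughput at each resolution, 120 Hz assumed.
def pixels_per_second(width: int, height: int, hz: int) -> int:
    return width * height * hz

native = pixels_per_second(3072, 1920, 120)   # ~708 million pixels/s
reduced = pixels_per_second(1536, 960, 120)   # ~177 million pixels/s
print(reduced / native)  # 0.25 -> exactly a quarter of the pixel throughput
```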

Any help you can give me in achieving this (or just telling me why it's impossible) is appreciated.

If you want sharp, blocky pixels, I think there's an integer scaling option in the drivers. Otherwise the other options will smoothly stretch it, but it'll look blurry in comparison.
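If it helps picture it: at a 2x factor, integer scaling just repeats each rendered pixel into an exact 2x2 block, so 1536 x 960 maps cleanly onto 3072 x 1920 with no blur. A minimal sketch of the idea (not how the driver actually implements it):

```python
# Integer (nearest-neighbour) upscaling: each source pixel becomes a
# factor x factor block, so edges stay sharp instead of being smeared.
import numpy as np

def integer_upscale(frame: np.ndarray, factor: int = 2) -> np.ndarray:
    """Repeat each pixel `factor` times along height and width (H, W, C)."""
    return frame.repeat(factor, axis=0).repeat(factor, axis=1)

low_res = np.random.randint(0, 256, size=(960, 1536, 3), dtype=np.uint8)
native = integer_upscale(low_res, 2)
print(native.shape)  # (1920, 3072, 3) -> matches the panel exactly
```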

Main system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, Corsair Vengeance Pro 3200 3x 16GB 2R, RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible

14 minutes ago, BloopBucket said:

Hello, I have a somewhat unusual problem, I suppose. My laptop has a 14" 3072 x 1920 120Hz display, which is frankly a lot more pixels than are strictly necessary most of the time. I created a custom resolution of 1536 x 960 for a couple of reasons: power savings, since it only renders a quarter as many pixels per second and the high refresh rate was more important to me than the resolution, and because I get horrible stuttering at the native resolution if I set the dedicated iGPU video memory to anything less than 2GB, so this could actually free up RAM for me. Honestly, at 14 inches 960p is perfectly usable, but it's still noticeably blurry if you look closely. Is there any way to get the GPU (the 6800HS iGPU) to render Windows at my decreased resolution but output to the internal display at its native resolution, with some cheap upscaling/antialiasing method? It doesn't need to be very good; the pixel density is so high I'd hardly notice. Basically, the low render resolution with higher output resolution that you see in programs/games, but for all of Windows instead. I've looked online to see if anyone has a solution, but as best I can tell I'm the first person to even ask this question ¯\_(ツ)_/¯

Any help you can give me in achieving this (or just telling me why it's impossible) is appreciated.

I'm not sure what your goal is. Rendering Windows at a decreased resolution doesn't give any power decrease, and if you want a better frame rate in games, use native res and upscaling.

System : AMD R9 5900X / Gigabyte X570 AORUS PRO/ 2x16GB Corsair Vengeance 3600CL18 ASUS TUF Gaming AMD Radeon RX 7900 XTX OC Edition GPU/ Phanteks P600S case /  Eisbaer 280mm AIO (with 2xArctic P14 fans) / 2TB Crucial T500  NVme + 2TB WD SN850 NVme + 4TB Toshiba X300 HDD drives/ Corsair RM850x PSU/  Alienware AW3420DW 34" 120Hz 3440x1440p monitor / Logitech G915TKL keyboard (wireless) / Logitech G PRO X Superlight mouse / Audeze Maxwell headphones

6 minutes ago, da na said:

It's not saving you power to run your display at a lower resolution. You can run games at a lower resolution to save power, but running the whole system at that resolution is unnecessary.

Rendering fewer pixels should mean the GPU doesn't need to work as hard and will consume less power, no? Just to confirm my musings, I opened up HWiNFO and compared GPU ASIC power at the two resolutions: at 960p it averaged 3W idle and 5W while watching video, and peaked at 5W while scrolling; at 1920p it averaged 3W idle and 8W in video, and peaked at 10W while scrolling. It seems the more of the screen it has to refresh, the greater the disparity becomes (I was scrolling and throwing windows around like a madman to force full-screen refreshes). I don't know how accurately the GPU ASIC power reading reflects real-world power consumption (these numbers seem a bit high), but it does at least seem to demonstrate a difference.
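If anyone wants to sanity-check this for themselves, HWiNFO can log sensors to a CSV file; here's a rough way to average one column out of such a log. The file name and the "GPU ASIC Power [W]" column name below are just placeholders, since the exact headers depend on your sensor layout:

```python
# Average a single sensor column from a HWiNFO CSV log.
import csv

def average_column(path: str, column: str) -> float:
    values = []
    with open(path, newline="", encoding="utf-8", errors="ignore") as f:
        for row in csv.DictReader(f):
            try:
                values.append(float(row[column]))
            except (KeyError, ValueError):
                continue  # skip footer rows and non-numeric entries
    return sum(values) / len(values) if values else float("nan")

# Placeholder file and column names; adjust to match your own log.
print(average_column("hwinfo_log.csv", "GPU ASIC Power [W]"))
```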

It sounds like you're overthinking things... the amount of power saved by reducing the resolution is extremely minor; you'll get more savings by reducing screen brightness by a few percent.

7 minutes ago, BloopBucket said:

Rendering fewer pixels should mean the GPU doesn't need to work as hard and will consume less power, no?

When gaming, sure. However, in that case you'll want to lower resolution, maybe use something like DLSS or FSR to upscale, and most importantly limit frame rate to actually get meaningful power savings. Maybe combine it with some undervolting and a power limit while you're at it.

 

On the desktop? 5W vs 8W is not much of a meaningful difference. Use a watt meter to measure total system usage. A decrease of 3W is probably well below a 1% reduction in total system power consumption.

Remember to either quote or @mention others, so they are notified of your reply

3 minutes ago, Eigenvektor said:

When gaming, sure. However, in that case you'll want to lower resolution, maybe use something like DLSS or FSR to upscale, and most importantly limit frame rate to actually get meaningful power savings.

 

On the desktop? 5W vs 8W is not much of a meaningful difference.

It is a meaningful difference on a laptop; decreasing power draw by 3W gives over an hour of extra battery life when total system draw is less than 20W in the first place.
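Rough math behind that, assuming a hypothetical 70 Wh battery (just an illustrative figure, not my actual capacity):

```python
# Runtime gained by shaving 3 W off the total system draw, for a 70 Wh pack.
battery_wh = 70.0  # hypothetical capacity
for draw_w in (20.0, 15.0, 12.0):
    before = battery_wh / draw_w
    after = battery_wh / (draw_w - 3.0)
    print(f"{draw_w:.0f} W: {before:.1f} h -> {after:.1f} h (+{after - before:.1f} h)")
# 20 W: 3.5 h -> 4.1 h, 15 W: 4.7 h -> 5.8 h, 12 W: 5.8 h -> 7.8 h
```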

I think you got something confused here.

 

All the pixels on the panel are still on and working; nothing is off. Is there a difference in power consumption between 4K and 1080p? Yes, if they are 4K and 1080p PANELS. Otherwise, running at a lower resolution gives extremely minor power savings. Basically a couple of minutes of extra battery.

 

When you're not doing anything intensive it won't cause extra battery drain (maybe 0.001% extra). It will cause more battery drain when something is RENDERING at that resolution.

 

So in your case, just run the panel at full res and use something like FSR or DLSS in games. Or just run games at a lower resolution.

 

 

Just now, BloopBucket said:

It is a meaningful difference on a laptop; decreasing power draw by 3W gives over an hour of extra battery life when total system draw is less than 20W in the first place.

In that case, lowering screen brightness will likely give you more of a reduction, as would lowering the refresh rate from 120 Hz to 60 Hz.

Remember to either quote or @mention others, so they are notified of your reply

I've got to say I'm getting a bit confused by all these replies. In HWiNFO I'm seeing a difference of multiple watts of draw from the battery between the two resolutions whenever I'm doing anything other than idling; the difference doesn't seem negligible to me. Maybe with a 1080p 60Hz display there would be much less going on, but even on the Windows desktop, with a 3K 120Hz display, I'm seeing the iGPU reach 50%+ utilization with over a gigabyte of VRAM usage. Windows itself seems like a relatively intensive task for the thing at this scale.

 

And sure, I could lower the brightness or decrease the refresh rate, but why not do all three? If lowering the resolution is making a measurable difference, is there a reason I shouldn't have one extra tool at my disposal?

32 minutes ago, porina said:

If you want sharp, blocky pixels, I think there's an integer scaling option in the drivers. Otherwise the other options will smoothly stretch it, but it'll look blurry in comparison.

Thanks for the reply. Unfortunately, it seems scaling options like that are hidden for internal displays or something; I only get options like preserve aspect ratio or fill panel.

3 minutes ago, BloopBucket said:

If lowering the resolution is making a measurable difference, is there a reason I shouldn't have one extra tool at my disposal?

By all means, do it, though I'd recommend measuring with a watt meter. Software-based readings are generally more like rough estimates.

 

What @jaslion was talking about is that all pixels on the display are still on, so the panel itself will draw the same power as before. In general, the panel and its backlight are the most power-hungry components. The only power savings will come from the GPU having to render fewer pixels.

 

In gaming, rendering fewer pixels and limiting the frame rate will make a huge difference. In that case the GPU can easily draw tens of watts (on a desktop, easily 100W+).

Remember to either quote or @mention others, so they are notified of your reply

42 minutes ago, BloopBucket said:

Rendering fewer pixels should mean the GPU doesn't need to work as hard and will consume less power, no? Just to confirm my musings, I opened up HWiNFO and compared GPU ASIC power at the two resolutions: at 960p it averaged 3W idle and 5W while watching video, and peaked at 5W while scrolling; at 1920p it averaged 3W idle and 8W in video, and peaked at 10W while scrolling. It seems the more of the screen it has to refresh, the greater the disparity becomes (I was scrolling and throwing windows around like a madman to force full-screen refreshes). I don't know how accurately the GPU ASIC power reading reflects real-world power consumption (these numbers seem a bit high), but it does at least seem to demonstrate a difference.

But saving 3W is totally useless; it takes about two weeks of continuous usage to save 1kWh, worth 15 to 30 cents depending on your local electricity prices...
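The arithmetic, for anyone who wants to check it:

```python
# 3 W saved continuously for two weeks, converted to kWh.
watts = 3
hours = 24 * 7 * 2
print(watts * hours / 1000)  # 1.008 kWh -> roughly 15-30 cents at typical rates
```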

System : AMD R9 5900X / Gigabyte X570 AORUS PRO/ 2x16GB Corsair Vengeance 3600CL18 ASUS TUF Gaming AMD Radeon RX 7900 XTX OC Edition GPU/ Phanteks P600S case /  Eisbaer 280mm AIO (with 2xArctic P14 fans) / 2TB Crucial T500  NVme + 2TB WD SN850 NVme + 4TB Toshiba X300 HDD drives/ Corsair RM850x PSU/  Alienware AW3420DW 34" 120Hz 3440x1440p monitor / Logitech G915TKL keyboard (wireless) / Logitech G PRO X Superlight mouse / Audeze Maxwell headphones

25 minutes ago, PDifolco said:

But saving 3W is totally useless; it takes about two weeks of continuous usage to save 1kWh, worth 15 to 30 cents depending on your local electricity prices...

As OP pointed out, it's a notebook, and that saving increases its runtime by an hour.

Remember to either quote or @mention others, so they are notified of your reply
