
What graphical settings do you lower or disable for better performance?

MiLik97

So my GPU (GTX 980 Ti) is starting to show its age. The days of bumping every video setting to max and still rolling 60+ FPS with it are over.

 

Nowadays, in most modern games, I have to lower my video settings to some degree to maintain a steady 60 FPS, so I'm looking for advice on which settings are the most demanding.

 

When you open a game and go into its settings menu, which video settings do you lower (or disable) to gain more FPS, and in what order?

 

In your opinion, which settings have the biggest impact on FPS while making barely any visual difference?


I always turn off shadows, or drop them a level or two, before anything else. I also turn off motion blur and vsync, but I don't think those two will do much for you.

My PC Specs:

 

 

Main Gaming Machine

CPU: Intel Core i7-10700K - OC to 5 GHz All Cores
CPU Cooler: Corsair iCUE H115i RGB Pro XT (Front Mounted AIO)
Motherboard: Asus TUF GAMING Z490-PLUS (WI-FI)
Memory: Corsair Vengeance LPX 32 GB (2 x 16 GB) DDR4-3600

Storage: Intel 665p 1 TB M.2-2280 NVMe SSD (x2)
Video Card: Zotac RTX 3070 8 GB GAMING Twin Edge OC

Power Supply: Corsair RM850 850W
Case: Corsair 4000D Airflow
Case Fan 120mm: Noctua F12 PWM 54.97 CFM 120 mm (x1)
Case Fan 140mm: Noctua A14 PWM 82.5 CFM 140 mm (x4)
Monitor Main: Asus VG278QR 27.0" 1920x1080 165 Hz
Monitor Vertical: Asus VA27EHE 27.0" 1920x1080 75 Hz


The first settings I look at are usually antialiasing and shadows. The highest levels of those settings tend to need massive amounts of graphics power for not much benefit.

 

One setting I definitely wouldn't change is texture filtering. It's so cheap on even a barely modern GPU (and by "barely modern" I mean GTX 500/Radeon HD 6000 era) and it makes a huge difference in visual quality. Yet there are games released within the last year whose settings presets turn down texture filtering on anything below the top preset. Why? What were the devs thinking? The GPU listed as the minimum requirement for your game would laugh off 16x anisotropic filtering, so why does it drop when going from ultra to high?

 

/rant

 

Other than that, I think it's pretty difficult to make general statements about which settings to change. Games often name things slightly differently, and the same setting might have a larger performance or visual impact in some games than in others. In some cases a setting with the same name controls entirely different things in different games, especially generically named ones ("effects quality" is a common setting that can mean a whole lot of different things). You can change settings and see for yourself what you notice most and what has the biggest framerate impact, or you can consult settings repositories like GeForce Experience or the many settings websites out there for guidance.

¯\_(ツ)_/¯

 

 

Desktop:

Intel Core i7-11700K | Noctua NH-D15S chromax.black | ASUS ROG Strix Z590-E Gaming WiFi | 32 GB G.SKILL TridentZ 3200 MHz | ASUS TUF Gaming RTX 3080 | 1TB Samsung 980 Pro M.2 PCIe 4.0 SSD | 2TB WD Blue M.2 SATA SSD | Seasonic Focus GX-850 | Fractal Design Meshify C | Windows 10 Pro

 

Laptop:

HP Omen 15 | AMD Ryzen 7 5800H | 16 GB 3200 MHz | Nvidia RTX 3060 | 1 TB WD Black PCIe 3.0 SSD | 512 GB Micron PCIe 3.0 SSD | Windows 11


39 minutes ago, MiLik97 said:

In your opinion, which settings have the biggest impact on FPS while making barely any visual difference?

Depends a lot on the game. For some games/settings, high to medium makes virtually no difference in visuals but greatly improves performance; for others it's the exact reverse. I'd play around until you find a compromise between performance and looks that works for you.

 

Personally, I often disable AA and reduce detail levels and/or screen resolution. I try to keep texture resolution as high as possible. Depending on the game, less than 60 fps is also acceptable, as long as the frame rate is consistent. Of course, a screen with adaptive sync helps.

Remember to either quote or @mention others, so they are notified of your reply


17 hours ago, MiLik97 said:

So my GPU (GTX 980 Ti) is starting to show its age. The days of bumping every video setting to max and still rolling 60+ FPS with it are over.

 

Nowadays, in most modern games, I have to lower my video settings to some degree to maintain a steady 60 FPS, so I'm looking for advice on which settings are the most demanding.

 

When you open a game and go into its settings menu, which video settings do you lower (or disable) to gain more FPS, and in what order?

 

In your opinion, which settings have the biggest impact on FPS while making barely any visual difference?

Usually shadows and shadow resolution, but I've also noticed that stuff like foliage and water quality can be a big performance hog, along with the obvious ones like texture quality.

 



Hey, everyone! Apologies for the late reply. I forgot to check the thread for replies, and by the time I remembered it was already night-time, so I went to sleep.

Anyway, thanks for all the suggestions. I've written them down, but I still have some questions:

 

On 2/17/2023 at 7:12 PM, BobVonBob said:

The first settings I look at are usually antialiasing and shadows. The highest levels of those settings tend to need massive amounts of graphics power for not much benefit.

On 2/17/2023 at 7:12 PM, Eigenvektor said:

Personally, I often disable AA and reduce detail levels and/or screen resolution. I try to keep texture resolution as high as possible. Depending on the game, less than 60 fps is also acceptable, as long as the frame rate is consistent. Of course, a screen with adaptive sync helps.

7 hours ago, eLLe7 said:

Shadows, AA, resolution (depends on the game), render quality or view distance.

I play at 1440p, so AA is usually disabled, as I hardly notice any jagged lines, but sometimes they do appear and are quite visible.

 

In such cases, which AA method would you recommend that doesn't have a big performance hit (other than FXAA, which looks ugly to me)? Also, how many samples should I use (2x, 4x...)?


GTX 1070 here. I don't lower anything; I just enable FSR and keep everything maxed out at 1440p. The bummer is that it's supported from the 10 series and up, so you can't use it on the 980 Ti.

| Ryzen 7 5800X3D | Arctic Liquid Freezer II 360 Rev 7| AsRock X570 Steel Legend |

| 4x16GB G.Skill Trident Z Neo 4000MHz CL16 | Sapphire Nitro+ RX 6900 XT | Seasonic Focus GX-1000|

| 512GB A-Data XPG Spectrix S40G RGB | 2TB A-Data SX8200 Pro| Phanteks Eclipse G500A |


5 hours ago, QuantumSingularity said:

GTX 1070 here. I don't lower anything; I just enable FSR and keep everything maxed out at 1440p. The bummer is that it's supported from the 10 series and up, so you can't use it on the 980 Ti.

I don't mind not being able to use it, as I can live without it, but I'm curious: what exactly are DLSS/FSR, and what are they used for?


9 hours ago, MiLik97 said:

I don't mind not being able to use it, as I can live without it, but I'm curious: what exactly are DLSS/FSR, and what are they used for?

DLSS and FSR are basically the same idea, just executed differently. They render each frame at a lower resolution than the one you selected, which helps your GPU pump out more frames, and then upscale the result to the selected resolution. To describe it as simply as possible: if you select 1440p, DLSS/FSR will make your GPU render at 1080p (could be lower or higher depending on the settings you choose), which produces more frames than rendering at 1440p would. Each frame is then resized to your selected 1440p resolution. But if you've ever resized an image in MS Paint, you know how bad the quality gets. To get around that, the technology uses algorithms that try to guess what the missing pixels should look like and fills them in, producing the final image with minimal to noticeable quality loss (again, depending on the settings).
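To put rough numbers on it, here's a quick back-of-the-envelope sketch in Python. The scale factors are the commonly cited per-axis values for the usual quality modes; the exact numbers vary by game and by DLSS/FSR version, so treat them as assumptions:

```python
# Rough upscaling math: how many pixels actually get rendered per mode.
# Scale factors are the commonly cited per-axis values for DLSS/FSR
# quality modes (assumed here; individual games may differ).
TARGET_W, TARGET_H = 2560, 1440  # the resolution you selected

MODES = {
    "Quality":     0.667,  # ~1707x960 internal render
    "Balanced":    0.58,
    "Performance": 0.50,   # ~1280x720 internal render
}

for name, scale in MODES.items():
    w, h = int(TARGET_W * scale), int(TARGET_H * scale)
    ratio = (w * h) / (TARGET_W * TARGET_H)
    print(f"{name:12} renders {w}x{h} -> only {ratio:.0%} of native 1440p pixels")
```

So in "Performance" mode the GPU only shades about a quarter of the pixels, which is where the big framerate gains come from.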

DLSS is Nvidia's way of doing it with deep learning (AI); FSR is AMD's way, tuned by hand through trial and error, and open source. Recently Nvidia got criticized because their latest DLSS 3.0 inserts an additional "fake" frame between your two actually rendered frames to produce higher average FPS scores, but the side effect is noticeably higher latency and a sluggish feel in games that use it. In simple words: you see over 100 FPS average on the screen, but it can feel like you're playing at 30 FPS or less.

Nvidia wanted that technology to be available only on their RTX series of GPUs, but I guess they didn't expect AMD to go all out with FSR and provide support all the way back to the GTX 10 series. Now, instead of playing at 60-80 FPS with my GTX 1070, I get 100-140 FPS average with minimal downsides. This is the only thing that still keeps people from buying the insanely overpriced RTX 3000/4000 GPUs... except, of course, for the people who have no idea about it and pour $500+ into an RTX 3060.



3 hours ago, QuantumSingularity said:

DLSS and FSR are basically the same idea, just executed differently. They render each frame at a lower resolution than the one you selected, which helps your GPU pump out more frames, and then upscale the result to the selected resolution. [...]

Thanks for the useful info! I think I get the gist of it now :)


11 hours ago, QuantumSingularity said:

DLSS and FSR are basically the same idea, just executed differently. They render each frame at a lower resolution than the one you selected, which helps your GPU pump out more frames, and then upscale the result to the selected resolution. [...]

Oh yeah, another thing that hadn't crossed my mind until now: since I'm still on a Haswell CPU, what settings should I lower or disable for CPU-bound games? Does it even matter if you change anything in the video options (as they mostly affect the GPU)?


Usually I just disable motion blur.
Sometimes I mess with V-Sync.

For aggressive FPS optimization, I first select a lower settings preset like medium or low.
If that doesn't do enough, I drop the resolution.


On 2/19/2023 at 11:50 AM, QuantumSingularity said:

DLSS is Nvidia's way of doing it with deep learning (AI); FSR is AMD's way, tuned by hand through trial and error, and open source. Recently Nvidia got criticized because their latest DLSS 3.0 inserts an additional "fake" frame between your two actually rendered frames to produce higher average FPS scores, but the side effect is noticeably higher latency and a sluggish feel in games that use it. In simple words: you see over 100 FPS average on the screen, but it can feel like you're playing at 30 FPS or less.

That's a bit of an exaggeration. There is a bit more input lag, but for the games it's used in so far (mostly story-driven singleplayer) it doesn't really matter if you have a few ms more lag. The technique should ideally be used when the game already runs at 60+ fps without it; lower framerates result in very noticeable artifacting. But then it'll basically run a 60 fps game at 120 fps: input lag will feel similar to 60 fps, but you still get the added motion smoothness of 120 fps. And if you don't want to make that trade, you can just enable regular DLSS without frame generation in any game that supports DLSS 3.0 and get basically free performance at the "quality" preset.
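To make the trade-off concrete, here's a very rough sketch (a simplified model I'm assuming; real pipelines differ, and things like Reflex offset part of the added delay):

```python
# Simplified latency model for frame generation (an assumption, not exact):
# the GPU holds back one rendered frame so it can interpolate between two,
# so input latency tracks the *rendered* frame rate plus roughly one extra
# rendered frame of buffering, even though the displayed frame rate doubles.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

rendered_fps = 60.0                 # what the GPU actually renders
displayed_fps = rendered_fps * 2    # what the FPS counter shows with FG

base_latency = frame_time_ms(rendered_fps)                 # ~16.7 ms at 60 fps
fg_latency = base_latency + frame_time_ms(rendered_fps)    # + one held-back frame

print(f"displayed {displayed_fps:.0f} fps, but input latency is roughly "
      f"{fg_latency:.1f} ms instead of ~{base_latency:.1f} ms")
```

Which is why it feels fine on top of 60+ fps but much worse on top of 20-30 fps, where one held-back frame costs 33-50 ms.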

 

On 2/19/2023 at 11:50 AM, QuantumSingularity said:

Nvidia wanted that technology to be available only on their RTX series of GPUs, but I guess they didn't expect AMD to go all out with FSR and provide support all the way back to the GTX 10 series. Now, instead of playing at 60-80 FPS with my GTX 1070, I get 100-140 FPS average with minimal downsides. This is the only thing that still keeps people from buying the insanely overpriced RTX 3000/4000 GPUs... except, of course, for the people who have no idea about it and pour $500+ into an RTX 3060.

At 4K, DLSS, XeSS and FSR perform very similarly, with XeSS and DLSS having a slight edge in terms of clarity and DLSS being the best in terms of ghosting.

 

As soon as you go to 1440p or 1080p as the target resolution, FSR and XeSS fall off a lot more than DLSS, which handles lower resolutions much better. So while Nvidia's option is the most locked down, it's also the best in practically any scenario. The reason it doesn't run on older cards is that it runs on Tensor cores, which were only introduced with the RTX 2000 series.

 

Intel's XeSS is also interesting. It has two versions: one that can run on any GPU, and one that runs on Intel's GPUs and uses their specific hardware acceleration. The accelerated one looks and performs significantly better than the universal one.

 

 

 

So, back on topic: since you're on a 900-series card, DLSS is out of the picture. In supported games I'd suggest trying out FSR. Other than that, you can often turn down volumetric fog for significant performance gains without much visual impact. After that, lighting and shadows are the most demanding. But the general rule of thumb is: play on high, not ultra. There's generally next to no visual difference, and high performs 20-30% better.

If someone did not use reason to reach their conclusion in the first place, you cannot use reason to convince them otherwise.


5 hours ago, Stahlmann said:

So, back on topic: since you're on a 900-series card, DLSS is out of the picture. In supported games I'd suggest trying out FSR. [...] But the general rule of thumb is: play on high, not ultra. There's generally next to no visual difference, and high performs 20-30% better.

Got it! Thanks for the clarification, and great response Stahl!

 

But regarding my post about CPU-bound games:

20 hours ago, MiLik97 said:

Oh yeah, another thing that hadn't crossed my mind until now: since I'm still on a Haswell CPU, what settings should I lower or disable for CPU-bound games? Does it even matter if you change anything in the video options (as they mostly affect the GPU)?

Which settings do you think should be lowered (or disabled) to take "pressure" off the CPU and gain more FPS?

 

This is especially for games like Total War, ArmA, Civilization, console emulators, etc.


Mostly lighting stuff. Volumetric lighting on high, low or medium makes no visual difference in most cases, other than my card getting louder, so there's no point.

 

This is also purely subjective, but I would *never* turn off AA; it's one of the most important settings to have on, often with almost no performance impact. The only time I'd consider turning it off is when playing something supersampled at 4K, but honestly, nah, I still typically leave it on, 2x at least. Why see jaggies when they're so easily avoidable (to 99%)?
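As a toy illustration of why supersampling kills jaggies, here's a small Python sketch using Pillow (a hypothetical standalone example, nothing game-specific): it rasterizes the same diagonal line once at target resolution and once at 4x, then averages the big one down.

```python
# Supersampling (SSAA) in miniature: render big, then average down.
from PIL import Image, ImageDraw

W, H, SS = 200, 200, 4  # output size and supersample factor

# Aliased version: rasterize directly at the target resolution.
aliased = Image.new("L", (W, H), 255)
ImageDraw.Draw(aliased).line((0, H, W, 0), fill=0, width=3)

# Supersampled version: rasterize 4x larger, then downscale.
big = Image.new("L", (W * SS, H * SS), 255)
ImageDraw.Draw(big).line((0, H * SS, W * SS, 0), fill=0, width=3 * SS)
smooth = big.resize((W, H), Image.LANCZOS)  # averages many samples per pixel

aliased.save("aliased.png")       # stair-stepped edge
smooth.save("supersampled.png")   # blended, smooth-looking edge
```

The downscale blends several covered/uncovered samples into each output pixel, which is the same basic idea MSAA applies selectively at triangle edges.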

 

 

On 2/18/2023 at 8:22 PM, MiLik97 said:

which AA method would you recommend that doesn't have a big performance hit (other than FXAA, which looks ugly to me)? Also, how many samples should I use (2x, 4x...)?

I prefer MSAA (https://en.m.wikipedia.org/wiki/Multisample_anti-aliasing), which has a high performance impact though. And honestly? FXAA doesn't look too bad in most games. I typically don't even change AA from the default at all unless I notice some irregularities... many games seem to use "DLSS" as their AA solution now ¯\_(ツ)_/¯

 

That's also what's really problematic about this topic: you're asking something that can't really be answered in a one-size-fits-all way, because every game is different, and most of these solutions and settings vary wildly between games.

 

The direction tells you... the direction

-Scott Manley, 2021

 

Software used:

Corsair Link (Anime Edition) 

MSI Afterburner 

OpenRGB

Lively Wallpaper 

OBS Studio

Shutter Encoder

Avidemux

FSResizer

Audacity 

VLC

WMP

GIMP

HWiNFO64

Paint

3D Paint

GitHub Desktop 

Superposition 

Prime95

Aida64

GPUZ

CPUZ

Generic Logviewer

 

 

 


On 2/20/2023 at 8:01 PM, MiLik97 said:

But regarding my post about CPU-bound games:

Which settings do you think should be lowered (or disabled) to take "pressure" off the CPU and gain more FPS?

 

This is especially for games like Total War, ArmA, Civilization, console emulators, etc.

To reduce CPU load you'd have to reduce view distance or crowd density: generally, anything that means fewer NPCs and dynamic objects have to be processed. Minecraft and Arma, for example, let you set the view distance, so if you're playing scenarios where you don't have to see 5 km into the distance, drop it to 1500 m or so. Cyberpunk, for example, lets you drop the NPC crowd density.
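To illustrate why view distance matters so much, here's a rough sketch using Minecraft-style chunks (the quadratic scaling is the point; the exact per-chunk cost is an assumption):

```python
# View distance scales CPU work quadratically: a view distance of r chunks
# loads a square of (2r + 1) x (2r + 1) chunk columns around the player.
def loaded_chunks(view_distance: int) -> int:
    side = 2 * view_distance + 1
    return side * side

for r in (32, 16, 8):
    print(f"view distance {r:>2} chunks -> {loaded_chunks(r):>5,} chunk columns")

# view distance 32 -> 4,225 chunk columns
# view distance 16 -> 1,089 (roughly a quarter of the work)
# view distance  8 ->   289
```

Halving the view distance cuts the number of simulated chunks to roughly a quarter, which is why it's usually the single biggest CPU lever a game gives you.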

 

But sadly, most games don't offer these kinds of settings to reduce CPU load. In games where you don't find them, you might be able to find mods that do the same, though.



On 2/20/2023 at 7:01 PM, MiLik97 said:

But regarding my post about CPU-bound games: which settings do you think should be lowered (or disabled) to take "pressure" off the CPU and gain more FPS? [...] This is especially for games like Total War, ArmA, Civilization, console emulators, etc.

Just remember, DLSS is only available on RTX 20 series cards and up (so 20 series, 30, and 40).

 

It isn't available on the GTX 10 range.

