
An idiot's guide to game settings.

Morrie Sells Wigs

So, I'm very new to pc gaming, guys. 

 

I have been a console gamer for nearly 30 years, so the sheer degree to which you can tailor your settings on PC is taking some getting used to.

 

I have an i5-10400F, an RTX 2060, and 16GB of RAM, plus a 4K 60Hz monitor and a 1440p 144Hz monitor. I have been able to run games like Destiny 2 and The Division at a minimum of 100-120fps on high(est) settings, but I often see articles or comments saying to "turn X down" or "turn Y down" because it doesn't make much difference to the overall quality.

 

There are unfamiliar things like Ambient Occlusion and TAA, plus all sorts of options for shadows and draw distance, which my console-gamer brain has to learn from scratch.

 

I know what some of them are, and can make a rough guess at how they broadly affect the game - but if I come to play more intensive, up-to-date titles, where I may well need to lower some settings to maintain a decent frame rate, I'm pretty clueless about what to do.

 

Are there certain settings that are better to max out, and others that aren't as important?

 

Is it better to improve anti-aliasing at the expense of, say, shadow quality?

 

Are there certain games where, say, draw distance isn't as important, and others where it is?

 

Ideally, I'd like a good frame rate (60-70ish) and a clean image, with nothing jarring or distracting, even if it isn't as richly detailed as it might be on ultra.

 

I hope I'm making some sense?

 

I'd be really grateful if some of you guys could offer me some advice on settings that can be lowered with minimal impact on the look of the game.

 

What are the most GPU/CPU hungry settings?

 

Which should you turn down to improve frame rate?

 

Is it better to go ultra at, say, 1080p than medium at 1440p?

 

Anything else you guys can think of to add would also be greatly appreciated.

 

Thanks in advance.

 

 


Each game is different, so you've really got to try them one by one to know for sure.

 

Most of the time I just use presets. Not getting enough FPS? Drop to a lower preset.

 

If I do tweak, it's mostly anti-aliasing and texture quality, and I turn off motion blur and the vignette effect.

-sigh- feeling like I'm being too negative lately


3 minutes ago, Moonzy said:

Each game is different, so you've really got to try them one by one to know for sure.

 

Most of the time I just use presets. Not getting enough FPS? Drop to a lower preset.

 

If I do tweak, it's mostly anti-aliasing and texture quality, and I turn off motion blur and the vignette effect.

When you say presets, do you mean the PC will automatically adjust them for me, and that they're the optimal settings?

 

I thought maybe they were slightly reduced by the computer and weren't actually the best option. 

 

I do turn off motion blur and film grain as I hate them.

 

I assume the quality of your monitor also plays a part in the overall quality of your image?


1 minute ago, Morrie Sells Wigs said:

When you say presets, do you mean the PC will automatically adjust them for me, and that they're the optimal settings?

You can do this with GeForce Experience, but the presets I meant were the ones built into games, like "ultra" or "low".

-sigh- feeling like I'm being too negative lately


2 minutes ago, Moonzy said:

You can do this with GeForce Experience, but the presets I meant were the ones built into games, like "ultra" or "low".

I see.

 

I must admit, as much as there is to take on board initially, I do like the sheer wealth of options.

 

The number of times I used to wish I could tailor settings on PS4 to my personal taste - and it was never an option.


Here's a good set of general guidelines. The vast majority of titles are designed with consoles in mind. The current generation of consoles - the PS4 and Xbox One - don't have a lot of processing power, so they run games at roughly 'medium' PC settings, and the games are designed with those settings in mind.

 

Which is why medium settings are also the best-optimized - in other words, they have the best ratio of visual quality to FPS.

 

The higher the settings go, the more subtle the improvements, and the more they'll cost you. 'Ultra settings' are basically a joke, designed to cripple your performance and trick you into buying more expensive GPUs.

 

So as a guideline, start with medium. The first thing you want to turn up is Anisotropic Filtering, which isn't too performance-intensive and provides a significant visual improvement.

 

At lower resolutions, experiment with the anti-aliasing. It matters less at higher resolutions, where the smaller pixels make jagged edges less visible.

 

At higher resolutions, low-quality textures become more apparent. I'd recommend turning them up until the blurriness is no longer obvious. Bear in mind that a perceptible in-game difference between 'high' and 'ultra' textures is extremely rare.

 

Ambient Occlusion is a controversial topic. I'll argue that the benefits of HBAO over SSAO are obvious when viewing comparison screenshots, but in gameplay they're too subtle to justify the significant FPS cost of HBAO.
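
To make that priority order concrete, here's a minimal sketch in Python of the "raise the cheap, high-impact options first" idea. The settings list and the gain/cost numbers are made-up placeholders for illustration, not measurements - real values vary per game, scene, and GPU.

```python
# Illustrative only: "gain" and "cost" are invented placeholder scores,
# not benchmarks. Real numbers vary per game, scene, and GPU.

# Each entry: (setting, visual gain when raised above medium, FPS cost).
SETTINGS = [
    ("anisotropic_filtering", 8, 1),  # big sharpness win, nearly free
    ("texture_quality",       7, 2),  # cheap while VRAM isn't full
    ("anti_aliasing",         6, 5),  # matters most at lower resolutions
    ("shadow_quality",        4, 7),  # expensive, diminishing returns
    ("ambient_occlusion",     3, 6),  # HBAO vs SSAO: subtle in motion
    ("ultra_extras",          1, 9),  # the 'ultra' tier: tiny gain, huge cost
]

def upgrade_order(settings):
    """Sort settings by visual gain per unit of FPS cost, best first."""
    return sorted(settings, key=lambda s: s[1] / s[2], reverse=True)

# Starting from a medium baseline, raise settings in this order until
# the frame rate drops below your target.
for name, gain, cost in upgrade_order(SETTINGS):
    print(f"{name:22} gain/cost = {gain / cost:.2f}")
```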

 

A great video on why today's 'ultra settings' are a joke:

 

i5 12600KF | Zotac RTX 4080 Gaming Trinity | Team Vulcan 2x16GB DDR4 3600 | ASRock Z690M-ITX/ac | WD Black SN850x 2TB

Cooler Master NR200P v2 | ID Cooling Zoomflow 280 XT | SeaSonic Focus SGX-750 | Thermalright 2x140mm + 2x120mm aRGB

LG C2 OLED 48" 120Hz | Epomaker TH80 (Gateron Yellow) | Logitech MX Master 3 | Koss Porta Pro Comm


All of the above depends highly on the game.

 

Things like texture quality usually affect your FPS very little, unless your GPU's memory fills up.

Things like Shadows and Ambient Occlusion affect the FPS much more.

Draw distance usually has a moderate effect on FPS.
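
A rough back-of-the-envelope calculation shows why texture quality is more of a VRAM question than a GPU-compute one. This Python sketch assumes BC7-style compression (about one byte per texel) and a hypothetical count of resident textures:

```python
# Rough size of one compressed texture: BC7 stores ~1 byte per texel,
# and the mipmap chain adds about a third on top of the base level.
def texture_vram_mb(side_px, bytes_per_texel=1.0):
    base_bytes = side_px * side_px * bytes_per_texel
    return base_bytes * 4 / 3 / 1024**2

for side, tier in ((2048, "'high'"), (4096, "'ultra'")):
    per_tex = texture_vram_mb(side)
    # 300 resident textures is a made-up scene size, for illustration.
    total_gb = 300 * per_tex / 1024
    print(f"{side}x{side} {tier}: {per_tex:5.1f} MB each, "
          f"~{total_gb:.1f} GB for 300 textures")

# Sampling cost barely changes with texture resolution, but each tier
# roughly quadruples resident size - the FPS hit appears only once
# VRAM overflows and textures spill into system memory.
```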

 

If I had a 27" 1440p monitor, I would personally lower other settings before lowering the resolution.

On my 32" 4K monitor, I personally find myself trying to run games at a minimum of 1800p, or an 80% render scale.

One thing I found in Metro Exodus: setting the output resolution to 4K with the render resolution at 80% looked better than simply setting the resolution to 1800p.
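
For what it's worth, the arithmetic behind that observation checks out, assuming the render-scale slider works per axis (as it does in most games):

```python
# Pixels actually rendered: 4K output at 80% render scale vs native 1800p.
scale = 0.8
w, h = int(3840 * scale), int(2160 * scale)  # 3072 x 1728
print(f"4K @ 80% scale: {w}x{h} = {w * h / 1e6:.2f} MP")
print(f"native 1800p:   3200x1800 = {3200 * 1800 / 1e6:.2f} MP")

# 4K @ 80% renders ~8% FEWER pixels (5.31 vs 5.76 MP), yet can look
# better because the game upscales to the panel's native 3840x2160
# instead of sending a non-native 1800p signal for the display to scale.
```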

“Remember to look up at the stars and not down at your feet. Try to make sense of what you see and wonder about what makes the universe exist. Be curious. And however difficult life may seem, there is always something you can do and succeed at. 
It matters that you don't just give up.”

-Stephen Hawking


1 hour ago, Aereldor said:

Here's a good set of general guidelines. The vast majority of titles are designed with consoles in mind. The current generation of consoles - the PS4 and Xbox One - don't have a lot of processing power, so they run games at roughly 'medium' PC settings, and the games are designed with those settings in mind.

 

Which is why medium settings are also the best-optimized - in other words, they have the best ratio of visual quality to FPS.

 

The higher the settings go, the more subtle the improvements, and the more they'll cost you. 'Ultra settings' are basically a joke, designed to cripple your performance and trick you into buying more expensive GPUs.

 

So as a guideline, start with medium. The first thing you want to turn up is Anisotropic Filtering, which isn't too performance-intensive and provides a significant visual improvement.

 

At lower resolutions, experiment with the anti-aliasing. It matters less at higher resolutions, where the smaller pixels make jagged edges less visible.

 

At higher resolutions, low-quality textures become more apparent. I'd recommend turning them up until the blurriness is no longer obvious. Bear in mind that a perceptible in-game difference between 'high' and 'ultra' textures is extremely rare.

 

Ambient Occlusion is a controversial topic. I'll argue that the benefits of HBAO over SSAO are obvious when viewing comparison screenshots, but in gameplay they're too subtle to justify the significant FPS cost of HBAO.

 

A great video on why today's 'ultra settings' are a joke:

 

Thank you, Aereldor, this was exactly the kind of information I was looking for.

👍


12 hours ago, Mihle said:

All of the above depends highly on the game.

 

Things like texture quality usually affect your FPS very little, unless your GPU's memory fills up.

Things like Shadows and Ambient Occlusion affect the FPS much more.

Draw distance usually has a moderate effect on FPS.

 

If I had a 27" 1440p monitor, I would personally lower other settings before lowering the resolution.

On my 32" 4K monitor, I personally find myself trying to run games at a minimum of 1800p, or an 80% render scale.

One thing I found in Metro Exodus: setting the output resolution to 4K with the render resolution at 80% looked better than simply setting the resolution to 1800p.

I've been using my 1440p 144Hz monitor primarily, as I'm much more interested in high frame rates than in 4K resolution.

 

I do have a 4K monitor as well, but it's only 60Hz.

 

It has less input delay, but overall I feel the 1440p/144Hz screen gives me more flexibility.

 

So far, I've been prioritising that stable, higher frame rate over everything else.

 

It's such a revelation playing games without flick-book levels of performance, where the PS4 Pro struggles to maintain its resolution and frame rate while enemies and visual effects smother the screen.

 

I've been playing World War Z, and despite ridiculous numbers of zombies on screen - the kind of scene that would send the console free-falling into the 15fps range - there are no slowdowns or frame drops, no matter how hectic it gets.

 

It's such a contrast.

 

 


57 minutes ago, AldiPrayogi said:

This might be a good video on the topic from Jay:

 

 

Thanks, mate, it was very helpful. 

👍

