Game Optimization

JCBiggs

I'd like to start a discussion about game engine optimization. How is a game optimized, to what end is it optimized (just to Windows? to a specific architecture? etc.), and I'd like to theoretically come up with some sort of "optimization score".

 

What I mean by that is: if you look at "The Last of Us" on PS3, that game ekes the most performance out of the PS3 possible. As far as I know, there is no other software (outside of Linux) that can use the hardware more effectively. So let's just use that as our benchmark and call it 100% optimization.

 

From that 100% standard, how could you rate a newer AAA title on the most current generation of hardware? How about one architecture step back? Is Fallout 76 in the 95th percentile on an 8700K/1080 Ti? The 85th percentile on a 4770? I'm making the assumption here that changes in architecture are going to be the biggest contributing factor; a game can still run on slower hardware and be well optimized.
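To make that concrete, the naive version of what I'm imagining is something like this (the numbers are made up, and the reference value is exactly the part I don't know how to pin down):

#include <cstdio>

int main() {
    // Hypothetical numbers, purely to illustrate the idea of a relative score:
    // what a "100% optimized" reference title achieves on a given piece of
    // hardware vs. what the game being judged achieves with comparable visuals.
    double reference_fps = 30.0;
    double observed_fps  = 25.5;
    double score = 100.0 * observed_fps / reference_fps;
    std::printf("naive optimization score: %.0f%%\n", score); // prints 85%
}

Obviously that ignores visual quality entirely, which is half the problem.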

 

Is there really any way to know how well a game is optimized? Is it possible to create a game-engine-specific "benchmark"?

 

Lots of questions here... your thoughts?


5 minutes ago, JCBiggs said:

EEKS

"ekes"


Console game developers have one advantage that no PC game developer has: a standardized hardware and software configuration. A console game developer can start development without having to worry about supporting countless configurations of hardware and software; there's really only a handful to worry about. And even then, they can still target the minimum base configuration, since any configuration better than that should work. When you know your target platform, from the hardware to the software, you know how it ticks, so you know what you should and should not do. You also gain knowledge of any "shortcuts" that might be handy.

 

Which brings me to what PC game developers have to face: a practically infinite number of combinations, both hardware and software. And in some cases it's not as simple as "we're targeting x86-64." Intel and AMD have different implementations of the architecture, so if you're not careful, you may run into "gotchas" that are fine on AMD platforms but not on Intel ones. Even with, say, AMD or NVIDIA GPUs, if you plan on supporting older architectures like TeraScale, Kepler, or Fermi, you have to account for the nuances of those architectures too, despite the fact that they all talk through the same API.
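To give a rough idea of what dealing with that looks like in code, here's a minimal sketch of the runtime dispatch PC developers end up writing (the blend function is made up for illustration; __builtin_cpu_supports is a GCC/Clang builtin, so MSVC would need __cpuid instead):

#include <cstdio>

// Plain scalar path: runs on any x86-64 CPU.
static void blend_scalar(const float* a, const float* b, float* out, int n) {
    for (int i = 0; i < n; ++i)
        out[i] = 0.5f * (a[i] + b[i]);
}

// A vectorized path would live here; a real engine would use _mm256_* intrinsics,
// which is exactly the code you must never run on a CPU that lacks AVX2.
static void blend_avx2(const float* a, const float* b, float* out, int n) {
    blend_scalar(a, b, out, n); // placeholder body to keep the sketch short
}

using BlendFn = void (*)(const float*, const float*, float*, int);

static BlendFn pick_blend() {
    __builtin_cpu_init(); // initialize CPU feature detection (GCC/Clang)
    return __builtin_cpu_supports("avx2") ? blend_avx2 : blend_scalar;
}

int main() {
    float a[4] = {1, 2, 3, 4}, b[4] = {4, 3, 2, 1}, out[4];
    pick_blend()(a, b, out, 4);
    std::printf("%f\n", out[0]); // 2.5 on every CPU; only the code path differs
}

Multiply that by every feature level, driver quirk, and GPU generation you intend to support, and the testing matrix explodes.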

 

As for an "optimization" score, I don't think you can really quantify one. First you need some quantifiable factors. Then you need some kind of baseline. And finally, to understand why particular software design decisions were made, we'd have to have a deep understanding of the system in question. I don't believe any of us here are qualified to make those calls. One example I like to bring up regarding optimization is Raymond Chen's blog post on how optimization can be counter-intuitive.

 

At best you can claim that some software isn't "optimized" for the platform and present the issues behind why. But even then, I find those claims subjective at best. Let's put it this way: if the software runs about as well on one platform as it does on another, I don't think it's poorly optimized. Just because you want higher FPS and all the bells and whistles doesn't mean someone else does or cares.


3 hours ago, M.Yurizaki said:

 

...snip...

Some games are just poorly optimized across all platforms. :P

Think about all the console games that don't meet whatever target they were going for (1080p60, 4K30, 1080p30 in some cases). Hate to say it, but if they can't get the consoles to do what they want, I'm calling that bad optimization. Like you said, they're all standard and equal. The only reasons a game won't hit its framerate or resolution targets are that they're trying to cram too much into each frame, they didn't test it very thoroughly, or they just have bad LODs or something.

 

There are programs out there that are designed to render all over the game world and give framerate analysis, so devs know where they need to back off or where they can add more stuff. I'm not going to source it because I don't want to look for it, but I remember watching a video about Forza Horizon 2 where they showed the game world with framerates overlaid in a sort of topographical map. They could then look at any areas that were too performance-heavy and figure out how to get them stable.
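As a hedged sketch of how a tool like that can work (not how Playground Games actually did it, just the general idea): record the frame time at sampled world positions, then bucket the samples into a coarse grid you can colour as a heatmap.

#include <cstdio>
#include <vector>

// One measured sample: a world position plus the frame time recorded there.
struct Sample { float x, z, frame_ms; };

// Bucket samples into a coarse grid over the world and keep the worst frame
// time seen in each cell; the result maps directly onto a heatmap overlay.
std::vector<float> build_heatmap(const std::vector<Sample>& samples,
                                 float world_size, int cells) {
    std::vector<float> worst_ms(cells * cells, 0.0f);
    for (const Sample& s : samples) {
        int cx = static_cast<int>(s.x / world_size * cells);
        int cz = static_cast<int>(s.z / world_size * cells);
        if (cx < 0 || cz < 0 || cx >= cells || cz >= cells) continue;
        float& cell = worst_ms[cz * cells + cx];
        if (s.frame_ms > cell) cell = s.frame_ms;
    }
    return worst_ms; // cells above ~16.7 ms (60 fps) or ~33.3 ms (30 fps) need work
}

int main() {
    std::vector<Sample> samples = {{100, 200, 14.2f}, {105, 210, 18.9f}, {900, 900, 31.0f}};
    std::vector<float> map = build_heatmap(samples, 1000.0f, 10);
    std::printf("worst frame time in cell (1,2): %.1f ms\n", map[2 * 10 + 1]);
}

The real tools obviously automate the flythrough and render the grid on top of the map, but the data side really is about that simple.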

 

Or you could do what most Ubisoft titles do and... none of that, and have the game perform terribly on all systems. Another good sign of poor optimization on PC is either a lack of graphical settings or a very small difference between the low and high settings.

Again, sorry for not referencing, but I was having trouble with Ghost Recon Wildlands (shockingly, a Ubisoft title...) and found a site online that had framerate benchmarks for all the graphical settings, independently and as a whole. The difference from Low to Ultra was very minimal.

Never mind, it was the first Google result...

https://rog.asus.com/articles/gaming/ghost-recon-wildlands-graphics-performance-guide/

Every option from low to high has practically zero effect on framerate, even though the actual image quality is vastly different.


1 hour ago, JZStudios said:

-Snip-

I'll stress, however, that we as observers can't really make these calls or give applications "optimization" scores, because we're observing everything from the outside and based on our own definition of requirements. Unless you have access to the game developer's repository of requirements or similar, at best we're just making guesses. If we're lucky, we'll get someone speaking about what sort of performance target they're aiming for. But I don't think these are hard requirements. Saying "we're targeting 30 FPS" doesn't mean "we guarantee 30 FPS." At best it means "you'll get 30 FPS on average."

 

Taking OP's example of The Last of Us, what were the optimizations they were talking about? How much detail they could cram in while maintaining some performance level? Well, how do you quantify that detail without invasive tools? If it's simply a performance metric, then we may as well tell developers that producing beautiful graphics is a waste of time; we're fine playing CS:GO with box models as long as it runs at a bajillion FPS.

 

I would also point out that the console market doesn't really care much about performance as long as it's playable. They just want to play games.

 

Regarding the link to Ghost Recon: Wildlands, the graphs where quality doesn't appear to affect performance are that way because they ran the setup at 4K, which I would expect to be crushing regardless of quality. It's also likely they changed only that one setting at a time. The graphs below that used presets and different resolutions, and changing the presets does show an appreciable performance difference.

 

Either way, at the end of the day, if you want to keep score on something like this, you have to have quantifiable requirements you can test against. Saying "it must perform well" is a bad requirement; it's not quantifiable. Saying "it must hit 30 FPS at 1080p with 2K textures, 10M polygons, four light sources, 10 render targets, shadowing at half screen resolution, etc." is a quantifiable requirement. And even then, I'm not sure it's fair to judge a game based entirely on how well it performs.
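To make the difference between "you'll get 30 FPS on average" and "it never drops below 30 FPS" concrete, here's a tiny sketch with made-up frame times; the same capture passes one reading of the requirement and fails the other:

#include <algorithm>
#include <cstdio>
#include <vector>

int main() {
    // Hypothetical frame times in milliseconds from a short capture.
    std::vector<double> frame_ms = {30, 31, 29, 33, 32, 30, 55, 31, 30, 29};

    double total = 0.0;
    for (double ms : frame_ms) total += ms;
    double avg_fps = 1000.0 * frame_ms.size() / total;

    // The worst frame decides whether the "never drops below" claim holds.
    double worst_ms = *std::max_element(frame_ms.begin(), frame_ms.end());
    double worst_fps = 1000.0 / worst_ms;

    std::printf("average: %.1f FPS, worst frame: %.1f FPS\n", avg_fps, worst_fps);
    // Prints roughly "average: 30.3 FPS, worst frame: 18.2 FPS"; the capture
    // meets "30 FPS on average" but clearly not "never below 30 FPS".
}

Which of those two the requirement actually meant is exactly the kind of thing we outsiders never get to see.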

 

And as a software developer, I don't really like to call an application "poorly" or "well optimized." Rather, I'd call these "poorly" or "well designed." An optimization is platform-specific. Any well-designed game should require minimal optimization to run "well" regardless of platform.


1 hour ago, M.Yurizaki said:

-snip-

It was individual settings, but even still, almost none of them made any real discernible difference. For example, I currently have a 2 GB 960, which that game maxes out on VRAM almost instantly, yet lowest settings and high give damn near the same performance. Ultra is more taxing, sure, but most other settings don't do much.

I still believe if you're not hitting your framerate target on console you need to figure out how to optimize your engine, or start reducing and cutting back on things.

 

What I'm telling you is that the Forza series aims for a locked 60 FPS and the Horizon series for a locked 30 FPS, and they almost never (if ever) drop below that, and they keep adding more features and optimizations as time goes on. Don't know what exactly the issue is on PC, other than likely the stupid Denuvo.

You can also look at Horizon Zero Dawn which doesn't seem to ever drop below 28, which is pretty uncommon in itself.

I would call that good optimization vs. Fallout's poor optimization.

If a game has objectively worse visual features, textures, models, lighting, etc., and still performs poorly/worse, I'd call that poor optimization.


1 hour ago, JZStudios said:

-snip-

This is a case of platform-specific optimization vs. needing to develop for multiple platforms.

Horizon: Zero Dawn makes use of an incredibly specialized codebase to push every damn ounce of power it reasonably can out of the PS4 and the PS4 Pro... which is why the game makes use of checkerboard rendering on the Pro.

Fallout 4 didn't get the same kind of console-optimization benefits, for numerous reasons.

 

1. The game is older than Horizon, quite a bit so, and a year or two means a lot in the console space. It's what differentiates Gran Turismo 3: A-Spec from Gran Turismo 4 in terms of development eras. GT3 is very clearly an early PS2 game, an attractive one, yes, but an early game; GT4 Prologue is very obviously a later-era PS2 game, let alone the final GT4. Now, those are just exclusives, but the gist applies to multi-plats, too.

 

2. Making a game multi-platform automatically means it won't run as well as if it were only on one platform. First, you don't want to include features that one of the consoles can't use because of its API, or that would work notably differently there. Second, simple hardware quirks that you could exploit on one console could potentially break on another platform.

 

3. Some engines are simply rooted in their origins, and the Source engine is the clearest example of this. Team Fortress 2 has very poor multi-core support, in part because of the way the version of the engine it runs on handles multiple cores. By default, several features that can make use of multi-threading are disabled, and even when they are enabled, the game simply can't push those cores well. TF2's lack of multi-core support is rooted all the way back in Half-Life 2... so why do later Source games handle more than one core decently? You can trace that to the Orange Box and, more specifically, Left 4 Dead. L4D was a huge optimization push for the Xbox 360, where maps were cleverly designed and the hardware load spread out to make good use of the CPU and the GPU. Fallout 4 was a somewhat messy update of the engine it ran on, which goes back to those same roots. It was built with exploration-at-all-costs in mind, with feature sets that weren't necessarily ready for the engine or were simply way too taxing. Also, modding.


2 hours ago, JZStudios said:

It was individual settings, but even still, almost none of them made any real discernible difference.

Because they ran the tests at 4K. The ones where they ran it at lower resolutions did show an appreciable performance impact. Besides, a lot of the settings either aren't dependent on GPU performance (texture detail), were designed with negligible performance impact in mind (FXAA, possibly TAA, ambient occlusion), or are essentially free (anisotropic filtering). But again, these were run at 4K, which is still a punishing resolution to run at. Otherwise, explain the appreciable performance differences in the following 1080p performance graphs:

[Graphs from the ROG article: 1080p average framerates across quality presets]

 

Quote

For example, I currently have a 2 GB 960, which that game maxes out on VRAM almost instantly, yet lowest settings and high give damn near the same performance. Ultra is more taxing, sure, but most other settings don't do much.

Ghost Recon: Wildlands is known to be a taxing game on the CPU. So without knowing the rest of your system, this would indicate a performance bottleneck more than bad software development design.

 

Quote

I still believe if you're not hitting your framerate target on console you need to figure out how to optimize your engine, or start reducing and cutting back on things.

And that's what developers do. But that doesn't mean they'll do it all the time just to hit the target 100% of the time.

 

Quote

What I'm telling you is that the Forza series aims for a locked 60 FPS and the Horizon series for a locked 30 FPS, and they almost never (if ever) drop below that, and they keep adding more features and optimizations as time goes on. Don't know what exactly the issue is on PC, other than likely the stupid Denuvo.

Because, again, PCs have no standard hardware or software configuration. The few configurations developers do have to test with are likely not what you have. And considering a lot of PC gamers tweak whatever they can without understanding exactly what the side effects are, the probability of someone running into some issue is effectively 100%. And everything you can tweak is a factor in whether or not you run into issues.

 

Quote

You can also look at Horizon Zero Dawn which doesn't seem to ever drop below 28, which is pretty uncommon in itself.

I would call that good optimization vs. Fallout's poor optimization.

It is good optimization, but you can't plop HZD on a PC and expect the same performance, because everything was tuned with the PS4 in mind, not a Windows PC. And it's not even just the different API or OS being used, but the fact that the PS4 is a different architecture on a system level. For all we know, HZD could be relying heavily both on HSA and on the system RAM being GDDR5.

 

Quote

If a game has objectively worse visual features, textures, models, lighting, etc., and still performs poorly/worse, I'd call that poor optimization.

Which is fine and all, but it's still to your standards.


1 hour ago, M.Yurizaki said:

 

snipping..

If by "appreciable" you mean... somewhere around 5 FPS then.. sure. I concede, massively optimized.

I'm not saying GRWL ISN'T CPU heavy, but I can't really fathom why. ARMA 3 is also massively CPU limited and graphics settings don't really have a major impact on FPS, but that game also does a lot of physics effects and other things I know eat CPU resources. None of which GRWL really has. So... I guess yeah it comes down to "Why does it perform so poorly?"

Another example for me would be Steep, another Ubisoft title. It also performs poorly on my system, and let's not beat around the bush here, it's an empty mountainside. Nothing is... super particularly detailed even on high, but at the lowest settings I still only managed a stuttery framerate in the mid 20s. The LODs aren't super far.. it just shouldn't be as "demanding" as it is.

You could also go for Assassin's Creed Unity or whatever, another Ubisoft title that performs horrendously. But maybe that's also just my perception of it.

 

You're talking to me like I'm an idiot.

I understand that there are different configurations for PC. Even still, when games "aim" for 30 FPS on console and instead get low, stuttery 20s, that's some dog shit optimization.

The "objective" method of measuring how well optimized something is, is by comparing visuals and framerate. The Witcher 3 is decently optimized, and from low settings to ultra has a huge fps impact, more than the 5-10 of GRWL or Steep. It is however not amazing on console, and I seriously doubt they couldn't drop some form of AO, shadow quality, whatever to get a more stable framerate. Not that that's necessarily the best way to optimize.

 

I'll also add that I played Deus Ex: Mankind Divided on my 960 (I'm using it as a comparatively similar release time frame... kind of... my Steam library isn't very large, especially for newer games), which would max out my 960 easily, but I could fiddle with the settings and get a decent, usable framerate. I have no idea how it performed on console. I mean, it wasn't the best optimized on PC, and making a body pile in one section caused the game world to unload around me and the framerate to drop to single digits.

I'm not saying that GRWL should run at ultra on my 960, but when I get 20 fps on ultra settings, and 25-30 on low, it's doing something wrong.


2 minutes ago, JZStudios said:

If by "appreciable" you mean... somewhere around 5 FPS then.. sure. I concede, massively optimized.

The graph shows High floats around 75 FPS while Very high is around 60-62 FPS.

Quote

I'm not saying GRWL ISN'T CPU heavy, but I can't really fathom why. ARMA 3 is also massively CPU limited and graphics settings don't really have a major impact on FPS, but that game also does a lot of physics effects and other things I know eat CPU resources. None of which GRWL really has. So... I guess yeah it comes down to "Why does it perform so poorly?"

GRWL is running physics and "other things", whatever those are. It's also simulating basic crowd and population behavior like GTA V.

Quote

Another example for me would be Steep, another Ubisoft title. It also performs poorly on my system, and let's not beat around the bush here, it's an empty mountainside. Nothing is... super particularly detailed even on high, but at the lowest settings I still only managed a stuttery framerate in the mid 20s. The LODs aren't super far.. it just shouldn't be as "demanding" as it is.

You could also go for Assassin's Creed Unity or whatever, another Ubisoft title that performs horrendously. But maybe that's also just my perception of it.

And that's what you're doing. You're judging these games based on your hardware. And I don't know what hardware you have other than a GTX 960. If the rest of the system is closer to minimum specifications, then it's not going to run as well as you hope. But as for my system, GRW does this to the CPU:

 

[Screenshot: CPU usage while running Ghost Recon Wildlands]

 

This is on the cusp of a CPU bottleneck, with my CPU nearly maxed out. It's not quite there yet, but I wouldn't be surprised if a video card upgrade didn't boost performance as much as I'd want it to.
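For what it's worth, the crude way I check which side is the limit is to time the CPU work per frame against the whole frame (a rough sketch; the two commented-out calls stand in for a real game loop, and a real engine would pair this with GPU timestamp queries):

#include <chrono>
#include <cstdio>

using Clock = std::chrono::steady_clock;

int main() {
    for (int frame = 0; frame < 3; ++frame) {
        Clock::time_point frame_start = Clock::now();

        Clock::time_point cpu_start = Clock::now();
        // simulate_and_submit();   // hypothetical: game update + draw-call submission
        Clock::time_point cpu_end = Clock::now();

        // present();               // hypothetical: swap/VSync wait; the GPU catches up here
        Clock::time_point frame_end = Clock::now();

        double cpu_ms   = std::chrono::duration<double, std::milli>(cpu_end - cpu_start).count();
        double frame_ms = std::chrono::duration<double, std::milli>(frame_end - frame_start).count();

        // If cpu_ms fills nearly all of frame_ms, the CPU is the limiting factor;
        // a large gap means the frame is mostly waiting on the GPU (or VSync).
        std::printf("frame %d: cpu %.2f ms of %.2f ms total\n", frame, cpu_ms, frame_ms);
    }
}

With GRW hammering the CPU like that, I wouldn't expect a GPU swap alone to change the picture much.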

 

Quote

You're talking to me like I'm an idiot.

I understand that there are different configurations for PC. Even still, when games "aim" for 30 FPS on console and instead get low, stuttery 20s, that's some dog shit optimization.

And if you have 50 Chrome tabs open, all doing something in the background (which I'm not saying is what you specifically do), that eats into CPU time that could've been spent on the game. But again, I don't know what the rest of your system is, so I can't really make that call.

 

Quote

The "objective" method of measuring how well optimized something is, is by comparing visuals and framerate. The Witcher 3 is decently optimized, and from low settings to ultra has a huge fps impact, more than the 5-10 of GRWL or Steep. It is however not amazing on console, and I seriously doubt they couldn't drop some form of AO, shadow quality, whatever to get a more stable framerate. Not that that's necessarily the best way to optimize.

 

I'll also add that I played Deus Ex: Mankind Divided on my 960 (I'm using it as a comparatively similar release time frame... kind of... my Steam library isn't very large, especially for newer games), which would max out my 960 easily, but I could fiddle with the settings and get a decent, usable framerate. I have no idea how it performed on console. I mean, it wasn't the best optimized on PC, and making a body pile in one section caused the game world to unload around me and the framerate to drop to single digits.

All this implies that both games are more sensitive to GPU performance than CPU performance (https://www.techspot.com/review/1006-the-witcher-3-benchmarks/page5.html and https://www.techspot.com/review/1235-deus-ex-mankind-divided-benchmarks/page5.html).

 

Quote

I'm not saying that GRWL should run at ultra on my 960, but when I get 20 fps on ultra settings, and 25-30 on low, it's doing something wrong.

Except it doesn't (taken from https://www.dsogaming.com/pc-performance-analyses/tom-clancys-ghost-recon-wildlands-pc-performance-analysis/2/)

[Graphs from DSOGaming: Ghost Recon Wildlands GPU benchmarks at Low, High, and a custom Very High preset]

 

And this graph shows that the game is more sensitive to CPU performance than GPU performance:

[Graph from DSOGaming: Ghost Recon Wildlands CPU benchmark at Low settings]

 

