
NVIDIA reportedly working on 'AI-optimized drivers' with performance improvements of up to 30%

8 hours ago, LAwLz said:

I don't think the performance claims are that impossible.

The "up to 30% in certain games" claim is very believable. We see that all the time with "game ready" drivers.

 

The "10% average" claim is far more unlikely to be true, but I think we have seen that before as well. Didn't AMD discover that their drivers did something in a very unoptimal way back in like 10 years ago and fixing it gave them a quite significant boost in performance across the board? I can't remember if it had to do with microstuttering or the 1%-lows. I think it was around the time when PCPer started measuring frame consistency and benchmarking sites in general moved away from just measuring average FPS.

 

But like I said earlier, I strongly believe that this is just some leaker trying to drum up engagement by slapping the word "AI" on a rumor that might not even be true to begin with.

There are sometimes games where a driver update can significantly increase performance, but saying that AI alone will be responsible for 10% to 30% higher performance is too big a claim. Those cases are quite rare. And is it 10% to 30% once, or consistently as newer and newer drivers are released?

 

NVIDIA recently released a driver that significantly improves DX12 performance in some games (it does not apply to all games, of course), and it does so by optimizing the DX12 pipeline; see here:

https://www.nvidia.com/en-us/geforce/news/geforce-rtx-4090-game-ready-driver/

 

However, these kinds of improvements are rare, and the gains also depend on how close to fully optimized the driver already is.

 

Compare that to Intel Arc, for example: their hardware is capable of quite a bit more than its actual performance shows, simply because their drivers are far behind AMD's and NVIDIA's in game optimization. They can extract far more benefit from DX9/DX11/DX12 and game-specific optimizations precisely because they are so far behind right now.

 

Compare that to NVIDIA or AMD, which definitely still have some room to improve, but they are far more limited because their drivers are much more mature.

 

And all of this ignores the fact that you may extract 100% of the performance from game optimization on your GPU and in theory get an x% improvement in GPU-bound scenarios, but if the game is a huge CPU hog it still won't matter, because you will never see those improvements. In that case the game's developer is solely responsible for the performance, either through game-specific optimizations or through the game engine itself, which is unfortunately often the case.
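That bottleneck argument can be made concrete with a toy frame-time model (purely illustrative, not how any real engine measures this): the frame takes as long as the slower of the CPU and GPU work, so a driver-side GPU speedup vanishes once the CPU is the limit.

```c
#include <assert.h>

/* Toy model: the frame takes as long as the slower of the CPU and GPU
 * work per frame (assumes no pipelining; purely illustrative). */
static double frame_time_ms(double cpu_ms, double gpu_ms)
{
    return cpu_ms > gpu_ms ? cpu_ms : gpu_ms;
}

/* Apply a driver-side GPU speedup of `pct` percent and return the
 * resulting frame time. */
static double with_gpu_speedup(double cpu_ms, double gpu_ms, double pct)
{
    return frame_time_ms(cpu_ms, gpu_ms * (1.0 - pct / 100.0));
}
```

For example, with 10 ms of CPU work and 8 ms of GPU work per frame, a 30% GPU-side speedup leaves the frame at 10 ms, i.e. no visible gain; lighten the CPU load to 5 ms and the same speedup shows up as roughly 5.6 ms frames.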


On 1/9/2023 at 12:23 PM, WereCat said:

So how do they know there is another 10% to 30% to be had just from driver improvements if they need the AI to find it? I call BS until I see it. They may very well use AI as a tool for devs, which is fine, but I doubt we will see any substantial performance improvements just from drivers.

NVIDIA has had a looooong streak of bad drivers over the past few months, so I hope they can actually fix their issues for now.

If it's true, it's probably because they have already started the process and have it in alpha/beta.

 

I can definitely see "up to" 30% being the case for newly released games, likely getting the same results as they do now but faster and with fewer updates.

AMD 7950x / Asus Strix B650E / 64GB @ 6000c30 / 2TB Samsung 980 Pro Heatsink 4.0x4 / 7.68TB Samsung PM9A3 / 3.84TB Samsung PM983 / 44TB Synology 1522+ / MSI Gaming Trio 4090 / EVGA G6 1000w /Thermaltake View71 / LG C1 48in OLED

Custom water loop EK Vector AM4, D5 pump, Coolstream 420 radiator


10 hours ago, Sauron said:

the announcement is really vague and I doubt the optimization was achieved with AI generated code. it's more likely an "AI" system was fed with a bunch of test scenarios to tweak some priority settings. as far as I know the state of AI for code generation is still pretty abysmal beyond short boilerplate snippets.

I'm not so sure anymore. I tried ChatGPT today and it managed to produce 3 working alternatives for the same code, with different potential optimizations for branch misses. It couldn't say which one was best, but it's still pretty nice. Now it's a matter of training it to learn which optimizations actually perform better than others.


2 minutes ago, Forbidden Wafer said:

I'm not so sure anymore. I tried ChatGPT today and it managed to produce 3 working alternatives for the same code, with different potential optimizations for branch misses. It couldn't say which one was best, but it's still pretty nice. Now it's a matter of training it to learn which optimizations actually perform better than others.

I mean again, snippets.

Don't ask to ask, just ask... please 🤨

sudo chmod -R 000 /*


1 hour ago, WereCat said:

 

And all of this ignores the fact that you may extract 100% of the performance from game optimization on your GPU and in theory get an x% improvement in GPU-bound scenarios, but if the game is a huge CPU hog it still won't matter, because you will never see those improvements. In that case the game's developer is solely responsible for the performance, either through game-specific optimizations or through the game engine itself, which is unfortunately often the case.

Can't this be fixed with drivers, though?


3 hours ago, WereCat said:

but saying that AI alone will be responsible for 10% to 30% higher performance is too big a claim.

Nobody is saying, or has said, that the AI is solely responsible for all of the performance uplift.

 

3 hours ago, WereCat said:

And is it 10% to 30% once, or consistently as newer and newer drivers are released?

I think it's pretty obvious that if this turns out to be true (big IF), it will be a one time thing. I don't think anyone is reading this and thinking "we will get 10% improvement across the board every driver update from now on!"

 

3 hours ago, WereCat said:

And all of this ignores the fact that you may extract 100% of the performance from game optimization on your GPU and in theory get an x% improvement in GPU-bound scenarios, but if the game is a huge CPU hog it still won't matter, because you will never see those improvements. In that case the game's developer is solely responsible for the performance, either through game-specific optimizations or through the game engine itself, which is unfortunately often the case.

There will always be caveats, but I think it is silly to point to those extremes and go "see? it doesn't give X performance!".

By the same logic, we could argue that the 4090 isn't faster than a 1060, because if you pair it with a Pentium II you probably won't see an improvement when playing the latest CoD.

If it turns out to be true, then it would be an average of 10%, assuming a reasonable PC and after removing outliers.


18 hours ago, pas008 said:

Can't this be fixed with drivers, though?

Nope.

 

Drivers are just the middleware between "DirectX" and the actual hardware's native instructions.

 

So whatever optimizations a game made to extract performance out of an NVIDIA card will not be the same for an AMD Radeon card. And since all DX9 and DX10/11 games were released BEFORE Intel had a card, no game will have optimization paths for it.

 

But let's be serious, too: general-purpose game engines like Unreal, Unity, and Godot have to make separate optimizations for different GPUs AND different CPUs. If a game wants to invoke AVX-512 and you have a 12th or 13th gen Intel CPU, you're SOL; it had better fall back to something else. You can't just say "sorry, this game needs X instructions"; that has not gone over well before. We all learned this from the "requires MMX" and "requires SSE" fiascos with certain games (and PS3 emulators are massively sped up by AVX-512). Certain instructions can massively speed things up, but then you limit the hardware the game can run on.
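The fallback approach described above is usually done with runtime dispatch. A toy sketch of the pattern (the probe and kernels here are made up; real code would query CPUID and use actual AVX-512 intrinsics):

```c
#include <assert.h>
#include <stddef.h>

typedef long (*sum_fn)(const int *, size_t);

/* Scalar fallback that runs on any x86 CPU. */
static long sum_scalar(const int *v, size_t n)
{
    long s = 0;
    for (size_t i = 0; i < n; i++)
        s += v[i];
    return s;
}

/* Stand-in for a wide-SIMD path; real code would use intrinsics here. */
static long sum_avx512(const int *v, size_t n)
{
    return sum_scalar(v, n); /* same result, imagine vectorized loads */
}

/* Hypothetical feature probe; real code would query CPUID. */
static int cpu_has_avx512(void)
{
    return 0; /* e.g. a 12th/13th gen Intel desktop CPU */
}

/* Pick an implementation once at startup instead of hard-requiring
 * the instruction set ("sorry, this game needs X instructions"). */
static sum_fn pick_sum(void)
{
    return cpu_has_avx512() ? sum_avx512 : sum_scalar;
}
```

The game always runs, and CPUs that do have the feature get the fast path for free.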

 

On the flip side of that, the drivers themselves shouldn't be optimizing for specific games, because the games may change and that optimization then becomes a blocker for fixing performance issues (e.g. fixing the game may break the driver optimization). So what the drivers really need is a flag, like "run in Windows 8 compatibility mode", to bypass driver-side tampering while developing the game, plus the ability for the end user to turn it off when running benchmarks.

 

It's commonly said that DX12 and Vulkan come closer to letting the game developer "write their own driver", and as such "AI optimization" may play a better role here in determining whether a developer is just not very good at using the DX12/Vulkan model.


20 minutes ago, Kisai said:

Nope.

 

Drivers are just the middleware between "DirectX" and the actual hardware's native instructions.

 

So whatever optimizations a game made to extract performance out of an NVIDIA card will not be the same for an AMD Radeon card. And since all DX9 and DX10/11 games were released BEFORE Intel had a card, no game will have optimization paths for it.

 

But let's be serious, too: general-purpose game engines like Unreal, Unity, and Godot have to make separate optimizations for different GPUs AND different CPUs. If a game wants to invoke AVX-512 and you have a 12th or 13th gen Intel CPU, you're SOL; it had better fall back to something else. You can't just say "sorry, this game needs X instructions"; that has not gone over well before. We all learned this from the "requires MMX" and "requires SSE" fiascos with certain games (and PS3 emulators are massively sped up by AVX-512). Certain instructions can massively speed things up, but then you limit the hardware the game can run on.

 

On the flip side of that, the drivers themselves shouldn't be optimizing for specific games, because the games may change and that optimization then becomes a blocker for fixing performance issues (e.g. fixing the game may break the driver optimization). So what the drivers really need is a flag, like "run in Windows 8 compatibility mode", to bypass driver-side tampering while developing the game, plus the ability for the end user to turn it off when running benchmarks.

 

It's commonly said that DX12 and Vulkan come closer to letting the game developer "write their own driver", and as such "AI optimization" may play a better role here in determining whether a developer is just not very good at using the DX12/Vulkan model.

So you're saying drivers never band-aid fix developer code?

Because it seems like they do that all the time.


31 minutes ago, pas008 said:

So you're saying drivers never band-aid fix developer code?

Because it seems like they do that all the time.

Nvidia isn't decompiling the game to figure out what to optimize; Nvidia drivers do not do that. Nvidia just goes "oh, brokengame-Release-Win64.exe is running, let's use this alternate shadow/fog generation system that performs better but looks uglier." That's about the extent of it.

 

Using AI here would be like going "well, 24% of games don't use X performance feature, so let's turn that feature on for 100% of games and turn it off where performance regresses." AI is auto-complete, not magic.

 

At no point do drivers see into the game binary. Even if a game uses Nvidia GameWorks libraries, the GPU drivers aren't tampering with them. There's the potential for Nvidia to replace or override the version of the GameWorks library used, but they're not changing the game code.
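That per-exe behavior can be pictured as a simple lookup table keyed on the executable name. This is a toy sketch with made-up names and fields, not NVIDIA's actual driver code:

```c
#include <assert.h>
#include <stddef.h>
#include <string.h>

/* Hypothetical per-game settings a driver profile might carry; the
 * executable names and fields here are invented for illustration. */
struct game_profile {
    const char *exe_name;      /* executable the profile applies to */
    int cheap_shadows;         /* swap in a faster shadow/fog path */
    int defer_texture_unload;  /* work around pop-in in one title */
};

static const struct game_profile profiles[] = {
    { "brokengame-Release-Win64.exe", 1, 0 },
    { "popinproblem.exe",             0, 1 },
};

/* Return the profile matching the running exe, or NULL for defaults. */
static const struct game_profile *lookup_profile(const char *exe)
{
    for (size_t i = 0; i < sizeof profiles / sizeof profiles[0]; i++)
        if (strcmp(profiles[i].exe_name, exe) == 0)
            return &profiles[i];
    return NULL; /* unknown game: generic code paths */
}
```

The key point is that everything interesting lives on the driver side of the table; the game binary itself is never touched.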


1 hour ago, Kisai said:

Nvidia isn't decompiling the game to figure out what to optimize; Nvidia drivers do not do that. Nvidia just goes "oh, brokengame-Release-Win64.exe is running, let's use this alternate shadow/fog generation system that performs better but looks uglier." That's about the extent of it.

 

Using AI here would be like going "well, 24% of games don't use X performance feature, so let's turn that feature on for 100% of games and turn it off where performance regresses." AI is auto-complete, not magic.

 

At no point do drivers see into the game binary. Even if a game uses Nvidia GameWorks libraries, the GPU drivers aren't tampering with them. There's the potential for Nvidia to replace or override the version of the GameWorks library used, but they're not changing the game code.

Then why are they game-specific? I'm not just talking about new "game ready" drivers; why do fixes, and performance increases, also come months later? As an ex-SLI user, I saw SLI profiles show up randomly for non-SLI games, along with SLI performance increases.

With what you're saying, that shouldn't be possible.

If they're just middleware, they couldn't fix things like flickering that happened on every card before, yet a new driver fixed it just for NVIDIA cards on driver version xxx.xx or newer?

 

Meaning that, as middleware, they can take part of the code and replace it with their fix?

 

I'm just lost on how they can keep getting performance increases, and why specific drivers are needed for these fixes.


42 minutes ago, pas008 said:

Then why are they game-specific? I'm not just talking about new "game ready" drivers; why do fixes, and performance increases, also come months later?

You fundamentally don't understand how software works. Games and drivers communicate with each other through APIs; they are not changing each other's instructions. Those performance changes happen on the driver side, usually by disabling something or reducing its precision. When a driver update fixes "flickering" or "pop-in", that means deferring texture unloads or tuning z-buffer aggressiveness.

 

42 minutes ago, pas008 said:

meaning they can take part of code and replace it with their fix being middleware?

They are not doing that. Because many games ship a few hundred DLL files, it's possible to force the use of a specific version of a DLL. The search order goes: statically linked game binary -> DLLs in the game's directory -> DLLs in the operating system.
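That search order can be modeled as a first-match scan over directories (a toy model; the actual Windows loader rules are more involved, and the paths and DLL lists below are illustrative):

```c
#include <assert.h>
#include <stddef.h>
#include <string.h>

/* One search location and the DLLs present in it. */
struct dll_dir {
    const char *path;
    const char **dlls;
    size_t count;
};

/* Return the directory whose copy of `name` the loader would pick:
 * directories are scanned in order and the first hit wins, which is
 * why a DLL dropped next to the game binary overrides the OS copy. */
static const char *resolve_dll(const struct dll_dir *dirs, size_t ndirs,
                               const char *name)
{
    for (size_t i = 0; i < ndirs; i++)
        for (size_t j = 0; j < dirs[i].count; j++)
            if (strcmp(dirs[i].dlls[j], name) == 0)
                return dirs[i].path;
    return NULL; /* not found in any search location */
}
```

A DLL that exists only in the system directory resolves there; place an override copy in the game's directory and the scan stops at the game's directory instead.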

 

That is how mods for Unity games and other game engines tend to work: in order to tamper with the code, they set up code trampolines. You might have a function like

displaytext(string) { dothething(); }

but the trampoline renames displaytext to displaytext_12345 and installs its own displaytext(string) function, which calls displaytext_12345(string) after tampering with the arguments.
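A minimal runnable sketch of that trampoline pattern, using a function-pointer swap in C (the names mirror the hypothetical example above; real mod loaders patch machine code rather than swapping pointers):

```c
#include <assert.h>
#include <string.h>

/* The "original" engine function, as if it had been renamed by the mod. */
static const char *displaytext_12345(const char *s)
{
    return s; /* stands in for dothething() */
}

/* The rest of the program calls displaytext through this pointer. */
static const char *(*displaytext)(const char *) = displaytext_12345;

/* The mod's replacement: tamper with the argument, then forward to
 * the renamed original. */
static const char *displaytext_hooked(const char *s)
{
    (void)s;                              /* discard the original text */
    return displaytext_12345("[MODDED]"); /* call the renamed original */
}

/* Installing the hook is just swapping the pointer. */
static void install_hook(void)
{
    displaytext = displaytext_hooked;
}
```

Before install_hook() the call passes through untouched; afterwards every call is intercepted by the hook, which is exactly the behavior that would be catastrophic if a driver did it blindly.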

 

Drivers do not do that. EVER. That's a quick way to change a function in a game with catastrophic consequences, especially if it affects a timing loop; it could crash the game or the whole computer. The driver only changes what it is told to change through the API.

 

In the case of DX12 and Vulkan, a lot of power over the "driver" is ceded to the developer.

 

What isn't covered here is when developers actually contact NVIDIA or AMD to debug something. If you go look at the release notes for NVIDIA drivers, you'll notice that certain fixes have nothing to do with the actual game's or application's performance and mostly deal with unwanted behavior.

https://www.nvidia.com/en-us/drivers/results/198362/

Quote
  • Portal RTX hang during resolution/mode change and GFE recording [3894168]
  • [DirectX 12] Shadowplay recordings may appear over exposed when Use HDR is enabled from the Windows display settings. [200742937]
  • AVS4You monochrome video preview [3890225]
  • Players report black/grey screens in Outer Wilds with 522.25 driver [3841593]
  • Lumion Pro 12.3 - Heavy corruption observed on app window.[3784371]
  • Fixed brightness issue on some Notebooks [3765244]

These are things the driver has control over, and the game or application has limited or no control over.

 

Take the HDR settings, for example. At most, Windows has control over turning HDR on or off, but the processing of HDR happens on the GPU. If you do not have an HDR monitor plugged in, then that code path is not used, EVEN if the game or application is outputting HDR.

 


You're right, drivers never intercepted and injected their own code into shaders, CUDA, RT, and now DLSS.


On 1/13/2023 at 7:32 AM, pas008 said:

You're right, drivers never intercepted and injected their own code into shaders, CUDA, RT, and now DLSS.

Source? Because a while ago Nvidia specifically said they don't do this. The claims getting thrown around in this thread that they reduce quality by detecting which exe is running are complete bullshit as well.

That was something that was done like 20 years ago, but not anymore.
