
NVIDIA reportedly working on 'AI-Optimized Drivers' with improved performance by up to 30%

Summary

Today we have a new rumor that NVIDIA will use AI to optimize its driver performance, reaching levels that a human workforce cannot. According to CapFrameX, NVIDIA is allegedly preparing special drivers with optimizations done by AI algorithms. As the source claims, the average improvement will be a 10% performance increase, with up to 30% in best-case scenarios.

 

 

Quotes


It is claimed that further AI optimizations may be added to the graphics driver focusing on “instructions, throughput, hardware utilization, threading, settings”.

 

Unfortunately, the report doesn't state which GPUs will benefit from these improvements. Hopefully, it will be for all GPU generations currently supported by Nvidia. Still, we wouldn't be surprised if the green team started with a soft launch for their latest RTX cards, with older generations receiving the updates later, but that's pure speculation right now.

 

CapFrameX says that this AI graphics driver tuning might happen this year, maybe even in Q1, which could mean we see something in the next couple of months.

 

My thoughts

This is cool, as it looks like AMD's Fine Wine will have a competitor in the driver-maturation arena. I think up to 30% is quite a bold claim, but who knows how much performance is being left on the table by human-written code. The goal is always to bring performance closer to the metal, and it appears that's what NVIDIA is trying to do here with these AI optimizations. As always, this is a rumor with vague claims, so take it with a grain of salt. Either way, if true, this could be huge for increasing performance on NVIDIA cards, as even a 10% improvement through drivers alone is welcome.

 

Sources

https://www.kitguru.net/components/graphic-cards/joao-silva/nvidia-reportedly-developing-ai-optimised-graphics-drivers-up-to-30-performance-gains/

https://www.techpowerup.com/303308/nvidia-could-release-ai-optimized-drivers-improving-overall-performance

https://videocardz.com/newz/nvidia-reportedly-working-on-ai-optimized-drivers-with-improved-performance

https://www.techradar.com/news/watch-out-amd-nvidia-could-boost-gpu-performance-by-up-to-30-with-ai


So how do they know there's another 10% to 30% to be had just from driver improvements if they need the AI to find it? I call BS until I see it. They may very well use AI as a tool for devs, which is fine, but I doubt we'll see any substantial performance improvements from drivers alone.

NVIDIA has had a loooooong streak of bad drivers over the past few months, so I hope they can actually fix their existing issues first.


I vaguely recall the CapFrameX guy has said some really stupid things in the past, so ensure you have unsafe sodium levels on standby.

 

Automated optimisation of software has been a thing for a while. I know of an example I wanted to give, but I'm struggling to find a specific reference for it. Suffice to say, it was an automated process that tried "things", looking for the combination of execution that gave the best performance. It wouldn't surprise me if "AI" could be used to optimise that process, maybe not necessarily to give better results, but to get those results sooner than otherwise.
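
To make that concrete, here's a minimal sketch of that kind of automated "trying things": exhaustively benchmark combinations of tuning parameters and keep the fastest. Everything in it (the workload stand-in, the parameter grid) is invented for illustration, not anything NVIDIA or the tool I'm thinking of actually does.

```python
# Minimal autotuning sketch: time every combination of parameters and
# remember the fastest one. All names here are hypothetical.
import itertools
import time

def workload(block_size: int, threads: int, unroll: int) -> int:
    """Stand-in for the routine being tuned; a real tuner would time an actual kernel."""
    return sum(i * unroll for i in range(block_size * threads * 1000))

param_grid = {
    "block_size": [64, 128, 256],
    "threads": [2, 4, 8],
    "unroll": [1, 2, 4],
}

best_time, best_params = float("inf"), None
for combo in itertools.product(*param_grid.values()):
    params = dict(zip(param_grid.keys(), combo))
    start = time.perf_counter()
    workload(**params)
    elapsed = time.perf_counter() - start
    if elapsed < best_time:
        best_time, best_params = elapsed, params

print(f"best parameters: {best_params} ({best_time:.4f}s)")
```

An "AI" could replace the exhaustive loop with something that predicts which combinations are worth trying, reaching a good result in far fewer benchmark runs.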


25 minutes ago, porina said:

It wouldn't surprise me if "AI" could be used to optimise that process, maybe not necessarily to give better results, but to get those results sooner than otherwise.

That would be my guess as well.

 

I suspect most game-specific optimizations that make it into GPU drivers come down to fiddling with various trade-off switches and dials. An AI should be able to test all possible combinations much faster, observing both performance and loss of visual quality. Train it to pick the best performance uplift with the least loss of visual quality.
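
As a rough sketch of that idea (the settings names and both measurement functions are made up; a real pipeline would benchmark an actual game and score frames with an image metric like SSIM):

```python
# Hypothetical sketch: search the "switches and dials", then keep the
# combination with the best frame rate that stays above a quality floor.
import itertools

def measure_fps(s: dict) -> float:
    """Stand-in: pretend more aggressive settings raise frame rate."""
    return 60.0 + 5.0 * s["shader_opt"] + 8.0 * s["texture_lod_bias"]

def measure_quality(s: dict) -> float:
    """Stand-in: pretend more aggressive settings lower a 0..1 quality score."""
    return 1.0 - 0.02 * s["shader_opt"] - 0.06 * s["texture_lod_bias"]

search_space = {"shader_opt": [0, 1, 2], "texture_lod_bias": [0, 1]}
QUALITY_FLOOR = 0.93  # reject combinations with visible degradation

candidates = []
for combo in itertools.product(*search_space.values()):
    settings = dict(zip(search_space, combo))
    fps, quality = measure_fps(settings), measure_quality(settings)
    if quality >= QUALITY_FLOOR:  # constraint: least loss of visual quality
        candidates.append((fps, quality, settings))

fps, quality, best = max(candidates)  # best uplift among acceptable combos
print(f"best settings: {best} -> {fps:.1f} fps at quality {quality:.2f}")
```

The training part would come in replacing the exhaustive loop with a model that learns which dials matter, so far fewer combinations need to be benchmarked.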


Did someone just put NVIDIA's driver into ChatGPT and ask it to optimize the code?

 

This also feels like an 'only for 40 series' move, begging people to buy 40 series cards. If the old cards perform 10-30% better, why upgrade to the overpriced new gen?


8 minutes ago, GhostRoadieBL said:

This also feels like an 'only for 40 series' move, begging people to buy 40 series cards. If the old cards perform 10-30% better, why upgrade to the overpriced new gen?

They already released a driver at the end of last year with general performance uplifts, specifically for RTX GPUs running DX12 titles: https://www.nvidia.com/en-us/geforce/news/geforce-rtx-4090-game-ready-driver/

 

More recently they announced that RTX Video, coming next month, would be 30/40 series only. It remains to be seen why the 20 series is left out, but it could be a hardware difference, like the Optical Flow Accelerator needed for DLSS 3.


I don't think this sounds unreasonable.

 

Chances are they are already using "AI" tools for certain development tasks. As @Eigenvektor said, this might just be a faster way of testing various settings to find the optimal ones. I'm not entirely sure what Nvidia does to optimize their drivers, but the drivers include a lot of game-specific settings that tweak things depending on the game you play.
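
To illustrate what that game-specific tweaking could look like (a toy model only, not NVIDIA's actual mechanism; the executable names and settings keys are invented):

```python
# Toy model of per-game driver profiles: overrides keyed on the game's
# executable are merged over the driver defaults when that game launches.
DEFAULT_PROFILE = {"shader_cache": True, "threaded_opt": "auto", "rebar": False}

GAME_PROFILES = {
    "ACValhalla.exe": {"rebar": True, "threaded_opt": "on"},
    "SomeOtherGame.exe": {"threaded_opt": "off"},
}

def profile_for(exe_name: str) -> dict:
    """Merge a game's overrides over the defaults; unknown games get defaults."""
    return {**DEFAULT_PROFILE, **GAME_PROFILES.get(exe_name, {})}

print(profile_for("ACValhalla.exe"))
# {'shader_cache': True, 'threaded_opt': 'on', 'rebar': True}
```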

We have already seen countless "30% performance increase" drivers released over the decades. The November driver increased performance in Assassin's Creed Valhalla at 1080p by 24%, for example (in Nvidia's own tests).

I find the "10% average performance increase" a bit sketchy. That sounds too good to be true.

 

My guess is that this won't be anything revolutionary, and someone in the rumor mill decided to slap the word "AI" on it to get more engagement.


3 hours ago, porina said:

More recently they announced that RTX Video, coming next month, would be 30/40 series only

What's that? I tried googling, but no chance with a name like that... improved recording, etc.?

 

 


It depends on if NVIDIA is true to their marketing, or if it's not as great as shown.

Is it more of a denoise, or something like DLSS with reconstruction plus upscaling, or just upscaling? And is there depth data in the video content, or do they create their own depth data (like filters on your phone, or turning images into 3D objects)?

Some of that sounds like it would be more prone to faults and visual issues, though.

 

2kliksphilip


My drivers are augmented

 


 AMD should have taken my advice when I gave it.

 

Joking aside, 30% is incredible, if it's to be believed. I wonder if compilers could be further optimized via AI too.
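
On the compiler point: search-based tuning already exists without "AI" (projects like OpenTuner search flag combinations for a given program). A minimal sketch of the idea, assuming a hypothetical bench.c and gcc on the PATH:

```python
# Hedged sketch of compiler-flag autotuning: build the same source with
# different flag sets and keep whichever binary runs fastest.
import subprocess
import time

FLAG_SETS = [
    ["-O2"],
    ["-O3"],
    ["-O3", "-funroll-loops"],
    ["-O3", "-march=native"],
]

best = None
for flags in FLAG_SETS:
    subprocess.run(["gcc", *flags, "bench.c", "-o", "bench"], check=True)
    start = time.perf_counter()
    subprocess.run(["./bench"], check=True)  # time one run of the tuned binary
    elapsed = time.perf_counter() - start
    if best is None or elapsed < best[0]:
        best = (elapsed, flags)

print(f"fastest flags: {best[1]} ({best[0]:.3f}s)")
```

An AI-assisted version would prune this search rather than enumerate it; the flag space for a real compiler is far too large to brute-force.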


I just worry about AI-synthesized instability.

"If a Lobster is a fish because it moves by jumping, then a kangaroo is a bird" - Admiral Paulo de Castro Moreira da Silva

"There is nothing more difficult than fixing something that isn't all the way broken yet." - Author Unknown

Spoiler

Intel Core i7-3960X @ 4.6 GHz - Asus P9X79WS/IPMI - 12GB DDR3-1600 quad-channel - EVGA GTX 1080ti SC - Fractal Design Define R5 - 500GB Crucial MX200 - NH-D15 - Logitech G710+ - Mionix Naos 7000 - Sennheiser PC350 w/Topping VX-1

Link to comment
Share on other sites

Link to post
Share on other sites

7 hours ago, GhostRoadieBL said:

Did someone just put NVIDIA's driver into ChatGPT and ask it to optimize the code?

 

This also feels like an 'only for 40 series' move, begging people to buy 40 series cards. If the old cards perform 10-30% better, why upgrade to the overpriced new gen?

nGreedia will probably charge for an upgrade to a virtual 40 series from a 30 series card. A "virtual trade-up" plan.

I have a bad habit of giving them ideas. Sorry.


13 hours ago, WereCat said:

So how do they know there's another 10% to 30% to be had just from driver improvements if they need the AI to find it? I call BS until I see it. They may very well use AI as a tool for devs, which is fine, but I doubt we'll see any substantial performance improvements from drivers alone.

NVIDIA has had a loooooong streak of bad drivers over the past few months, so I hope they can actually fix their existing issues first.

 

If they're using "AI" actively, that means it's coming out of the CUDA budget, so don't expect to see an improvement on a low-end card. Even if it's true, I suspect we're not talking about frame rates but texture or model optimization, in a way that acts like a smarter z-buffer.

 

That said, every NVIDIA driver since they added DirectStorage support, from October through this January, has been producing a lot of SYSTEM_PTE_MISUSE errors. Another issue that has cropped up: several programs that invoke the camera lock up with the current drivers.

 


14 hours ago, BiG StroOnZ said:

As the source claims, the average improvement will be a 10% performance increase, with up to 30% in best-case scenarios.

lololololololol

 

Somehow this seems impossible to even slightly believe.

 

Fixing a known bug in a game is one thing; having built-in game profiles that change a game's graphical settings is another (both Nvidia and AMD have this). However, I would not call either of those what this story is trying to imply.

 

Fixing a bug certainly is a performance improvement. Automatically changing graphical settings, however, is not increasing "performance" through "driver optimizations": GPU performance has in fact stayed the same; you have just changed what the GPU is rendering, which changes the game's frame rate, also known as "performance". I get a strong sense of people playing fast and loose with words and meanings here.


21 minutes ago, leadeater said:

Somehow this seems impossible to even slightly believe.

I'm split. The source of this rumour is not one I'd consider credible based on their past actions, and they may have an ulterior motive of increasing awareness of their performance measurement offering.

 

That aside, we have seen driver updates provide notable uplifts across a wider set of games. I linked an nvidia RTX/DX12 example earlier. Also recently Intel updated DX9 performance on their Arc GPUs. In concept I wouldn't rule out AI or non-AI methods to brute-force performance updates in general, if the code allows that potential. The specific example I wanted to give was that a Ryzen optimisation in y-cruncher was not only done by hand but also included a machine-based optimisation. I recall the author saying words to the effect of not knowing exactly what it did, but it improved performance and still worked, so he ran with it. I just wish I could find a reference to that again.

 

Note: the assumption I'm running with here is that this is done while building the driver package, and isn't something that runs locally.


13 minutes ago, porina said:

That aside, we have seen driver updates provide notable uplifts across a wider set of games. I linked an nvidia RTX/DX12 example earlier.

The problem with that is the large performance increases were the exception; they addressed game-specific bugs or issues and included game profile changes like enabling Re-BAR, so they don't fit the idea of just optimizing the driver itself leading to better performance at the same settings and configuration while doing the same thing. The majority of games gained less than 10%, likely mostly due to Re-BAR.

 

13 minutes ago, porina said:

Also recently Intel updated DX9 performance on their Arc GPUs

If this were something on the level of what Intel did there, the way it was being talked about would be very different. That was no mere driver optimization; what Intel did was fundamentally change how DX9 games are run entirely. I would still put that under the umbrella of driver optimizations, though I wouldn't portray it the way this story does.

 

I like to be mindful of how something was achieved and why, and to attribute results to those things clearly.

 

13 minutes ago, porina said:

The specific example I wanted to give was that a Ryzen optimisation in y-cruncher was not only done by hand but also included a machine-based optimisation. I recall the author saying words to the effect of not knowing exactly what it did, but it improved performance and still worked, so he ran with it. I just wish I could find a reference to that again.

Yeah, I remember that, and if this is something near identical to it then I would credit AI driver optimizations, but it has to be exactly that and not something else. And somehow I find it doubtful it would be widely applicable across many games. Of course, that doesn't rule out Nvidia spending up large with their massive HPC cluster and trying to optimize many specific games this way. They could have done that for sure; they have the computational power. The problem is that assumes there is this much to be gained without touching the game itself, and that second part is where it sticks as a big problem for me.


8 minutes ago, leadeater said:

They could have done that for sure; they have the computational power. The problem is that assumes there is this much to be gained without touching the game itself, and that second part is where it sticks as a big problem for me.

One further thought along this line: it'll probably be a one-time improvement, if it happens at all. Find a way to improve, and it becomes the new normal. We're really speculating here. Is it per-game, or a general driver optimisation? Maybe with further advances minor incremental updates are possible, but I'd guess the low-hanging fruit will be picked first, and after that improvements become relatively insignificant. We can only wait and see if anything happens at all.


8 hours ago, Quackers101 said:

if NVIDIA is true to their marketing

 

Marketing:

[image]

Reality:

[image]

 

Yeah... We're at a point where you shouldn't even take marketing with a grain of salt. You should be expecting that they straight up lie to you.


10 hours ago, BiG StroOnZ said:

 

I believe it's called RTX Video Super Resolution:

 

 

Damn that is so cool!!! 

Maybe it's a newer version of the upscaler found on the Shield and Shield Pro TVs?


4 hours ago, Stahlmann said:

 

[marketing vs. reality images]

 

Yeah... We're at a point where you shouldn't even take marketing with a grain of salt. You should be expecting that they straight up lie to you.

RT Overdrive hasn't been released yet, but I'm not sure why the 3090 Ti wouldn't benefit, though.


1 minute ago, pas008 said:

RT Overdrive hasn't been released yet, but I'm not sure why the 3090 Ti wouldn't benefit, though.

Point is, these marketing statements are cherry-picked and in most cases have little to nothing to do with real-world use cases. So until you get third-party confirmation from reputable sources, expect lies. They usually try to excite customers, because when they're excited, logical thinking is severely limited and they're more likely to buy stuff they don't need.

