NVIDIA reportedly working on 'AI-Optimized Drivers' with performance improved by up to 30%

12 minutes ago, Stahlmann said:

Point is, these marketing statements are cherry picked and in most cases have little to nothing to do with a real world use case. So until you get 3rd party confirmation from reputable sources, expect lies. They usually try to excite customers because when they're excited, logical thinking is severely limited and they're more likely to buy stuff they don't need.

Exactly, it's been that way for as long as I can remember, with every company.

5 minutes ago, pas008 said:

Exactly, it's been that way for as long as I can remember, with every company.

Distrust was always better than trust, but it hasn't been this bad, with 3x apples-vs-oranges claims. Or at least I can't remember when I last saw marketing material this stupid. And normally we shouldn't expect lies, because that's actually illegal. Yet somehow they find loopholes to serve lies as marketing.

If someone did not use reason to reach their conclusion in the first place, you cannot use reason to convince them otherwise.

12 minutes ago, Stahlmann said:

Distrust was always better than trust, but it hasn't been this bad, with 3x apples-vs-oranges claims. Or at least I can't remember when I last saw marketing material this stupid. And normally we shouldn't expect lies, because that's actually illegal. Yet somehow they find loopholes to serve lies as marketing.

As I said, we don't know anything about Cyberpunk's RT Overdrive yet.

On NVIDIA's website they show the 4090 at 4x over the 3090 Ti for RT Overdrive.

 

https://www.nvidia.com/en-us/geforce/graphics-cards/40-series/rtx-4090/

1 minute ago, pas008 said:

As I said, we don't know anything about Cyberpunk's RT Overdrive yet.

On NVIDIA's website they show the 4090 at 4x over the 3090 Ti for RT Overdrive.

 

https://www.nvidia.com/en-us/geforce/graphics-cards/40-series/rtx-4090/

That's why I said apples-vs-oranges claims. They're running at different settings, but most consumers just look at that chart and see a 3x improvement. It's simply misleading advertising.

If someone did not use reason to reach their conclusion in the first place, you cannot use reason to convince them otherwise.

12 minutes ago, Stahlmann said:

That's why I said apples-vs-oranges claims. They're running at different settings, but most consumers just look at that chart and see a 3x improvement. It's simply misleading advertising.

The 40 series does have improved (or new) Tensor-core optical flow hardware, along with AV1 encoding.

I don't know why this hardware gets dismissed, though.

That could be the reason why we'd only see improvements on certain cards:

they're finally utilizing everything on the card with these drivers.

16 minutes ago, Stahlmann said:

i can't remember when i last saw marketing material this stupid

John Romero's Daikatana?

 

NOTE: I no longer frequent this site. If you really need help, PM/DM me and my e-mail will alert me.

12 hours ago, StDragon said:

 AMD should have taken my advice when I gave it.

 

Joking aside, 30% is incredible, if it's to be believed. I wonder if compilers could be further optimized via AI too.

I don't doubt this is possible, if the AI sits at a front-end pass that manages to "understand" what the programmer wanted and transform it into an optimized form, which then gets compiled. E.g. a common bottleneck is processing stuff at a finer granularity instead of a coarser one. If it manages to see the big picture, that could result in great speedups.
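The fine-vs-coarse point can be sketched in a few lines of Python (all function names here are invented for illustration, not from any real compiler or driver): both versions compute the same result, but the second works over the whole buffer in one pass, which is the kind of rewrite a "big picture" optimizer could make.

```python
def brighten_pixel(pixel, amount):
    """Fine-grained: adjust a single pixel value, clamped to 8-bit range."""
    return min(255, pixel + amount)

def brighten_fine(pixels, amount):
    """One function call per pixel -- correct, but pays call overhead per element."""
    return [brighten_pixel(p, amount) for p in pixels]

def brighten_coarse(pixels, amount):
    """Single pass over the whole buffer -- same result, less overhead.
    (In real code this is where a vectorized or GPU kernel would go.)"""
    return [min(255, p + amount) for p in pixels]
```

An optimizer that only sees `brighten_pixel` in isolation can't make this transformation; one that sees the whole loop can.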

11 hours ago, BetteBalterZen said:

Maybe a newer version than the upscaler found on the Shield and Shield Pro TVs? 

 

You might be right here, as on NVIDIA's website it says the following:

 

Quote

SHIELD delivers AI powered upscaling, capable of intelligently scaling lower resolution video to your TV’s native resolution. With this new feature, HD movies on Netflix or Prime Video will look sharper on your 4K display.

 

https://www.nvidia.com/en-us/shield/support/shield-tv/ai-upscaling/

15 hours ago, leadeater said:

lololololololol

 

Somehow this seems impossible to even slightly believe.

 

Fixing a known bug in a game is one thing; having built-in game profiles that change graphical settings is another (Nvidia and AMD both have this). However, I would not call either of those what this story is trying to imply.

 

Fixing a bug certainly is a performance improvement; automatically changing graphical settings is not increasing "performance" through "driver optimizations". GPU performance has in fact stayed the same, you've just changed what the GPU is rendering, which changes the game's frame rate, also known as performance. I get a strong sense of people playing fast and loose with words and meanings here.

 

I saw someone on another website mention that it's possible the AI optimizations take better advantage of the integer and floating-point units on Ampere/Ada Lovelace. Given the ludicrous amount of power in these cards, it seems plausible.

 

However, someone mentioned in response to that person that these are pre-compiled optimizations, not real-time ones. Meaning, AI is being used to figure out how to optimize the drivers for a specific game, and those optimizations are then hard-coded into the drivers.
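As a purely hypothetical sketch of what "pre-compiled and hard-coded into the drivers" could mean: an offline process (AI-assisted or not) produces tuned settings per game, which ship inside the driver as a static lookup table keyed by the game's executable. None of these names, fields, or files reflect NVIDIA's actual driver internals.

```python
# Hypothetical table of per-game optimizations baked into a driver build.
PRECOMPILED_OPTIMIZATIONS = {
    "cyberpunk2077.exe": {"shader_cache": "cp2077_tuned.bin",
                          "reorder_draw_calls": True},
    "valhalla.exe":      {"shader_cache": "acv_tuned.bin",
                          "reorder_draw_calls": False},
}

def load_game_profile(executable_name):
    """At game launch, look up any baked-in optimizations for this title;
    unknown games fall back to generic defaults."""
    default = {"shader_cache": None, "reorder_draw_calls": False}
    return PRECOMPILED_OPTIMIZATIONS.get(executable_name, default)
```

The point of the sketch is that nothing is computed at runtime: the expensive "AI" work happened once, offline, and the driver only does a cheap lookup.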

9 hours ago, Stahlmann said:

Distrust was always better than trust, but it hasn't been this bad, with 3x apples-vs-oranges claims. Or at least I can't remember when I last saw marketing material this stupid. And normally we shouldn't expect lies, because that's actually illegal. Yet somehow they find loopholes to serve lies as marketing.

It's doubly stupid, since they claimed the RTX 4080 is 3x the 3080 Ti,

but now the RTX 4070 Ti is 3x the 3090 Ti,

so according to the PR it only makes sense that the 4070 Ti is faster than the 4080.

2 hours ago, BiG StroOnZ said:

However, someone mentioned in response to that person that these are pre-compiled optimizations, not real-time ones. Meaning, AI is being used to figure out how to optimize the drivers for a specific game, and those optimizations are then hard-coded into the drivers.

Yeah, that's the only plausible method I could think of. Nvidia has one of the largest Top500 clusters in the world, so it's definitely something they could throw resources at. I just think that to get such large general gains, optimizing the game itself is also required.

16 minutes ago, leadeater said:

I just think that to get such large general gains, optimizing the game itself is also required.

... that's what GPU drivers do. "Game Ready" drivers are just that: optimising the game to work better on the hardware by fixing bugs and allowing the game more direct access to the GPU. (In the past this also included stuff like SLI profiles.) Try running a game on a driver from before its release and you'll find it's generally much buggier, and often slower, than the ones that follow. Hence the first steps if you're experiencing problems in-game are "have you updated your game and drivers?"

 

Here's a video from Nvidia that gives some insight as to the kinds of things they do:

 

Sounds to me like they're simply trying to optimise this process with AI.

CPU: i7 4790k, RAM: 16GB DDR3, GPU: GTX 1060 6GB

42 minutes ago, tim0901 said:

... that's what GPU drivers do. "Game Ready" drivers are just that: optimising the game to work better on the hardware by fixing bugs and allowing the game more direct access to the GPU. (In the past this also included stuff like SLI profiles.) Try running a game on a driver from before its release and you'll find it's generally much buggier, and often slower, than the ones that follow. Hence the first steps if you're experiencing problems in-game are "have you updated your game and drivers?"

It's very rare to get anything above a 10% performance increase without there being a rather specific bug. Game Ready drivers do not allow more direct access to the GPU; drivers are drivers. Unless the driver or DirectX implements a new standard/feature, access to the hardware does not change. Making sure calls between software and the driver, and between the driver and the hardware, are happening optimally is where performance gains can come into play; there first has to be something to optimize and improve.

 

I've run plenty of games on massively outdated drivers without any issue; it's a mixed bag as to whether Game Ready really matters. It matters most when technologies like RTX get introduced.

39 minutes ago, leadeater said:

I've run plenty of games on massively outdated drivers without any issue; it's a mixed bag as to whether Game Ready really matters. It matters most when technologies like RTX get introduced.

I've played new games forgetting to update my drivers. Didn't have issues while playing, but would go back and update them anyway. I'm not sure if they specifically help, especially on older hardware (I still have a GTX 1070). I might get some of the optimizations, but I imagine they're mostly for newer cards, though they might naturally trickle down to older GPUs sometimes. Might be worth doing tests to see if they really help.

"It pays to keep an open mind, but not so open your brain falls out." - Carl Sagan.

"I can explain it to you, but I can't understand it for you" - Edward I. Koch

22 minutes ago, Godlygamer23 said:

I've played new games forgetting to update my drivers. Didn't have issues while playing, but would go back and update them anyway. I'm not sure if they specifically help, especially on older hardware (I still have a GTX 1070). I might get some of the optimizations, but I imagine they're mostly for newer cards, though they might naturally trickle down to older GPUs sometimes. Might be worth doing tests to see if they really help.

More so if there's new tech or features, physics updates, etc., but I do wonder too which updates actually help that much. Although you do sometimes get games that either run fine, then get broken and need another update to fix, or start broken and get better.

8 hours ago, BiG StroOnZ said:

 

You might be right here, as on NVIDIA's website it says the following:

 

 

https://www.nvidia.com/en-us/shield/support/shield-tv/ai-upscaling/

Exactly. Some of the same code is probably used 😄

PC Setup: 

HYTE Y60 White/Black + Custom ColdZero ventilation sidepanel

Intel Core i7-10700K + Corsair Hydro Series H100x

G.SKILL TridentZ RGB 32GB (F4-3600C16Q-32GTZR)

ASUS ROG STRIX RTX 3080Ti OC LC

ASUS ROG STRIX Z490-G GAMING (Wi-Fi)

Samsung EVO Plus 1TB

Samsung EVO Plus 1TB

Crucial MX500 2TB

Crucial MX300 1TB

Corsair HX1200i

 

Peripherals: 

Samsung Odyssey Neo G9 G95NC 57"

ASUS ROG Harpe Ace Aim Lab Edition Wireless

ASUS ROG Claymore II Wireless

ASUS ROG Sheath BLK LTD'

Corsair SP2500

Beyerdynamic DT 770 PRO X (Limited Editon) & Beyerdynamic TYGR 300R + FiiO K7 DAC/AMP

RØDE VideoMic II + Elgato WAVE Mic Arm

 

Racing SIM Setup: 

Sim-Lab GT1 EVO Sim Racing Cockpit + Sim-Lab GT1 EVO Single Screen holder

Svive Racing D1 Seat

Samsung Odyssey G9 49"

Simagic Alpha Mini

Simagic GT4 (Dual Clutch)

CSL Elite Pedals V2

Logitech K400 Plus

6 hours ago, leadeater said:

I've run plenty of games on massively outdated drivers without any issue; it's a mixed bag as to whether Game Ready really matters.

It could be an interesting test to pick an older GPU like a 1080 Ti, which is getting close to its 5th birthday, so that older drivers are available, and test it in recent games, comparing them against the current driver.

 

As Intel is learning now that they're making proper gaming GPUs, drivers do make a big difference to performance. Whatever gaming optimisations AMD/Nvidia have learnt over the years, Intel has to go through too.

Gaming system: R7 7800X3D, Asus ROG Strix B650E-F Gaming Wifi, Thermalright Phantom Spirit 120 SE ARGB, Corsair Vengeance 2x 32GB 6000C30, RTX 4070, MSI MPG A850G, Fractal Design North, Samsung 990 Pro 2TB, Alienware AW3225QF (32" 240 Hz OLED)
Productivity system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, 64GB ram (mixed), RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, iiyama ProLite XU2793QSU-B6 (27" 1440p 100 Hz)
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible

On 1/9/2023 at 6:57 PM, BiG StroOnZ said:

who knows what is being left on the table, performance wise, from human done coding.

The announcement is really vague, and I doubt the optimization was achieved with AI-generated code. It's more likely an "AI" system was fed a bunch of test scenarios to tweak some priority settings. As far as I know, the state of AI for code generation is still pretty abysmal beyond short boilerplate snippets.

Don't ask to ask, just ask... please 🤨

sudo chmod -R 000 /*

I don't think the performance claims are that impossible.

The "up to 30% in certain games" claim is very believable. We see that all the time with "game ready" drivers.

 

The "10% average" claim is far more unlikely to be true, but I think we have seen that before as well. Didn't AMD discover that their drivers did something in a very suboptimal way about 10 years ago, and fixing it gave them a quite significant boost in performance across the board? I can't remember if it had to do with microstuttering or the 1% lows. I think it was around the time when PCPer started measuring frame-time consistency and benchmarking sites in general moved away from just measuring average FPS.
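The difference between average FPS and 1% lows is easy to illustrate from raw frame times; this is a generic sketch of the idea, not any review site's exact methodology:

```python
def fps_stats(frame_times_ms):
    """Return (average FPS, 1%-low FPS) from per-frame render times in ms."""
    avg_fps = 1000.0 / (sum(frame_times_ms) / len(frame_times_ms))
    worst = sorted(frame_times_ms, reverse=True)  # slowest frames first
    n = max(1, len(worst) // 100)                 # the worst 1% of frames
    one_pct_low = 1000.0 / (sum(worst[:n]) / n)
    return avg_fps, one_pct_low
```

A run of 99 frames at 10 ms plus one 100 ms hitch still averages over 90 FPS, but its 1% low is 10 FPS, which is exactly the stutter that average-FPS-only benchmarks used to hide.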

 

But like I said earlier, I strongly believe that this is just some leaker trying to drum up engagement by slapping the word "AI" on a rumor that might not even be true to begin with.

NVIDIA is a juggernaut in the field of AI both in hardware and software.

You can buy a NVIDIA Jetson and do all sorts of cool AI stuff with it.

A PC Enthusiast since 2011
AMD Ryzen 7 5700X@4.65GHz | GIGABYTE GTX 1660 GAMING OC @ Core 2085MHz Memory 5000MHz
Cinebench R23: 15669cb | Unigine Superposition 1080p Extreme: 3566
@ZeroStrat Last login in 2021. Sadge. 

| Intel i7-3770@4.2Ghz | Asus Z77-V | Zotac 980 Ti Amp! Omega | DDR3 1800mhz 4GB x4 | 300GB Intel DC S3500 SSD | 512GB Plextor M5 Pro | 2x 1TB WD Blue HDD |
 | Enermax NAXN82+ 650W 80Plus Bronze | Fiio E07K | Grado SR80i | Cooler Master XB HAF EVO | Logitech G27 | Logitech G600 | CM Storm Quickfire TK | DualShock 4 |

3 hours ago, LAwLz said:

I don't think the performance claims are that impossible.

The "up to 30% in certain games" claim is very believable. We see that all the time with "game ready" drivers.

 

The "10% average" claim is far more unlikely to be true, but I think we have seen that before as well. Didn't AMD discover that their drivers did something in a very suboptimal way about 10 years ago, and fixing it gave them a quite significant boost in performance across the board? I can't remember if it had to do with microstuttering or the 1% lows. I think it was around the time when PCPer started measuring frame-time consistency and benchmarking sites in general moved away from just measuring average FPS.

 

But like I said earlier, I strongly believe that this is just some leaker trying to drum up engagement by slapping the word "AI" on a rumor that might not even be true to begin with.

I think AMD discovered that a couple of times.

The HD 5000 series and the 200 series for sure; "AMD FineWine", as it was called at the time.

13 hours ago, leadeater said:

Game Ready drivers do not allow more direct access to the GPU; drivers are drivers. Unless the driver or DirectX implements a new standard/feature, access to the hardware does not change. Making sure calls between software and the driver, and between the driver and the hardware, are happening optimally is where performance gains can come into play; there first has to be something to optimize and improve.

That's my understanding as well, that Game Ready is just a platform that provides a template of predefined optimal game settings based on the PC profile, such as what type of CPU and GPU you have. Maintaining such a database on the backend would be trivial.

But if AI is being used to optimize code without butchering the drivers for other games, it stands to reason they're using shims: basically, a modified subset of the driver that's wedged in between the driver and the game code being executed. So they would either maintain a monolithic shim, or one per game downloaded from Nvidia (because it would really bloat the unified driver installer if preloaded there).
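The shim idea can be sketched like this: a thin layer sits between the game's calls and the real driver entry point, substituting a tuned path for specific calls and passing everything else through untouched. This is purely illustrative (real shims operate at the DLL/ICD level, and all names here are invented):

```python
def real_driver_draw(call):
    """Stand-in for the generic driver code path."""
    return f"drew {call} (generic path)"

def tuned_draw(call):
    """Stand-in for a game-specific optimized path."""
    return f"drew {call} (game-specific fast path)"

def make_shim(optimized_calls):
    """Build a shim that intercepts only the listed call names and
    forwards everything else to the real driver unchanged."""
    def shim(call):
        if call in optimized_calls:
            return tuned_draw(call)
        return real_driver_draw(call)  # pass-through
    return shim
```

Because unlisted calls fall straight through, a per-game shim like this can't break other titles, which is the property the post is describing.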

1 hour ago, StDragon said:

But if AI is being used to optimize code without butchering the drivers for other games, it stands to reason they're using shims: basically, a modified subset of the driver that's wedged in between the driver and the game code being executed. So they would either maintain a monolithic shim, or one per game downloaded from Nvidia (because it would really bloat the unified driver installer if preloaded there).

Also, if the new cards are being used with AI, maybe some AI telemetry could be used:

finding and collecting what the AI thinks could be a "bottleneck"? But how would you run that on everyone's GPU, and without people knowing? That would suck, and might break some privacy rules. Still, collecting reports from everyone out there, I do wonder how much could be found.

1 hour ago, StDragon said:

That's my understanding as well, that Game Ready is just a platform that provides a template of predefined optimal game settings based on the PC profile, such as what type of CPU and GPU you have. Maintaining such a database on the backend would be trivial.

Providing templates with predefined game settings is what GeForce Experience does, but that's a separate thing from "game ready drivers".

The "game ready drivers" improve performance even when the settings remain the same. For example Nvidia stated that their 522.25 WHQL driver improved performance in Assassin's Creed Valhalla by up to 24% when running at 1080p and max graphics settings.

Here is an article where someone benchmarked a 2080 Ti and a 3080 using two different drivers (517.48 and 522.25) and did find some significant uplifts in some games, though most of the games performed more or less the same. They measured a 13.85% increase in average FPS in AC:V on both the older and the (at the time) new card.

 

 

I can't find much information about exactly what changes in a driver to allow this performance boost, but it's not just setting game graphics settings to predefined defaults.

The only solid info I could find was that the drivers include, among other things, "shader compilation optimizations". It seems like the graphics driver sometimes just overrides the calls made by the game. I did, however, find some evidence that Intel, in their open-source driver, sometimes "hijacks" API calls that are "not valid" and rewrites them to be valid.
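That "hijack and rewrite" behaviour can be sketched as a validation pass that patches obviously invalid parameters instead of letting the call fail. The call structure and parameter names below are invented for illustration and don't correspond to any real driver's API:

```python
def rewrite_call(call):
    """Patch an invalid API call into a valid one before execution,
    rather than rejecting it. Returns a corrected copy of the call."""
    fixed = dict(call)
    # e.g. a game requests 0 mip levels, which is invalid; clamp to 1
    if fixed.get("mip_levels", 1) < 1:
        fixed["mip_levels"] = 1
    # e.g. an unsupported filter mode falls back to a supported one
    if fixed.get("filter") not in ("nearest", "linear"):
        fixed["filter"] = "linear"
    return fixed
```

A fix like this can look like a "performance optimization" from the outside: a game that previously hit a slow error-recovery path now takes the fast, valid path instead.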
