NVIDIA 436.02 drivers draw inspiration from AMD, Intel

porina
45 minutes ago, TVwazhere said:

Generally a bad idea. Keeping your drivers "up to date" for folding purposes only is generally not wise, as often these optimizations only benefit newer games while sometimes having the opposite effect on compute/folding systems.

Fixed, as I've had performance regressions in some older titles; albeit a necessary evil, since the newer drivers fixed some crucial bugs for me in GC emulation.

My eyes see the past…

My camera lens sees the present…

3 minutes ago, ivan134 said:

Watch the video when you have time. No point in arguing this

Except that is EXACTLY what DLSS is supposed to be doing. It doesn't matter if some YouTuber compares the two and thinks one is better despite them being completely different things. It's their opinion and doesn't necessarily reflect the intent of the function.

Yes, it's useful to know if one thing might perceptually look better despite the fact it probably shouldn't, but it shouldn't be your only consideration.

Your mileage will vary from game to game, monitor to monitor, eyeballs to eyeballs.

Router: Intel N100 (pfSense) WiFi6: Zyxel NWA210AX (1.7Gbit peak at 160MHz)
WiFi5: Ubiquiti NanoHD OpenWRT (~500Mbit at 80MHz) Switches: Netgear MS510TXUP, MS510TXPP, GS110EMX
ISPs: Zen Full Fibre 900 (~930Mbit down, 115Mbit up) + Three 5G (~800Mbit down, 115Mbit up)
Upgrading Laptop/Desktop CNVIo WiFi 5 cards to PCIe WiFi6e/7

6 minutes ago, Alex Atkin UK said:

Except that is EXACTLY what DLSS is supposed to be doing. It doesn't matter if some YouTuber compares the two and thinks one is better despite them being completely different things. It's their opinion and doesn't necessarily reflect the intent of the function.

Yes, it's useful to know if one thing might perceptually look better despite the fact it probably shouldn't, but it shouldn't be your only consideration.

Your mileage will vary from game to game, monitor to monitor, eyeballs to eyeballs.

No, it's a fact. Lmao at "some youtuber". Fanboys are getting more and more insane

CPU i7 6700 Cooling Cryorig H7 Motherboard MSI H110i Pro AC RAM Kingston HyperX Fury 16GB DDR4 2133 GPU Pulse RX 5700 XT Case Fractal Design Define Mini C Storage Transcend SSD370S 256GB + WD Black 320GB + Sandisk Ultra II 480GB + WD Blue 1TB PSU EVGA GS 550 Display Nixeus Vue24B FreeSync 144 Hz Monitor (VESA mounted) Keyboard Aorus K3 Mechanical Keyboard Mouse Logitech G402 OS Windows 10 Home 64 bit

MFW someone calls Hardware Unboxed "some youtuber"

1 minute ago, ivan134 said:

No, it's a fact. Lmao at "some youtuber". Fanboys are getting more and more insane

Ikr? What's the valid source here? Last I checked, YouTubers can do solid testing too; them being "some youtuber" doesn't make everything they say bogus. Tech Jesus (GamersNexus' Steve) manages to make hella in-depth and very trustworthy content, and he's a YouTuber. I think we're a bit past the "youtube is kids and low tier content" phase, and have been for... quite a while now.

 

1 minute ago, ivan134 said:

MFW someone calls Hardware Unboxed "some youtuber"

OOF 

Intel HEDT and Server platform enthusiasts: Intel HEDT Xeon/i7 Megathread

Main PC
CPU: i9 7980XE @4.5GHz/1.22v/-2 AVX offset
Cooler: EKWB Supremacy Block - custom loop w/360mm + 280mm rads
Motherboard: EVGA X299 Dark
RAM: 4x8GB HyperX Predator DDR4 @3200MHz CL16
GPU: Nvidia FE 2060 Super/Corsair HydroX 2070 FE block
Storage: 1TB MP34 + 1TB 970 Evo + 500GB Atom30 + 250GB 960 Evo
Optical Drives: LG WH14NS40
PSU: EVGA 1600W T2
Case & Fans: Corsair 750D Airflow - 3x Noctua iPPC NF-F12 + 4x Noctua iPPC NF-A14 PWM
OS: Windows 11
Display: LG 27UK650-W (4K 60Hz IPS panel)
Mouse: EVGA X17
Keyboard: Corsair K55 RGB
Mobile/Work Devices: 2020 M1 MacBook Air (work computer) - iPhone 13 Pro Max - Apple Watch S3
Other Misc Devices: iPod Video (Gen 5.5E, 128GB SD card swap, running Rockbox), Nintendo Switch

2 minutes ago, Zando Bob said:

Tech Jesus (GamersNexus' Steve) manages to make hella in-depth and very trustworthy content, and he's a YouTuber.

He runs a website that just happens to have a YouTube channel, because sometimes listening to someone is better than reading a wall of text :3

6 minutes ago, ivan134 said:

MFW someone calls Hardware Unboxed "some youtuber"

Yes, because however good he is, his opinion is not the be-all and end-all. What looks better is often subjective.

Don't get me wrong, DLSS so far has been junk, but as people rightly pointed out, what AMD has implemented is merely a post-process filter. A good one perhaps, and it's great they make it public so any developer can use it, but it's not even in the same league as what nVidia were trying to do, even if they have so far failed.

There is no one-size-fits-all solution.

6 minutes ago, Alex Atkin UK said:

Yes, because however good he is, his opinion is not the be-all and end-all.

Don't get me wrong, DLSS so far has been junk, but as people rightly pointed out, what AMD has implemented is merely a post-process filter. A good one perhaps, and it's great they make it public so any developer can use it, but it's not even in the same league as what nVidia were trying to do, even if they have so far failed.

There is no one-size-fits-all solution.

I don't think you know what an opinion is

1 minute ago, ivan134 said:

I don't think you know what an opinion is

I think one of the comments on the video says it better:
 

Quote

Image Sharpening may look amazing in this video but in reality it's got a lot of issues. Mainly oversharpening which will cause a grainy effect which can cause flickering and other artifacts. In some situations it enhances. In some situations it makes things worse. Same for DLSS but that truly is Deep Learning and the Ground Truth training has a lot of potential to improve... I can't predict if DLSS will work long-term (though it probably will) though I can say Image Sharpening isn't going to improve EVERYTHING... there are great ways to use image sharpening in newer games that plan for it by default with AMD's Fidelity FX tools though. I believe they allow to process certain things at a lower quality to improve performance then sharpen after but that works because they already know what they are sharpening and can optimize for that. Whereas again it's a bit hit and miss depending on the game.

My point from the start was that comparing a fairly simple (compared to DLSS) sharpening filter to DLSS is flawed, at least if you are going to assume one is always better than the other.

What nVidia were trying to do was ambitious, and hopefully it will pay off eventually. Was it worth the cost to the consumer? Arguably no, but they had to take a gamble that it would make RTX more useful in the short term, and indeed they failed.

If you've been a PC gamer for a few decades this is nothing new. Pushing big leaps in graphical rendering is always a gamble. Sometimes the only way to see what works is to throw it all into the pot and see what developers can make of it, and you can't do that without releasing it to consumers, as developers won't bother to implement something nobody can use.

40 minutes ago, Mira Yurizaki said:

He runs a website that just happens to have a YouTube channel, because sometimes listening to someone is better than reading a wall of text :3

In that specific case, I'd take the wall of text :D 

Main system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, Corsair Vengeance Pro 3200 3x 16GB 2R, RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible

NVIDIA 436.02 Installer Buggy, Always Installs GeForce Experience, No GDPR Consent:

 

Quote

The installer of these drivers appears to have a major bug that forces the installation of GeForce Experience without obtaining GDPR-compliant consent from the user. With the ratification of GDPR, NVIDIA driver installers present a selection screen right at the start of the installation, which lets users opt to install GeForce Experience (and give their GDPR consent in doing so), but a second option lets users decline GDPR consent, forcing the installer to install GeForce drivers without GeForce Experience. A bug with the installer of GeForce 436.02 WHQL disregards the user's choice at this screen, and installs GeForce Experience without the GDPR-mandated user-consent.

 

Making matters far worse is the fact that you cannot deselect GeForce Experience from the list of components in the Custom Install screen. The Custom Install list lets you make the installer skip installation of optional components that are otherwise installed by default in Express Install (GeForce Experience features in this list only if a user gives GDPR consent in the previous screen). 

 

Source: https://www.techpowerup.com/258446/nvidia-436-02-installer-buggy-always-installs-geforce-experience-no-gdpr-consent

 

Pretty sure because of this they took the link down on the NVIDIA website, as I'm not able to download the new driver anymore; it takes me to a "404 - Not Found" page, while the links for the other previous drivers are working fine.

5 minutes ago, BiG StroOnZ said:

NVIDIA 436.02 Installer Buggy, Always Installs GeForce Experience, No GDPR Consent:

 

 

Source: https://www.techpowerup.com/258446/nvidia-436-02-installer-buggy-always-installs-geforce-experience-no-gdpr-consent

 

Pretty sure because of this they took the link down on the NVIDIA website, as I'm not able to download the new driver anymore; it takes me to a "404 - Not Found" page, while the links for the other previous drivers are working fine.

 

Yeah, that's rubbish. I don't want GF "experience" anywhere near my system. I'll happily wait for the correct version, and then see if the performance improvements are there for Volta also.

5950X | NH D15S | 64GB 3200MHz | RTX 3090 | ASUS PG348Q+MG278Q

 

1 hour ago, Alex Atkin UK said:

I think one of the comments on the video says it better:
 

My point from the start was that comparing a fairly simple (compared to DLSS) sharpening filter to DLSS is flawed, at least if you are going to assume one is always better than the other.

What nVidia were trying to do was ambitious, and hopefully it will pay off eventually. Was it worth the cost to the consumer? Arguably no, but they had to take a gamble that it would make RTX more useful in the short term, and indeed they failed.

If you've been a PC gamer for a few decades this is nothing new. Pushing big leaps in graphical rendering is always a gamble. Sometimes the only way to see what works is to throw it all into the pot and see what developers can make of it, and you can't do that without releasing it to consumers, as developers won't bother to implement something nobody can use.

How it's done is irrelevant. What matters is the outcome, and AMD's implementation is objectively superior.

 

Also hilarious that you call Hardware Unboxed a random YouTuber, but you want me to consider the OPINION of an actual random YouTube post.

I've always said DLSS is stupid, especially how they made it so you need some special AI nonsense and special profiles for each game, instead of just making reverse DSR, which is what this essentially is. Pointless waste of resources, especially since everyone is raving about 4K and "ultimate sharpness", and then people with stupidly high-end cards are gonna make the image more blurry. WHY?! It makes absolutely no sense. I like AMD's approach much more. It's a subtle sharpening effect, but it's there and it's essentially free. So, excellent! I just wish it were 100% post-processing instead of having games implement it on the engine level; free and open source or not, it's a stupid, tacky middle layer that's unnecessary and just limits the adoption rate.
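(For illustration: the guts of a post-process sharpen pass really are tiny. Below is a minimal unsharp-mask-style kernel over an RGB buffer in C++; a generic sharpening filter for the sake of argument, not AMD's actual CAS shader, which also adapts its strength to local contrast.)

```cpp
#include <algorithm>
#include <cstdint>
#include <vector>

// Minimal unsharp-mask-style sharpen over an 8-bit RGB image.
// Illustrative only: AMD's CAS additionally scales the sharpening
// amount by local contrast to avoid halos and grain.
void sharpenRGB(std::vector<uint8_t>& img, int width, int height,
                float amount = 0.5f) {
    std::vector<uint8_t> src = img; // read from a copy, write in place
    auto at = [&](int x, int y, int c) -> int {
        x = std::clamp(x, 0, width - 1);   // clamp at the borders
        y = std::clamp(y, 0, height - 1);
        return src[(y * width + x) * 3 + c];
    };
    for (int y = 0; y < height; ++y)
        for (int x = 0; x < width; ++x)
            for (int c = 0; c < 3; ++c) {
                // 4-neighbour average is the cheap low-pass ("unsharp") term.
                int blur = (at(x - 1, y, c) + at(x + 1, y, c) +
                            at(x, y - 1, c) + at(x, y + 1, c)) / 4;
                int centre = at(x, y, c);
                // Add the high-frequency detail back on top, scaled by amount.
                int sharp = centre + static_cast<int>(amount * (centre - blur));
                img[(y * width + x) * 3 + c] =
                    static_cast<uint8_t>(std::clamp(sharp, 0, 255));
            }
}
```

Push amount too high and you get exactly the grain and halo artifacts described in the comment quoted earlier in the thread; that's the over-sharpening complaint in a nutshell.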

9 minutes ago, RejZoR said:

I just wish it were 100% post-processing instead of having games implement it on the engine level; free and open source or not, it's a stupid, tacky middle layer that's unnecessary and just limits the adoption rate.

You can use AMD FidelityFX with ReShade; someone ported it.

| Intel i7-3770@4.2Ghz | Asus Z77-V | Zotac 980 Ti Amp! Omega | DDR3 1800mhz 4GB x4 | 300GB Intel DC S3500 SSD | 512GB Plextor M5 Pro | 2x 1TB WD Blue HDD |
 | Enermax NAXN82+ 650W 80Plus Bronze | Fiio E07K | Grado SR80i | Cooler Master XB HAF EVO | Logitech G27 | Logitech G600 | CM Storm Quickfire TK | DualShock 4 |

Just checked on my game system at home. Driver is available via GFE. Haven't checked manual download again.

6 minutes ago, xAcid9 said:

You can use AMD FidelityFX with ReShade; someone ported it.

Interesting, but too fiddly and prone to getting you banned, since it hooks into things and anti-cheats don't like that.

1 hour ago, ivan134 said:

How it's done is irrelevant. What matters is the outcome, and AMD's implementation is objectively superior.

 

Also hilarious that you call Hardware Unboxed a random YouTuber, but you want me to consider the OPINION of an actual random YouTube post.

If they think simple image sharpening is better, that is their opinion; you should take whatever a YouTuber says in a subjective comparison test as their opinion. Both implementations over-process the image and make the game look like crap IMO.

3 minutes ago, Blademaster91 said:

If they think simple image sharpening is better, that is their opinion; you should take whatever a YouTuber says in a subjective comparison test as their opinion. Both implementations over-process the image and make the game look like crap IMO.

Exactly my point, thank you.

The same way some people think RTX is not worth the performance cost, some of us are happy to have it. Especially when you've been around PC gaming through every major transition since 3D cards were first developed and have seen it all before: you have to start somewhere.

33 minutes ago, Alex Atkin UK said:

Exactly my point, thank you.

The same way some people think RTX is not worth the performance cost, some of us are happy to have it. Especially when you've been around PC gaming through every major transition since 3D cards were first developed and have seen it all before: you have to start somewhere.

Some people are just hell-bent on hating Nvidia. What's happened here is that Nvidia have simply added more options, yet as always people are still crying like they're killing babies.

Grammar and spelling are not indicative of intelligence/knowledge. Not having the same opinion does not always mean a lack of understanding.

So does this mean you can finally output 1080p to a 4K display at 1:4 and have it look essentially the same as it would on a native 1080p screen without having it blur horribly for no reason, or am I misunderstanding what this accomplishes?
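(For reference, the 1:4 mapping asked about here is plain nearest-neighbour replication: each 1080p pixel becomes an exact 2x2 block on the 4K panel, so nothing is interpolated and nothing should blur. A minimal software sketch of the idea follows; the driver feature itself does this in the display pipeline, not in application code.)

```cpp
#include <cstdint>
#include <vector>

// Nearest-neighbour 2x integer upscale: e.g. 1920x1080 -> 3840x2160.
// Every source pixel maps to an exact 2x2 block of identical pixels,
// so no interpolation happens and nothing gets blurred.
std::vector<uint32_t> integerUpscale2x(const std::vector<uint32_t>& src,
                                       int srcW, int srcH) {
    const int dstW = srcW * 2, dstH = srcH * 2;
    std::vector<uint32_t> dst(static_cast<size_t>(dstW) * dstH);
    for (int y = 0; y < dstH; ++y)
        for (int x = 0; x < dstW; ++x)
            dst[static_cast<size_t>(y) * dstW + x] = src[(y / 2) * srcW + x / 2];
    return dst;
}
```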

Solve your own audio issues  |  First Steps with RPi 3  |  Humidity & Condensation  |  Sleep & Hibernation  |  Overclocking RAM  |  Making Backups  |  Displays  |  4K / 8K / 16K / etc.  |  Do I need 80+ Platinum?

If you can read this you're using the wrong theme.  You can change it at the bottom.

Was just about to buy an AMD card. Is Nvidia's low latency mode the same as AMD's or worse? The entire reason I was going to switch to AMD was that I heard it had less latency because of this.

2 hours ago, Ryan_Vickers said:

So does this mean you can finally output 1080p to a 4K display at 1:4 and have it look essentially the same as it would on a native 1080p screen without having it blur horribly for no reason, or am I misunderstanding what this accomplishes?

I went to test it, but failed on account of trying it on a Pascal card. Currently a Turing-only feature. Doh!

 

1 hour ago, Paranoid Kami said:

Was just about to buy an AMD card. Is Nvidia's low latency mode the same as AMD's or worse? The entire reason I was going to switch to AMD was that I heard it had less latency because of this.

I shall have to wait for others to test it. It might be of some benefit to insane-fps gamers (way above monitor refresh), but if you are already running within G-Sync/FreeSync range I don't personally see value in this.

1 hour ago, Paranoid Kami said:

Was just about to buy an AMD card. Is Nvidia's low latency mode the same as AMD's or worse? The entire reason I was going to switch to AMD was that I heard it had less latency because of this.

My understanding of how AMD's anti-lag system works is that it dynamically learns how long it takes to render a frame. If it knows your next monitor refresh is in 16 ms and it takes only 6 ms to render the frame, then rather than starting immediately the way it normally would, it waits 10 ms, allowing new things to happen in the game (like the user making an input) before rendering and delivering the finished frame at the last second. When you finally see the frame it's only 6 ms old instead of 16, effectively shaving 10 ms off the total input lag.

 

I actually had the idea to do this myself once not too long ago but dismissed it as impractical and ineffective. I assume they did some testing and found otherwise, but to be honest, I don't actually know. I've yet to see any third-party reviews or tests of it, so I'm not sure we can say with any confidence whether AMD's system or Nvidia's is better, or whether either is actually any good at all. If anyone does have info, please let us know xD

 

One of the reasons I suspected it would not work well is that the time to render a frame is not constant; the demand of a game changes from minute to minute, second to second, and even frame to frame. If you intentionally wait, anticipating you'll have enough time, only to hit a sudden increase in load that leaves you without enough time when you otherwise would have had it, that's going to cause stutter and lower frame rates that otherwise would not have been an issue. Another reason I wonder about the usefulness is that if you can render well above your target display rate (say, 200+ FPS on a 60 Hz monitor), the GPU doesn't just stop and wait for the screen to refresh after rendering that first frame (that's how vsync works, more or less); it starts rendering a new frame, and if that finishes in time you get the newer one instead, and if not, at least it has the previous one to fall back on. This means you're already seeing "recent" information, and this system isn't likely to improve on that significantly.
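(A minimal sketch of the delay-based scheduling described above, with hypothetical names; this is a guess at the general technique, not AMD's or NVIDIA's actual driver code. It learns the render cost from recent frames, pads the estimate with a safety margin to absorb load spikes, and sleeps until the last viable moment before rendering.)

```cpp
#include <chrono>
#include <thread>

using Clock = std::chrono::steady_clock;
using FloatMs = std::chrono::duration<double, std::milli>;

// Hypothetical just-in-time frame scheduler, sketching the technique
// described above. Real drivers would do this at the submit/present level.
class FrameDelayScheduler {
public:
    explicit FrameDelayScheduler(double safetyMarginMs = 2.0)
        : predictedRenderMs_(16.0), safetyMarginMs_(safetyMarginMs) {}

    template <typename RenderFn>
    void renderJustInTime(Clock::time_point nextRefresh, RenderFn render) {
        // Budget = time until refresh, minus predicted render cost and margin.
        double untilRefreshMs = FloatMs(nextRefresh - Clock::now()).count();
        double waitMs = untilRefreshMs - predictedRenderMs_ - safetyMarginMs_;
        if (waitMs > 0.0)
            std::this_thread::sleep_for(FloatMs(waitMs)); // sample fresher input

        auto start = Clock::now();
        render(); // the actual frame work
        double actualMs = FloatMs(Clock::now() - start).count();

        // An exponential moving average smooths frame-to-frame variance, but a
        // spike bigger than the safety margin still misses the refresh:
        // exactly the stutter risk described above.
        predictedRenderMs_ = 0.9 * predictedRenderMs_ + 0.1 * actualMs;
    }

private:
    double predictedRenderMs_; // learned estimate of render cost
    double safetyMarginMs_;    // padding against sudden load spikes
};
```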
