porina

nvidia 436.02 drivers, draws inspiration from AMD, Intel


45 minutes ago, TVwazhere said:

Generally a bad idea. Keeping your drivers "up to date" for folding purposes only is generally not wise, as often these optimizations only benefit newer games while sometimes having the opposite effect on compute/folding systems.

Fixed, as I've had performance regressions in some older titles; albeit a necessary evil, as the newer drivers fixed some crucial bugs for me in GC emulation.


The pursuit of knowledge for the sake of knowledge.

Forever in search of my reason to exist.

3 minutes ago, ivan134 said:

Watch the video when you have time. No point in arguing this

Except that is EXACTLY what DLSS is supposed to be doing. It doesn't matter if some YouTuber compares the two and thinks one is better despite them being completely different things. It's their opinion and doesn't necessarily reflect the intent of the function.

Yes, it's useful to know if one thing might perceptually look better despite the fact it probably shouldn't, but it shouldn't be your only consideration.

Your mileage will vary from game to game, monitor to monitor, eyeballs to eyeballs.


Router: i5-7200U appliance running pfSense.
ISP: Zen Unlimited Fibre 2 (66Mbit) + Plusnet Unlimited Fibre Extra. (56Mbit)

6 minutes ago, Alex Atkin UK said:

Except that is EXACTLY what DLSS is supposed to be doing. It doesn't matter if some YouTuber compares the two and thinks one is better despite them being completely different things. It's their opinion and doesn't necessarily reflect the intent of the function.

Yes, it's useful to know if one thing might perceptually look better despite the fact it probably shouldn't, but it shouldn't be your only consideration.

Your mileage will vary from game to game, monitor to monitor, eyeballs to eyeballs.

No, it's a fact. Lmao at "some youtuber". Fanboys are getting more and more insane


CPU i7 6700 Cooling Cryorig H7 Motherboard MSI H110i Pro AC RAM Kingston HyperX Fury 16GB DDR4 2133 GPU Pulse RX 5700 XT Case Fractal Design Define Mini C Storage Transcend SSD370S 256GB + WD Black 320GB + Sandisk Ultra II 480GB + WD Blue 1TB PSU EVGA GS 550 Display Nixeus Vue24B FreeSync 144 Hz Monitor (VESA mounted) Keyboard Aorus K3 Mechanical Keyboard Mouse Logitech G402 OS Windows 10 Home 64 bit


MFW someone calls Hardware Unboxed "some youtuber"



1 minute ago, ivan134 said:

No, it's a fact. Lmao at "some youtuber". Fanboys are getting more and more insane

Ikr? What's the valid source here? Last I checked, YouTubers can do solid testing too; being "some youtuber" doesn't make everything they say bogus. Tech Jesus (GamersNexus' Steve) manages to make hella in-depth and very trustworthy content, and he's a YouTuber. I think we're a bit past the "YouTube is kids and low-tier content" phase, and have been for... quite a while now.

 

1 minute ago, ivan134 said:

MFW someone calls Hardware Unboxed "some youtuber"

OOF 


X58-X79-X99-X299 lads: Intel HEDT Xeon/i7 Megathread

 

Big Rig (Completed) - (Current) - i7 5960X - 4.7Ghz/3.7Ghz ~ 1.3v/1.1v core/uncore - 76-78C under RealBench load- Custom Loop: 2x 360GTS with EK-ZMT/Stubbies and EK D5 pump/res combo - EVGA X99 Classified - 32GB (4x8GB) HyperX Predator DDR4 - 3200MHz CL16 - AMD Radeon VII (best TimeSpy so far: here) - 1TB 970 Evo - Corsair RM1000i - Phanteks Enthoo Evolv ATX TG - 6x iPPC NF-F12 2000 - 45" 4K LG TV

 

Planned Desk Rig - i7 5820K - Noctua NH-L12S - EVGA X99 Micro 2 - 16GB (4x4GB) EVGA SSC DDR4 - EVGA XC Ultra 1660 Ti - 250GB 960 Evo - Seagate Firecuda 2TB - Seagate BarraCuda Pro 1TB - Corsair CX550 - Fractal Design Meshify C Mini - LG 25UM56-P - 25" 2560x1080 at 75Hz

X79 (waiting on mobo/CPU/RAM) - i7 4930K - EVGA CLC 280 - EVGA X79 Dark - 16GB (4x4GB) Corsair Vengeance DDR3 - 2x EVGA Classified 780s - MX500 1TB - EVGA 1600W T2 - Corsair Air 540 - 3x NF-P12 Redux

 

Planned X58 rig - Xeon X5670 - NH-D15S - EVGA X58 Classified SLI 4-Way - 24GB (3x8GB) HyperX Savage Red DDR3 - Undecided GPUS - probably a basic SSD - EVGA 1000W G3 - Undecided Case

 

I lowkey enjoy HEDT

 

2 minutes ago, Zando Bob said:

Tech Jesus (GamersNexus' Steve) manages to make hella in-depth and very trustworthy content, and he's a YouTuber.

He runs a website that just happens to have a YouTube channel, because sometimes listening to someone is better than reading a wall of text :3

6 minutes ago, ivan134 said:

MFW someone calls Hardware Unboxed "some youtuber"

Yes, because however good he is, his opinion is not the be-all and end-all. What looks better is often subjective.

Don't get me wrong, DLSS so far has been junk, but as people rightly pointed out, what AMD has implemented is merely a post-process filter. A good one perhaps, and it's great they make it public so any developer can use it, but it's not even in the same league as what nVidia were trying to do, even if they have so far failed.

There is no one-size-fits-all solution.



6 minutes ago, Alex Atkin UK said:

Yes, because however good he is, his opinion is not the be-all and end-all.

Don't get me wrong, DLSS so far has been junk, but as people rightly pointed out, what AMD has implemented is merely a post-process filter. A good one perhaps, and it's great they make it public so any developer can use it, but it's not even in the same league as what nVidia were trying to do, even if they have so far failed.

There is no one-size-fits-all solution.

I don't think you know what an opinion is



1 minute ago, ivan134 said:

I don't think you know what an opinion is

I think one of the comments on the video says it better:
 

Quote

Image Sharpening may look amazing in this video but in reality it's got a lot of issues, mainly oversharpening, which causes a grainy effect that can lead to flickering and other artifacts. In some situations it enhances; in some situations it makes things worse. Same for DLSS, but that truly is Deep Learning, and the Ground Truth training has a lot of potential to improve... I can't predict if DLSS will work long-term (though it probably will), though I can say Image Sharpening isn't going to improve EVERYTHING... There are great ways to use image sharpening in newer games that plan for it by default with AMD's Fidelity FX tools, though. I believe they allow you to process certain things at a lower quality to improve performance, then sharpen after, but that works because they already know what they are sharpening and can optimize for that. Whereas again it's a bit hit and miss depending on the game.

My point from the start was that comparing a fairly simple (compared to DLSS) sharpening filter to DLSS is flawed, at least if you are going to assume one is always better than the other.

What nVidia were trying to do was ambitious and hopefully it will pay off eventually. Was it worth the cost to the consumer? Arguably no, but they had to take a gamble that it would make RTX more useful in the short term, and indeed they failed.

If you've been a PC gamer for a few decades this is nothing new. Pushing big leaps in graphical rendering is always a gamble. Sometimes the only way to see what works is to throw it all into the pot and see what developers can make of it, and you can't do that without releasing it to consumers, as developers won't bother to implement something nobody can use.



Posted · Original Poster (OP)
40 minutes ago, Mira Yurizaki said:

He runs a website that just happens to have a YouTube channel, because sometimes listening to someone is better than reading a wall of text :3

In that specific case, I'd take the wall of text :D 


Main rig: Asus Maximus VIII Hero, i7-6700k stock, Noctua D14, G.Skill Ripjaws V 3200 2x8GB, Gigabyte GTX 1650, Corsair HX750i, In Win 303 NVIDIA, Samsung SM951 512GB, WD Blue 1TB, HP LP2475W 1200p wide gamut

Gaming system: Asrock Z370 Pro4, i7-8086k stock, Noctua D15, Corsair Vengeance LPX RGB 3000 2x8GB, Gigabyte RTX 2070, Fractal Edison 550W PSU, Corsair 600C, Optane 900p 280GB, Crucial MX200 1TB, Sandisk 960GB, Acer Predator XB241YU 1440p 144Hz G-sync

Ryzen rig: Asrock B450 ITX, R5 3600, Noctua D9L, G.SKill TridentZ 3000C14 2x8GB, Gigabyte RTX 2070, Corsair CX450M, NZXT Manta, WD Green 240GB SSD, LG OLED55B9PLA

VR rig: Asus Z170I Pro Gaming, i7-6700T stock, Scythe Kozuti, Kingston Hyper-X 2666 2x8GB, Zotac 1070 FE, Corsair CX450M, Silverstone SG13, Samsung PM951 256GB, Crucial BX500 1TB, HTC Vive

Gaming laptop: Asus FX503VD, i5-7300HQ, 2x8GB DDR4, GTX 1050, Sandisk 256GB + 480GB SSD

Total CPU heating: i7-8086k, i3-8350k, i7-7920X, 2x i7-6700k, i7-6700T, i5-6600k, i3-6100, i7-5930k, i7-5820k, i7-5775C, i5-5675C, 2x i7-4590, i5-4570S, 2x i3-4150T, E5-2683v3, 2x E5-2650, E5-2667, R7 3700X, R5 3600, R5 2600, R7 1700


NVIDIA 436.02 Installer Buggy, Always Installs GeForce Experience, No GDPR Consent:

 

Quote

The installer of these drivers appears to have a major bug that forces the installation of GeForce Experience without obtaining GDPR-compliant consent from the user. With the ratification of GDPR, NVIDIA driver installers present a selection screen right at the start of the installation, which lets users opt to install GeForce Experience (and give their GDPR consent in doing so), but a second option lets users decline GDPR consent, forcing the installer to install GeForce drivers without GeForce Experience. A bug with the installer of GeForce 436.02 WHQL disregards the user's choice at this screen, and installs GeForce Experience without the GDPR-mandated user-consent.

 

Making matters far worse is the fact that you cannot deselect GeForce Experience from the list of components in the Custom Install screen. The Custom Install list lets you make the installer skip installation of optional components that are otherwise installed by default in Express Install (GeForce Experience features in this list only if a user gives GDPR consent in the previous screen). 

 

Source: https://www.techpowerup.com/258446/nvidia-436-02-installer-buggy-always-installs-geforce-experience-no-gdpr-consent

 

Pretty sure that, because of this, they took the link down on the NVIDIA website, as I'm not able to download the new driver anymore; it takes me to a "404 - Not Found" page, while the links for the previous drivers are working fine.


.:. Y Gwir Yn Erbyn Y Byd ! .:.

] Vittoria, o moriamo tutti ! [

How to free up space on your SSD

Kymatica Revision:

CPU: Intel Core i7-2600k @ 4.3GHz Motherboard: ASRock Z68 Extreme4 Gen3 GPU: Gigabyte GeForce GTX 1660 Ti OC 6G 2x Windforce Memory: G.Skill Ripjaws X Series 16GB @ 2133MHz @ 9-10-11-28 SSD: Crucial M500 240GB (OS/Programs/Path of Exile/Grim Dawn) HDD1: WD 1TB Blue (Diablo III/Other Games/Storage/Media) HDD2: Seagate Barracuda 7.2K 500GB (Backup) HDD3: WD Caviar 7.2K 500GB (Backup) HDD4: WD Elements 4TB External WDBWLG0040HBK-NESN (Backup/Additional Storage) CPU Cooling: Corsair Hydro Series H100 in Pull (w/ 2x Delta FFB1212EH 120mm) Case Fans: Noctua NF F12 industrialPPC-2000 (x3 120mm) PSU: Seasonic X-Series X-1050 1050W Case: Cooler Master HAF 922 Monitor: Samsung C27F396 Curved 27-Inch Freesync Monitor (@ 1440p @ 72Hz) Keyboard: Cooler Master Storm Trigger-Z (Cherry MX Brown Switches) Mouse: Roccat Kone XTD Mousepad: Corsair MM350 Premium Audio: Logitech X-530 5.1 Speaker System Headset: Corsair VOID Stereo Gaming Headset (w/ Sennheiser 3D G4ME 7.1 Surround Amplifier) OS: Windows 10 Professional (Version 1903 OS Build 18362.592)

5 minutes ago, BiG StroOnZ said:

NVIDIA 436.02 Installer Buggy, Always Installs GeForce Experience, No GDPR Consent:

 

 

Source: https://www.techpowerup.com/258446/nvidia-436-02-installer-buggy-always-installs-geforce-experience-no-gdpr-consent

 

Pretty sure that, because of this, they took the link down on the NVIDIA website, as I'm not able to download the new driver anymore; it takes me to a "404 - Not Found" page, while the links for the previous drivers are working fine.

 

Yeah, that's rubbish. I don't want GF "experience" anywhere near my system. I'll happily wait for the correct version, and then see if the performance improvements are there for Volta also.


5820K 4.0GHz | NH D15S | 32 GB RAM | Titan V | ASUS PG348Q+MG278Q

 

1 hour ago, Alex Atkin UK said:

I think one of the comments on the video says it better:
 

My point from the start was that comparing a fairly simple (compared to DLSS) sharpening filter to DLSS is flawed, at least if you are going to assume one is always better than the other.

What nVidia were trying to do was ambitious and hopefully it will pay off eventually. Was it worth the cost to the consumer? Arguably no, but they had to take a gamble that it would make RTX more useful in the short term, and indeed they failed.

If you've been a PC gamer for a few decades this is nothing new. Pushing big leaps in graphical rendering is always a gamble. Sometimes the only way to see what works is to throw it all into the pot and see what developers can make of it, and you can't do that without releasing it to consumers, as developers won't bother to implement something nobody can use.

How it's done is irrelevant. What matters is the outcome, and AMD's implementation is objectively superior.

 

Also hilarious that you call Hardware Unboxed a random YouTuber, but you want me to consider the OPINION of an actual random YouTube post.




I've always said DLSS is stupid, especially how they made it so you need some special AI nonsense and special profiles for each game, instead of just making reverse DSR, which is what this essentially is. Pointless waste of resources, especially since everyone is raving about 4K and "ultimate sharpness" and then people with stupidly high-end cards are gonna make the image more blurry. WHY?! It makes absolutely no sense. I like AMD's approach much more. It's a subtle sharpening effect, but it's there and it's essentially free. So, excellent! I just wish it were 100% post-processing instead of having games implement it on the engine level; free and open source or not, it's a tacky middle layer that's unnecessary and just limits the adoption rate.
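For what it's worth, the "subtle sharpening effect" described above is, at its core, just a cheap convolution that boosts local contrast, which is why it costs almost nothing. Here's a minimal illustrative sketch; this is NOT AMD's actual CAS/RIS code (which adapts its strength per pixel), just the classic fixed 3x3 sharpen kernel:

```python
import numpy as np

# Classic 3x3 sharpen kernel: boost the centre pixel against its
# four neighbours. The kernel sums to 1, so flat regions are unchanged.
KERNEL = np.array([[ 0, -1,  0],
                   [-1,  5, -1],
                   [ 0, -1,  0]], dtype=float)

def sharpen(img):
    """Apply the sharpen kernel to a 2D greyscale image (borders copied)."""
    h, w = img.shape
    out = img.astype(float).copy()
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            out[y, x] = np.clip((KERNEL * img[y-1:y+2, x-1:x+2]).sum(), 0, 255)
    return out
```

Because the kernel sums to 1, a flat region passes through untouched; the effect only shows at edges, which overshoot and undershoot. That overshoot is also exactly where the "grainy" oversharpening artifacts mentioned earlier come from.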

9 minutes ago, RejZoR said:

I just wish it were 100% post-processing instead of having games implement it on the engine level; free and open source or not, it's a tacky middle layer that's unnecessary and just limits the adoption rate.

You can use AMD FidelityFX with ReShade; someone ported it.


| Intel i7-3770@4.2Ghz | Asus Z77-V | Zotac 980 Ti Amp! Omega | DDR3 1800mhz 4GB x4 | 300GB Intel DC S3500 SSD | 512GB Plextor M5 Pro | 2x 1TB WD Blue HDD |
 | Enermax NAXN82+ 650W 80Plus Bronze | Fiio E07K | Grado SR80i | Cooler Master XB HAF EVO | Logitech G27 | Logitech G600 | CM Storm Quickfire TK | DualShock 4 |

Posted · Original Poster (OP)

Just checked on my game system at home. Driver is available via GFE. Haven't checked manual download again.



6 minutes ago, xAcid9 said:

You can use AMD FidelityFX with ReShade; someone ported it.

Interesting, but too fiddly and prone to getting you banned, since it hooks into things and anti-cheats don't like that.

1 hour ago, ivan134 said:

How it's done is irrelevant. What matters is the outcome, and AMD's implementation is objectively superior.

 

Also hilarious that you call Hardware Unboxed a random YouTuber, but you want me to consider the OPINION of an actual random YouTube post.

If they think simple image sharpening is better, that is their opinion; you should take whatever a YouTuber says in a subjective comparison test as their opinion. Both implementations over-process the image and make the game look like crap, IMO.

3 minutes ago, Blademaster91 said:

If they think simple image sharpening is better, that is their opinion; you should take whatever a YouTuber says in a subjective comparison test as their opinion. Both implementations over-process the image and make the game look like crap, IMO.

Exactly my point, thank you.

The same way some people think RTX is not worth the performance cost, some of us are happy to have it. Especially if you've been around PC gaming through every major transition since 3D cards were first developed and have seen it all before: you have to start somewhere.



33 minutes ago, Alex Atkin UK said:

Exactly my point, thank you.

The same way some people think RTX is not worth the performance cost, some of us are happy to have it. Especially if you've been around PC gaming through every major transition since 3D cards were first developed and have seen it all before: you have to start somewhere.

Some people are just hell-bent on hating Nvidia. What's happened here is that Nvidia have simply added more options, yet as always people are crying like they are killing babies.


QuicK and DirtY. Read the CoC it's like a guide on how not to be moron.  Also I don't have an issue with the VS series.

Sometimes I miss contractions like n't on the end of words like wouldn't, couldn't and shouldn't.    Please don't be a dick,  make allowances when reading my posts.


So does this mean you can finally output 1080p to a 4K display at 1:4 and have it look essentially the same as it would on a native 1080p screen without having it blur horribly for no reason, or am I misunderstanding what this accomplishes?


Was just about to buy an AMD card. Is Nvidia's low latency mode the same as AMD's or worse? The entire reason I was going to switch to AMD was because I heard it had less latency because of this.

Posted · Original Poster (OP)
2 hours ago, Ryan_Vickers said:

So does this mean you can finally output 1080p to a 4K display at 1:4 and have it look essentially the same as it would on a native 1080p screen without having it blur horribly for no reason, or am I misunderstanding what this accomplishes?

I went to test it, but failed on account of trying it on a Pascal card; it's currently a Turing-only feature. Doh!
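The 1:4 case Ryan_Vickers is asking about is plain integer (nearest-neighbour) scaling: every 1080p pixel becomes an identical 2x2 block of 4K pixels, with no interpolation blur. A toy sketch of the idea (illustrative only, obviously not the driver's implementation):

```python
import numpy as np

def integer_scale(frame, factor):
    """Nearest-neighbour upscale: repeat each pixel `factor` times per axis."""
    return np.repeat(np.repeat(frame, factor, axis=0), factor, axis=1)

# A 1080p frame scaled 2x per axis fills a 4K panel exactly (1 pixel -> 4).
frame_1080p = np.zeros((1080, 1920, 3), dtype=np.uint8)
frame_4k = integer_scale(frame_1080p, 2)
print(frame_4k.shape)  # (2160, 3840, 3)
```

Because each source pixel maps to a whole number of destination pixels, the result looks like a native 1080p image viewed up close, rather than the smeared output of bilinear scaling.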

 

1 hour ago, Paranoid Kami said:

Was just about to buy an AMD card. Is Nvidia's low latency mode the same as AMD's or worse? The entire reason I was going to switch to AMD was because I heard it had less latency because of this.

Shall have to wait for others to test it. It might be of some benefit to insane-fps gamers (way above monitor refresh), but if you're already running within G-Sync/FreeSync range, I don't personally see value in this.



1 hour ago, Paranoid Kami said:

Was just about to buy an AMD card. Is Nvidia's low latency mode the same as AMD's or worse? The entire reason I was going to switch to AMD was because I heard it had less latency because of this.

My understanding of how AMD's anti-lag system works is that it dynamically learns how long it takes to render a frame. If it knows your next monitor refresh is in 16 ms and it takes only 6 ms to render the frame, then rather than starting immediately the way it normally would, it waits 10 ms, allowing new things to happen in the game (like the user making an input), then renders and delivers the finished frame at the last second. When you finally see it, it's only 6 ms old instead of 16, effectively shaving 10 ms off the total input lag.
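That delay heuristic can be sketched in a few lines. To be clear, this is purely illustrative: AMD hasn't published its algorithm, and the names, safety margin, and "use the worst recent frame time" choice here are all my assumptions:

```python
REFRESH_INTERVAL_MS = 16.0   # ~60 Hz refresh
SAFETY_MARGIN_MS = 1.0       # slack in case the next frame runs long

def anti_lag_delay(recent_render_times_ms):
    """How long to wait before starting the next frame, so it finishes
    just before the refresh and samples input as late as possible."""
    # Use the worst recent frame time as the estimate, so a sudden
    # load spike is less likely to make us miss the refresh entirely.
    estimate = max(recent_render_times_ms)
    return max(0.0, REFRESH_INTERVAL_MS - estimate - SAFETY_MARGIN_MS)
```

With ~6 ms frames on a 60 Hz display this waits roughly 9 ms, so the input the frame samples is that much fresher; once frames take longer than a refresh, it stops waiting at all. The margin is the interesting tuning knob: too small and a load spike causes the stutter described below, too large and you give back the latency win.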

 

I actually had the idea to do this myself once not too long ago but dismissed it as impractical and ineffective.  I assume they did some testing and found otherwise, but to be honest, I don't actually know.  I've yet to see any third party reviews or tests of it, so I'm not sure we can actually say with any confidence whether AMD's system or nvidia's system is better, or whether either is actually any good at all.  If anyone does have info, please let us know xD

 

One of the reasons I suspected it would not work well is that the time to render a frame is not constant - the demand of a game changes from minute to minute, second to second, and even frame to frame. If you intentionally wait, anticipating you'll have enough time, only to hit a sudden increase in load that leaves you without enough time when you otherwise would have had it, that's going to cause stutter and lower frame rates that otherwise would not have been an issue.

Another reason I wonder about the usefulness is that if you can render well above your target display rate (say, 200+ FPS on a 60 Hz monitor), the GPU doesn't just stop after rendering that first frame and wait for the screen to refresh - that's how vsync works, more or less. No, it starts rendering a new frame, and if it finishes in time you get that newer one instead; if not, at least it has the previous one to fall back on. This means you're already seeing "recent" information, and this system isn't likely to improve on that significantly.

