
NVIDIA 436.02 drivers draw inspiration from AMD, Intel

porina
2 hours ago, Otto_iii said:

I don't think it's cool people are dogpiling on you, but honestly you kind of deserved it. It's not just Hardware Unboxed; IIRC Steve from Gamers Nexus, Wendell from Level1Techs (and/or EposVox), Anthony from LTT... I could go on and on listing every tech YouTuber I know, and there wasn't a single one that didn't praise the feature compared to NVIDIA's half-baked, performance-tanking DLSS.

Maybe DLSS seemed technically neat, but so was the turbine car, yet there is a good reason nobody adopted it.

Like I said, my issue wasn't that adaptive filtering/sharpening isn't useful, or even better than DLSS right now. It's that, just as with RTX, people are dismissing the feature outright and are even angry at NVIDIA for including it in their GPUs.

This can be directly compared to the reaction when programmable shaders were introduced: fixed-function performance suffered then too, because transistor space was dedicated to something no game used at the time, rather than to something games already could use.

Even the respected YouTubers you mentioned admit that RTX at least, and possibly DLSS, has potential for the future. Though they don't always remember to mention that, as time has shown, these things HAVE to be put into a commercial product FIRST before developers will try to find a way to make use of them. It's also not unusual for first-generation versions to be a bit underpowered; you have to start somewhere, and it's impossible to know how much raw power you need until developers start trying to optimise for it.

Do you not see the irony that the very thing people complained about before, programmable shaders, is EXACTLY what is powering the thing people now say is so much better than DLSS? It's the same narrow-mindedness, and that's why I feel the need to call it out.

Yes, DLSS is junk today, but all this aggression towards NVIDIA for attempting it is extremely short-sighted.

Router:  Intel N100 (pfSense) WiFi6: Zyxel NWA210AX (1.7 Gbit peak at 160 MHz)
WiFi5: Ubiquiti NanoHD OpenWRT (~500 Mbit at 80 MHz) Switches: Netgear MS510TXUP, MS510TXPP, GS110EMX
ISPs: Zen Full Fibre 900 (~930 Mbit down, 115 Mbit up) + Three 5G (~800 Mbit down, 115 Mbit up)
Upgrading Laptop/Desktop CNVIo WiFi 5 cards to PCIe WiFi6e/7


2 hours ago, Otto_iii said:

So far as I am aware, when the US Air Force (I believe it was them) did tests on this back in the 1980s or so, people could pick out varied flashes at at least about 200-250 Hz, and ghosting blur from flashes (less precisely) at 1000 Hz. I can't source this, but I'm fairly sure the data is out there. I suppose in regards to perfectly crisp motion pictures 80 fps might be a relevant number, but monitors clearly work differently enough that the same "80 fps" looks far worse on a computer monitor than whatever was used in the study you seem to be referencing.

If we're talking about the same thing, that test was something like 899 black frames and one frame of an aircraft, which is the problem when trying to argue you can discern between similar images.

2 hours ago, Otto_iii said:

Would be interesting to compile a bunch of links to academic studies related to this.

There are (or at least were) about 7 peer-reviewed papers that went into great detail about the visual processing system and FPS. I had linked them in previous discussions on these forums (so the links are still here somewhere); however, over the last 7 years, as articles got moved from journals and went in and out of publication, the links have all gone dead.

2 hours ago, Otto_iii said:

All I know is that, as somebody who's done high-speed sports in the past, I'm always amazed at people who think 75 Hz looks fine, but I guess everybody perceives things a bit differently.

You just have to wave your arm in front of your face to see the blur and stagger in the motion, as the eyes can't properly capture all the photons.

2 hours ago, porina said:

My gut feeling is that passive observation and an active influence would lead to very different results here. I use a 144 Hz monitor, and just by looking at how the mouse pointer moves you can tell roughly what refresh rate it is set to. Even 144 Hz is not perfectly smooth, and it is easy to see the movement steps between frames. I don't know how much faster it would have to be to no longer give that impression.

That's because you have distance across the screen, time, and a hard edge to observe. As Ryan pointed out, for the pointer to land on every pixel as it moves across the screen at that speed, you'd need stupidly high FPS. And even then you'd likely still see blur and be unable to accurately pinpoint where the cursor was, due to said visual persistence.
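
To put rough numbers on that (the figures below are assumptions for illustration, not from any study), here is a quick Python sketch of how far a hard-edged pointer jumps between refreshes:

```python
# Back-of-envelope: how far a mouse pointer jumps between refreshes.
# Assumed example numbers, purely for illustration.
screen_width_px = 2560   # horizontal resolution
sweep_time_s = 0.25      # time taken to flick the pointer across the screen

for refresh_hz in (60, 144, 240, 1000):
    step_px = screen_width_px / (refresh_hz * sweep_time_s)
    print(f"{refresh_hz:>4} Hz: pointer jumps ~{step_px:.0f} px per frame")

# Refresh rate needed for the pointer to land on every pixel column:
print(f"every-pixel rate: {screen_width_px / sweep_time_s:.0f} Hz")  # 10240 Hz
```

Even at 240 Hz that sweep skips roughly 43 pixels per frame, which is why the stepping is so easy to see on a hard edge like a cursor.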

In all of these discussions, I think the one thing that never gets qualified is distance of travel; in fact, Ryan is the only person I know of to have raised it. That actually puts into perspective the many discussions where people are adamant that the human eye is analogue and delivers a perfect stream of information. None of the research tells us this, and many people just don't understand the visual system well enough to understand why.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


With the new low latency mode, what are others' impressions of "On" vs "Ultra"? Gonna give it a shot, see how it goes.

1 hour ago, mr moose said:

-snip-

May I surmise that, at least in part, what you are saying is that reduction in motion blur, grey-to-grey response time, etc. is as or more important than, say, 120 vs 240 Hz? I think most people would agree with that; it's just that often (not always, of course) the higher the Hz, the better those things become. I often talk about how I remember, or even lust after, the days I spent growing up with CRT gaming (the 800x600 days), because I could look around at any speed and blur never happened, even if the frame rate was only, say, 30-60 fps.

One specific memory I personally have from that era that is still relevant, IMO: when spectating each other in games, my friends had already adopted the contemporary practice of snap-looking. I was one of the few still on a CRT, so I could look around freely rather than being forced to snap, and everybody found it odd or interesting to watch me. Only in recent years have I realised a big part of it was that I kept using a CRT for a lot longer than most of my friends, who had very early flat screens with who-knows-how-bad blur, so the meta on those was not just snap-shooting but snapping to look. You still see snap-to-look from a lot of people even on 144 Hz, so although it has all improved, not everything has changed since the early days of flat-screen LCDs.


30 minutes ago, Otto_iii said:

May I surmise that, at least in part, what you are saying is that reduction in motion blur, grey-to-grey response time, etc. is as or more important than, say, 120 vs 240 Hz? I think most people would agree with that; it's just that often (not always, of course) the higher the Hz, the better those things become. I often talk about how I remember, or even lust after, the days I spent growing up with CRT gaming, because I could look around at any speed and blur never happened, even if the frame rate was only, say, 30-60 fps.

Yes, or at least something in that realm. I am reluctant to state it as an absolute, because who knows what we'll learn tomorrow; however, if lower lag causes more accurate screen updates, then it might trump higher FPS, which would actually fit with the research I have read. Whereas the usual rhetoric on the topic is that higher FPS is essential because we can see higher FPS. I usually come unstuck arguing this because I have never been able to properly explain that higher FPS might just be a non-causal symptom of something else that actually makes the game better.

30 minutes ago, Otto_iii said:

One specific memory from that era that is still relevant, IMO: when spectating each other in games, my friends had already adopted the contemporary practice of snap-looking. I was one of the few still on a CRT, so I could look around freely rather than being forced to snap, and everybody found it odd or interesting to watch me. Only in recent years have I realised a big part of it was that I kept using a CRT for a lot longer than most of my friends, who had very early flat screens with who-knows-how-bad blur, so the meta on those was not just snap-shooting but snapping to look. You still see snap-to-look from a lot of people even on 144 Hz, so although it has all improved, not everything has changed since the early days of flat-screen LCDs.

That is the sole reason I did not upgrade to a flat panel for years. Also an interesting piece of trivia: when LCDs first came out, there used to be a lot of arguments about why they were worse for video (especially when upscaled). Several people tried to claim that CRTs were analogue and thus had no upper resolution, and that that was why they were better. Having been an electronics tech for several years prior, I tried explaining how the shadow mask works and how it effectively limits the resolution; CRTs even used to be sold with a dot pitch spec (the distance between colour dots). However, people don't like being corrected when they think they know it all.
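
To make the shadow mask point concrete, here is a rough back-of-envelope calculation (the width and pitch are assumed example figures, and it glosses over the fact that dot pitch is usually quoted diagonally):

```python
# Rough upper bound on a CRT's usable horizontal resolution from its dot pitch.
# Assumed example figures: ~320 mm viewable width, 0.25 mm dot pitch.
viewable_width_mm = 320
dot_pitch_mm = 0.25  # spacing between same-colour phosphor dot triads

max_horizontal_dots = viewable_width_mm / dot_pitch_mm
print(f"~{max_horizontal_dots:.0f} distinct dot triads across")  # ~1280
```

Push the input signal much past that and neighbouring pixels start sharing phosphor triads, which is exactly the upper limit the "CRTs are analogue so have no resolution" argument missed.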

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


26 minutes ago, Mr Prince said:

Integer scaling can be done by an app from the Steam store too; no idea why it only supports Turing, it's obviously something every GPU could do.

It's a wider question: why hasn't it been implemented before now? Unless I missed it, it isn't offered by AMD, and Intel have talked about it but only for their next generation. Now NVIDIA has implemented it on Turing.

It seems like a simple operation: duplicate pixels into 2x2 (or bigger) blocks. Maybe it's something that isn't as simple when you dig into it. Maybe NVIDIA and Intel want to differentiate their latest generation from previous ones? Maybe they need to implement it differently for past generations and have only had time to do it on Turing so far?
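
For what it's worth, the naive version of the operation really is just pixel duplication. A minimal sketch in Python/numpy, illustrating the maths only (it says nothing about how a driver has to do this at scan-out, every frame, for every refresh):

```python
import numpy as np

def integer_scale(frame: np.ndarray, factor: int) -> np.ndarray:
    """Nearest-neighbour integer upscale: each source pixel
    becomes a factor x factor block of identical pixels."""
    return frame.repeat(factor, axis=0).repeat(factor, axis=1)

# e.g. a 1080p RGB frame doubled to fill a 4K (3840x2160) panel
frame = np.zeros((1080, 1920, 3), dtype=np.uint8)
scaled = integer_scale(frame, 2)
assert scaled.shape == (2160, 3840, 3)
```

If the complication isn't in the maths, it presumably lives in the display pipeline, which would fit the "not that simple" answers that come up later in the thread.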

Main system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, Corsair Vengeance Pro 3200 3x 16GB 2R, RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible


15 hours ago, porina said:

It's a wider question: why hasn't it been implemented before now? Unless I missed it, it isn't offered by AMD, and Intel have talked about it but only for their next generation. Now NVIDIA has implemented it on Turing.

It seems like a simple operation: duplicate pixels into 2x2 (or bigger) blocks. Maybe it's something that isn't as simple when you dig into it. Maybe NVIDIA and Intel want to differentiate their latest generation from previous ones? Maybe they need to implement it differently for past generations and have only had time to do it on Turing so far?

I've always been confused as to why monitors didn't do integer scaling when the input resolution divides evenly into the panel resolution. It would have solved the blurry (poor upscaling) issue people have been moaning about since day one, at least at certain resolutions.

I was always told "it's not that simple", though never given a good explanation of exactly why.

Router:  Intel N100 (pfSense) WiFi6: Zyxel NWA210AX (1.7 Gbit peak at 160 MHz)
WiFi5: Ubiquiti NanoHD OpenWRT (~500 Mbit at 80 MHz) Switches: Netgear MS510TXUP, MS510TXPP, GS110EMX
ISPs: Zen Full Fibre 900 (~930 Mbit down, 115 Mbit up) + Three 5G (~800 Mbit down, 115 Mbit up)
Upgrading Laptop/Desktop CNVIo WiFi 5 cards to PCIe WiFi6e/7

