porina

Nvidia 436.02 drivers, draws inspiration from AMD, Intel

Recommended Posts

2 hours ago, Otto_iii said:

I don't think it's cool that people are dogpiling on you, but honestly you kinda deserved it. It's not just Hardware Unboxed; IIRC Steve from Gamers Nexus, Wendell from Level1Techs (and/or EposVox), Anthony himself from LTT... I could go on and on listing all the tech YouTubers I know, and there wasn't a single one that didn't praise the feature compared to Nvidia's half-baked, performance-tanking DLSS.

Maybe DLSS seemed technically neat, but so was the turbine car, and there is a good reason nobody adopted that.

Like I said, my issue wasn't that adaptive filtering/sharpening isn't useful, or indeed better than DLSS right now. It's that, like RTX, people are dismissing the feature outright and are even angry at NVIDIA for including it in their GPUs.

This can be directly compared to the reaction when programmable shaders were introduced: fixed-function performance suffered then too, because transistor space was dedicated to something no game used at the time rather than something they could have used.

Even the respected YouTubers you mentioned do admit that RTX at least, and possibly DLSS, has potential for the future. Though they do not always remember to mention that, as time has shown, those things HAVE to be put into a commercial product FIRST before developers will try to find a way to make use of them. It's also not unusual for first-generation versions to be a bit under-powered; you have to start somewhere, and it's impossible to know how much raw power you need until developers start trying to optimise for it.

Do you not see the irony that the very thing people complained about before, programmable shaders, is EXACTLY what is powering the feature people now say is so much better than DLSS? It's the same narrow-mindedness, which is why I feel the need to call it out.

Yes, DLSS is junk today, but all this aggression towards NVIDIA for attempting it is extremely short-sighted.
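For reference, the shader-based sharpening being held up against DLSS comes down to a per-pixel filter along these lines. This is a generic unsharp-mask style sketch of my own, not AMD's CAS or Nvidia's actual implementation:

```python
import numpy as np

def sharpen(image: np.ndarray, amount: float = 0.5) -> np.ndarray:
    """Generic sharpening: boost each pixel by its difference from the
    average of its four neighbours. `image` is a 2D luminance array in [0, 1]."""
    blurred = (np.roll(image, 1, axis=0) + np.roll(image, -1, axis=0) +
               np.roll(image, 1, axis=1) + np.roll(image, -1, axis=1)) / 4.0
    return np.clip(image + amount * (image - blurred), 0.0, 1.0)
```

The point is that this sort of per-pixel filter is trivially expressible as a pixel shader, which is exactly why programmable shaders make it possible.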


Router: i5-7200U appliance running pfSense.
ISP: Zen Unlimited Fibre 2 (66Mbit) + Plusnet Unlimited Fibre Extra. (56Mbit)

2 hours ago, Otto_iii said:

So far as I am aware, when the US Air Force (I believe it was them) did tests on this around 1980, people could pick out varied flashes at at least about 200-250 Hz, and ghosting blur from flashes (less precisely) at 1000 Hz. I can't source this, but I'm fairly sure the data is out there. I suppose with regard to perfectly crisp motion pictures 80 fps might be a relevant number, but monitors clearly work differently enough that the same "80 fps" looks far worse on a computer monitor than whatever was used for the study you seem to be referencing.

 

If we are talking about the same test, it was something like 899 black frames and one frame of an aircraft, which is exactly the problem when trying to use it to argue you can discern between similar images.

2 hours ago, Otto_iii said:


It would be interesting to compile a list of links to academic studies related to this.

 

 

There are (or at least were) about 7 peer-reviewed papers that went into great detail on the visual processing system and FPS. I had linked them in previous discussions on these forums (so the links are still here somewhere); however, over the last 7 years, as articles get moved between journals and go in and out of publication, the links have all gone dead.

2 hours ago, Otto_iii said:


All I know is, as somebody who's done high-speed sports in my past, I'm always amazed at people who think 75 Hz looks fine, but I guess everybody perceives things a bit differently.

You just have to wave your arm in front of your face to see the blur and stutter in the motion, as the eyes can't properly capture all the photons.

 

 

2 hours ago, porina said:

My gut feeling is that passive observation and an active influence would lead to very different results here. I use a 144 Hz monitor and just by looking at how the mouse pointer moves you can tell roughly what refresh rate it is set on. Even 144 Hz is not perfectly smooth and it is easy to see the movement steps between frames. I don't know how much faster it would have to be to give that impression.

 

That's because you have distance across the screen, time, and a hard edge to observe. As Ryan pointed out, for the pointer to land on every pixel as it moves across the screen at that speed you'd need stupidly high FPS. And even then you'd likely still see blur and be unable to accurately pinpoint where the cursor was, due to said visual persistence.
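To put rough numbers on the "stupidly high FPS" point, here is a back-of-the-envelope sketch in Python; the 2560-pixel width and half-second sweep are my own assumed figures, not anything from the thread:

```python
# Rough illustration only: assumed 2560 px wide screen, cursor swept across
# it in half a second. How far apart do the drawn cursor positions land at
# each refresh rate?
SCREEN_WIDTH_PX = 2560   # assumed monitor width
SWEEP_TIME_S = 0.5       # assumed time for the cursor to cross the screen

for hz in (60, 144, 240, 1000):
    frames_in_sweep = hz * SWEEP_TIME_S
    step_px = SCREEN_WIDTH_PX / frames_in_sweep
    print(f"{hz:>4} Hz: cursor drawn every ~{step_px:.0f} px")

# For the cursor to land on every single pixel of that sweep you would need
# 2560 / 0.5 = 5120 frames per second.
```

Even at 240 Hz the cursor skips roughly 20 pixels per frame under those assumptions, which is why the stepping is so easy to see.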

 

 

In all of these discussions, I think the one thing that never gets qualified is distance of travel; in fact, Ryan is the only person I know of to have raised it. That actually puts into perspective many discussions where people are adamant the human eye is analogue and delivers a perfect stream of information, because none of the research tells us this, and many people just don't understand the visual system well enough to understand why.


QuicK and DirtY. Read the CoC it's like a guide on how not to be moron.  Also I don't have an issue with the VS series.

Sometimes I miss contractions like n't on the end of words like wouldn't, couldn't and shouldn't.    Please don't be a dick,  make allowances when reading my posts.


With low latency mode, what are others' impressions of "On" vs "Ultra"? Gonna give it a shot and see how it goes.
 

1 hour ago, mr moose said:

-snip-

May I surmise that, at least in part, what you are saying is that reducing motion blur, grey-to-grey response time, etc. is as important as, or more important than, say, 120 vs 240 Hz? I think most people would agree with that; it's just that often (not always, of course) the higher the refresh rate, the better those things become. I often talk about how I remember, or even lust after, the days I spent growing up with CRT gaming (the 800x600 days), because I could look around at any speed and blur never happened, even if the refresh rate was, say, 30-60 fps or so.

One specific memory I have from that era that is still relevant, IMO: when spectating each other in games, my friends had already adopted the contemporary practice of snap-looking. I was one of the few still on a CRT, so I could look around quickly rather than being forced to snap, and everybody found it odd or interesting to watch me. Only in recent years have I realised a big part of it was that I had been using a CRT for a lot longer than most of my friends, who had very early flat screens with god knows how much blur, so the meta with those was not just snap-shotting but snapping to look. You still see snap-to-look with a lot of people even on 144 Hz, so although it has all improved, not everything has changed from the early days of flat-screen LCDs.

30 minutes ago, Otto_iii said:



 

May I surmise that, at least in part, what you are saying is that reducing motion blur, grey-to-grey response time, etc. is as important as, or more important than, say, 120 vs 240 Hz? I think most people would agree with that; it's just that often (not always, of course) the higher the refresh rate, the better those things become. I often talk about how I remember, or even lust after, the days I spent growing up with CRT gaming, because I could look around at any speed and blur never happened, even if the refresh rate was, say, 30-60 fps or so.

Yes, or at least something in that realm. I am reluctant to state it as an absolute, because who knows what we'll learn tomorrow; however, if lower lag leads to more accurate screen updates, then it might trump higher FPS, which would actually fit with the research I have read. Whereas the usual rhetoric on the topic is that higher FPS is essential because we can see higher FPS. I usually come unstuck arguing this because I have never been able to properly explain that higher FPS might just be a non-causal symptom of something else that actually makes the game better.

 

 

30 minutes ago, Otto_iii said:


One specific memory from that era that is still relevant, IMO: when spectating each other in games, my friends had already adopted the contemporary practice of snap-looking. I was one of the few still on a CRT, so I could look around quickly rather than being forced to snap, and everybody found it odd or interesting to watch me. Only in recent years have I realised a big part of it was that I had been using a CRT for a lot longer than most of my friends, who had very early flat screens with god knows how much blur, so the meta with those was not just snap-shotting but snapping to look. You still see snap-to-look with a lot of people even on 144 Hz, so although it has all improved, not everything has changed from the early days of flat-screen LCDs.

 

That is the sole reason I did not upgrade to a flat panel for years. Also, an interesting piece of trivia: when LCDs first came out there used to be a lot of arguments about why they were worse for video (especially when upscaled). Several people tried to claim that CRTs were analogue, thus had no upper resolution limit, and that is why they were better. Having been an electronics tech for several years prior, I tried explaining how the shadow mask works and how it effectively limits the resolution. CRTs were even sold with a dot pitch spec (the distance between colour dots); however, people don't like being corrected when they think they know it all.
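To illustrate the dot pitch point with made-up but typical numbers (my own assumptions, not figures from the post above):

```python
# Rough illustration: the shadow mask's dot pitch caps how much horizontal
# detail a CRT can resolve, "analogue" or not.
viewable_width_mm = 325   # assumed viewable width of a typical 17" CRT
dot_pitch_mm = 0.26       # assumed dot pitch from the spec sheet

triads_across = viewable_width_mm / dot_pitch_mm
print(f"~{triads_across:.0f} phosphor triads across the screen")
# ~1250, so driving it much beyond ~1280 horizontal pixels adds no real detail.
```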


QuicK and DirtY. Read the CoC it's like a guide on how not to be moron.  Also I don't have an issue with the VS series.

Sometimes I miss contractions like n't on the end of words like wouldn't, couldn't and shouldn't.    Please don't be a dick,  make allowances when reading my posts.

Posted · Original Poster
26 minutes ago, Mr Prince said:

Integer scaling can be done by an app from the Steam store too; no idea why it only supports Turing, it's obviously something every GPU could support.

The wider question is: why hasn't it been implemented before now? Unless I missed it, it isn't offered by AMD; Intel have talked about it, but only for their next gen. Now Nvidia has implemented it on Turing.

 

It seems like a simple operation: duplicate pixels into 2x2 (or bigger) blocks. Maybe it isn't as simple when you dig into it. Maybe Nvidia, and Intel, want to differentiate their latest from previous generations? Maybe they need to implement it differently for past generations, and have only had time to do it on Turing so far?
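For what it's worth, the duplication itself really is simple in isolation. A minimal NumPy sketch of my own (not how any driver actually implements it; the hard part is presumably doing this in the display pipeline rather than the maths):

```python
import numpy as np

def integer_scale(frame: np.ndarray, factor: int) -> np.ndarray:
    """Nearest-neighbour integer scaling: duplicate every pixel into a
    factor x factor block. `frame` is (height, width) or (height, width, channels)."""
    return np.repeat(np.repeat(frame, factor, axis=0), factor, axis=1)

# Example: a 1920x1080 frame doubled to 3840x2160 with no interpolation.
frame = np.zeros((1080, 1920, 3), dtype=np.uint8)
print(integer_scale(frame, 2).shape)  # (2160, 3840, 3)
```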


Main rig: Asus Maximus VIII Hero, i7-6700k stock, Noctua D14, G.Skill Ripjaws V 3200 2x8GB, Gigabyte GTX 1650, Corsair HX750i, In Win 303 NVIDIA, Samsung SM951 512GB, WD Blue 1TB, HP LP2475W 1200p wide gamut

Gaming system: Asrock Z370 Pro4, i7-8086k stock, Noctua D15, Corsair Vengeance LPX RGB 3000 2x8GB, Gigabyte RTX 2070, Fractal Edison 550W PSU, Corsair 600C, Optane 900p 280GB, Crucial MX200 1TB, Sandisk 960GB, Acer Predator XB241YU 1440p 144Hz G-sync

Ryzen rig: Asrock B450 ITX, R5 3600, Noctua D9L, G.SKill TridentZ 3000C14 2x8GB, Gigabyte RTX 2070, Corsair CX450M, NZXT Manta, WD Green 240GB SSD, LG OLED55B9PLA

VR rig: Asus Z170I Pro Gaming, i7-6700T stock, Scythe Kozuti, Kingston Hyper-X 2666 2x8GB, Zotac 1070 FE, Corsair CX450M, Silverstone SG13, Samsung PM951 256GB, Crucial BX500 1TB, HTC Vive

Gaming laptop: Asus FX503VD, i5-7300HQ, 2x8GB DDR4, GTX 1050, Sandisk 256GB + 480GB SSD

Total CPU heating: i7-8086k, i3-8350k, i7-7920X, 2x i7-6700k, i7-6700T, i5-6600k, i3-6100, i7-5930k, i7-5820k, i7-5775C, i5-5675C, 2x i7-4590, i5-4570S, 2x i3-4150T, E5-2683v3, 2x E5-2650, E5-2667, R7 3700X, R5 3600, R5 2600, R7 1700

15 hours ago, porina said:

The wider question is: why hasn't it been implemented before now? Unless I missed it, it isn't offered by AMD; Intel have talked about it, but only for their next gen. Now Nvidia has implemented it on Turing.

 

It seems like a simple operation: duplicate pixels into 2x2 (or bigger) blocks. Maybe it isn't as simple when you dig into it. Maybe Nvidia, and Intel, want to differentiate their latest from previous generations? Maybe they need to implement it differently for past generations, and have only had time to do it on Turing so far?

I've always been confused as to why monitors didn't do integer scaling when the input resolution divides evenly into the panel resolution. It would have solved the blurry upscaling issues people have been moaning about since day one, at least at certain resolutions.

I was always told "it's not that simple", though never given a good explanation of exactly why.
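As a quick illustration of the "at least at certain resolutions" caveat, assuming a hypothetical 3840x2160 panel (my example, not anything from the thread):

```python
# Which common input resolutions fit an integer number of times into an
# assumed 3840x2160 panel?
panel_w, panel_h = 3840, 2160
inputs = [(1280, 720), (1920, 1080), (1600, 900), (2560, 1440)]

for w, h in inputs:
    if panel_w % w == 0 and panel_h % h == 0 and panel_w // w == panel_h // h:
        print(f"{w}x{h}: clean integer scale x{panel_w // w}")
    else:
        print(f"{w}x{h}: no clean integer fit, would need interpolation")
```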


Router: i5-7200U appliance running pfSense.
ISP: Zen Unlimited Fibre 2 (66Mbit) + Plusnet Unlimited Fibre Extra. (56Mbit)

