
Nvidia 436.02 drivers draw inspiration from AMD, Intel

porina
16 hours ago, Princess Luna said:

What happened to AI Tensor Core powered DLSS being the holy grail? Now they are implementing a sharpening filter just like AMD lol

The two do opposite things. DLSS softens jagged edges. Sharpening makes them worse (as jagged edges are sharp). You would think that the two would cancel, but using them at the same time would produce something not that great, although perhaps it would appeal to some people. I have never really liked sharpened images though. You see halos and other artifacts. Plus, I don’t like having more prominent edges.


15 hours ago, Alex Atkin UK said:

My point from the start was that comparing a fairly simple (compared to DLSS)  sharpening filter to DLSS is flawed, at least if you are going to assume one is always better than the other.

It's a visual comparison in an upscaling environment. They are doing what DLSS was supposed to do: give more performance while running a game at a given resolution.

 

So: 1800p upscaled with sharpening, which roughly gives the performance DLSS does at 4K.


16 hours ago, Alex Atkin UK said:

Except that is EXACTLY what DLSS is supposed to be doing. It doesn't matter if some YouTuber compares the two and thinks one is better despite it being a completely different thing. It's their opinion and doesn't necessarily reflect the intent of the function.

Yes, it's useful to know if one thing might perceptually look better despite the fact that it probably shouldn't, but it shouldn't be your only consideration.

Your mileage will vary from game to game, monitor to monitor, eyeballs to eyeballs.

I felt that his video completely missed the point of DLSS. It is a supersampling technique that adds detail to images in a way that allows performance gains from reducing the render resolution. You also get less prominent edges from it.

 

Comparing DLSS to sharpening at the same render resolution misses the point in two different ways. Supersampling makes edges less prominent. Sharpening makes them more prominent (and adds artifacts). Which you prefer is largely a matter of taste, although when watching his video, I saw artifacts from the sharpening algorithm in the image while he was talking about how great it was, and I asked myself whether we had seen the same images (it is possible that compression artifacts caused it). He did admit toward the end of his video, as a side note, that using DLSS to upscale (as it was intended to be used) produced better quality results (which was the point). :/


15 minutes ago, GoldenLag said:

It's a visual comparison in an upscaling environment. They are doing what DLSS was supposed to do: give more performance while running a game at a given resolution.

 

So: 1800p upscaled with sharpening, which roughly gives the performance DLSS does at 4K.

Honestly, neither is very good compared to rendering at the display resolution and getting higher resolution displays. Whether you do sharpening or some kind of supersampling is a matter of taste. I never liked the halo effect and artifacts added by sharpening though. I consider the idea to be an old gimmick from TV; if sharpening had been a good idea, videos would have been broadcast presharpened. Similarly, if there were a benefit, game developers would have already used the effect.


1 minute ago, ryao said:

Honestly, neither is very good compared to rendering at the display resolution and getting higher resolution displays. Whether you do sharpening or some kind of supersampling is a matter of taste. I never liked the halo effect and artifacts added by sharpening though. I consider the idea to be a gimmick.

I mean, DLSS didn't compare favorably to rendering at 1800p with no sharpening.

 

At that point, any sharpening may be a bonus.


2 hours ago, Ryan_Vickers said:

My understanding of how AMD's anti-lag system works is that it dynamically learns how long it takes to render a frame. If it knows that your next monitor refresh is in 16 ms and it takes only 6 ms to render the frame, then rather than starting immediately the way it normally would, it waits 10 ms, allowing new things to happen in the game (like the user making an input), then renders and delivers the finished frame at the last second. When you finally see it, it's only 6 ms old instead of 16, effectively shaving 10 ms off the total input lag.
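To make the idea concrete, here is a minimal sketch of that scheduling logic in Python (hypothetical names and structure, not AMD's actual driver code):

```python
import time

class LateFrameScheduler:
    """Minimal sketch of the idea described above (hypothetical, not
    AMD's actual driver code): learn how long frames take to render,
    then start each frame as late as possible so the input it samples
    is as fresh as possible when the frame is displayed."""

    def __init__(self, refresh_interval, margin=0.002):
        self.refresh_interval = refresh_interval  # e.g. 1/60 for 60 Hz
        self.render_estimate = refresh_interval   # start pessimistic
        self.margin = margin                      # slack for load spikes

    def run_frame(self, sample_input, render):
        # Sleep through the slack so input is sampled at the last moment:
        # refresh in 16 ms, render takes ~6 ms -> wait ~10 ms first.
        slack = self.refresh_interval - self.render_estimate - self.margin
        if slack > 0:
            time.sleep(slack)
        start = time.perf_counter()
        frame = render(sample_input())            # fresh input, late render
        elapsed = time.perf_counter() - start
        # Exponential moving average so the estimate tracks changing load.
        self.render_estimate = 0.9 * self.render_estimate + 0.1 * elapsed
        return frame
```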

 

I actually had the idea to do this myself once not too long ago, but dismissed it as impractical and ineffective. I assume they did some testing and found otherwise, but to be honest, I don't actually know. I've yet to see any third-party reviews or tests of it, so I'm not sure we can actually say with any confidence whether AMD's system or Nvidia's system is better, or whether either is actually any good at all. If anyone does have info, please let us know xD

 

One of the reasons I suspected it would not work well is that the time to render a frame is not constant - the demand of a game changes from minute to minute, second to second, and even frame to frame. If you intentionally wait, anticipating you'll have enough time, only to hit a sudden increase in load that causes you to run out of time when you otherwise would not have, that's going to result in stutter and lower frame rates that otherwise would not have been an issue. Another reason I wonder about the usefulness is that if you are able to render well above your target display rate (say, 200+ FPS on a 60 Hz monitor), the game doesn't just stop and wait for the screen to refresh after rendering that first frame - that's how vsync works, more or less. Instead, it starts rendering a new frame, and if it finishes in time, you get that newer one instead; if not, at least it has the previous one to fall back on. This means you're already seeing "recent" information, and this system isn't likely to improve on that significantly.
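In toy Python, that uncapped behavior looks something like this loop (a sketch of "keep rendering, present the newest finished frame", similar in spirit to a mailbox present mode; not any real graphics API):

```python
import time

def uncapped_loop(render, present, refresh_interval=1/60):
    """Toy sketch: render continuously and, at each refresh, present the
    newest finished frame. Frames that complete between refreshes simply
    replace the previous one instead of being queued."""
    latest = render()                      # always have a fallback frame
    next_refresh = time.perf_counter() + refresh_interval
    while True:
        if time.perf_counter() >= next_refresh:
            present(latest)                # newest complete frame wins
            next_refresh += refresh_interval
        latest = render()                  # keep rendering regardless
```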

 

When it comes to stuff like this being an advantage in FPS games, there is still an awful lot we don't know. We know there is a difference between 60 Hz and 240 Hz monitors, and we know there is an improvement to visual effects with higher FPS. We also know from many studies of the human visual processing system that frame rates above about 80 (as a moving picture) are not individually discernible. We also know that even elite athletes can't react faster than about 150 ms (when they do, it is due to anticipation, not reaction), which means that, assuming a human can benefit from 200 FPS, about 30 frames would be drawn in the time we are calculating the required reaction. Therefore I don't see that reducing lag by adjusting frames by 10 ms is likely to have a large effect globally. The same goes at 100 FPS: the quickest reaction you can expect means that about 15 frames would have been drawn and displayed while we are physically unable to do anything with them.
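To check the arithmetic on those frame counts:

```python
reaction_time_s = 0.150   # ~150 ms, the elite reaction time above
for fps in (200, 100):
    frames = reaction_time_s * fps
    print(f"At {fps} FPS, ~{frames:.0f} frames are drawn during one reaction")
# At 200 FPS, ~30 frames are drawn during one reaction
# At 100 FPS, ~15 frames are drawn during one reaction
```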

 

Which means that if there is an improvement from reducing lag by 10 ms, it is more likely to be because those first few frames we see are sufficiently up to date with what the CPU has calculated as happening in the game, and has nothing to do with human reaction, FPS, etc. Which leads to more questions, i.e. what's better, anti-lag systems or more processing power? And is there even a point in going over 100 Hz if lag is under control?

2 minutes ago, mr moose said:

When it comes to stuff like this being an advantage in FPS games, there is still an awful lot we don't know. We know there is a difference between 60 Hz and 240 Hz monitors, and we know there is an improvement to visual effects with higher FPS. We also know from many studies of the human visual processing system that frame rates above about 80 (as a moving picture) are not individually discernible. We also know that even elite athletes can't react faster than about 150 ms (when they do, it is due to anticipation, not reaction), which means that, assuming a human can benefit from 200 FPS, about 30 frames would be drawn in the time we are calculating the required reaction. Therefore I don't see that reducing lag by adjusting frames by 10 ms is likely to have a large effect globally. The same goes at 100 FPS: the quickest reaction you can expect means that about 15 frames would have been drawn and displayed while we are physically unable to do anything with them.

 

Which means that if there is an improvement from reducing lag by 10 ms, it is more likely to be because those first few frames we see are sufficiently up to date with what the CPU has calculated as happening in the game, and has nothing to do with human reaction, FPS, etc. Which leads to more questions, i.e. what's better, anti-lag systems or more processing power? And is there even a point in going over 100 Hz if lag is under control?

I've seen people talk about this before (reaction time and such) and never fully understood the logic behind how that fits into input lag. I think there are often some misunderstandings and inappropriate comparisons.

 

For one, despite reaction time being around 250 ms for the average person, I guarantee you that you will be able to tell the difference between a near-zero lag system and a 50 ms lag system. I'm sure if given only the 50 ms lag system that you would get used to it and think it reacts perfectly quickly, but the point stands. There's also a spectrum of impact. Lag may be significant enough to hinder you without being consciously noticeable, so just because you don't feel it doesn't mean you wouldn't benefit from an improvement. Like many things, by the time you notice it, it's already at critical levels of bad - it starts having an effect long before it gets to that point.

 

It's also important to remember that using a computer isn't the same as taking a reaction time test. You're not just reacting to things you see, you're making your own inputs, and so it ends up being more of a constant feedback system, where you make a motion, see the impact, and this then affects how you continue the motion. It's not conscious - you're not processing each of these things separately and sequentially - but it does happen. Think of a PID controller. It's essentially the same idea - any additional latency between readings and action decreases the quality with which you can control the system. If you ever get the opportunity to test out a particularly high latency setup, you can feel this for yourself. Just move the mouse around, or try clicking on things in rapid succession. You'll find yourself either overshooting the target and having to backtrack, or approaching it more slowly than you otherwise could to try and avoid this.
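A toy simulation makes the point - a simple proportional controller whose feedback arrives a few ticks late (a hypothetical stand-in for display lag, not a model of any real input stack):

```python
from collections import deque

def settle_steps(delay_ticks, gain=0.5, target=100.0, tol=0.5):
    """How many ticks a proportional controller needs to settle on the
    target when its feedback arrives delay_ticks late. Returns None if
    it never settles."""
    pos = 0.0
    seen = deque([pos] * (delay_ticks + 1), maxlen=delay_ticks + 1)
    for tick in range(1, 1000):
        stale = seen[0]                  # what we *see* is old news
        pos += gain * (target - stale)   # correct based on stale info
        seen.append(pos)
        if all(abs(s - target) < tol for s in seen):
            return tick
    return None

for lag in (0, 2, 5):
    steps = settle_steps(lag)
    result = f"settles in {steps} ticks" if steps else "oscillates, never settles"
    print(f"feedback {lag} ticks late -> {result}")
```

With zero lag it settles almost immediately; with a little lag it overshoots and takes far longer; with enough lag it oscillates and never settles - the same overshoot-and-backtrack feel described above.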

 

Finally, it's a simple case of every little bit helps.  If we can shave off 10 ms here, 10 ms there, before you know it they all add up to quite a lot.


Thanks AMD


1 hour ago, Ryan_Vickers said:

Finally, it's a simple case of every little bit helps.  If we can shave off 10 ms here, 10 ms there, before you know it they all add up to quite a lot.

I might be looking at it simplistically, but 10 ms is 10 ms. Assume you have two people who are otherwise identical, and one is given a system that delivers information 10 ms after the other's. They will, on average, be 10 ms behind. There's probably some spread in reaction times, but that 10 ms will be an advantage in getting the shot out before being shot at. It doesn't matter if you can perceive, say, 144 fps vs 240 fps; it's how soon you see it that helps, however small the difference.

 

BTW, do these latency reduction techniques still apply to G-Sync/FreeSync displays, or only to fixed refresh? I thought one of the selling points of variable refresh was to show the complete frame as soon as it was available; unless there is some excessive buffering going on, I can't see how you can get better than that.


4 hours ago, ryao said:

The two do opposite things. DLSS softens jagged edges. Sharpening makes them worse (as jagged edges are sharp). You would think that the two would cancel, but using them at the same time would produce something not that great, although perhaps it would appeal to some people. I have never really liked sharpened images though. You see halos and other artifacts. Plus, I don’t like having more prominent edges.

That's not the case when you apply sharpening the way AMD does it. AMD applies the sharpening effect to surfaces, not screen-wide as a post-process effect.


On 8/20/2019 at 11:39 AM, porina said:

Integer scaling

YES
YES

YES
YES
YESSSSSSSSSS!

 

This should have been an option 10 years ago, and it's nice to see that AMD and NVIDIA are actually getting to it. 


1 minute ago, Syntaxvgm said:

YES
YES

YES
YES
YESSSSSSSSSS!

 

This should have been an option 10 years ago, and it's nice to see that AMD and NVIDIA are actually getting to it. 

If your sig is up to date, I have to burst your bubble. Turing only.


2 minutes ago, porina said:

If your sig is up to date, I have to burst your bubble. Turing only.

I don't care. Before, there was NO option for this from AMD or NVIDIA. It's stupid because it's an artificial limitation, yes. I am actually able to force this in the Linux drivers.

Since my first 1440p screen, I have waited for fucking integer scaling.

And no, it's not up to date, but the 1080 is still my newest GPU. 


8 hours ago, porina said:

BTW, do these latency reduction techniques still apply to G-Sync/FreeSync displays, or only to fixed refresh? I thought one of the selling points of variable refresh was to show the complete frame as soon as it was available; unless there is some excessive buffering going on, I can't see how you can get better than that.

I don't know for certain, but if my understanding is correct, they would not apply on a VRR display, since the idea behind those, as you say, is to render as fast as you can and then display each frame as soon as it's ready.


I tried turning on ultra-low latency and played some Battlefield, then I turned it off and played again. I cannot tell the difference. It would make for an interesting LTT video: can anyone tell the difference in a blind test, for Nvidia and AMD?

 

I did have G-Sync on, if that makes any difference.


11 hours ago, Ryan_Vickers said:

I've seen people talk about this before (reaction time and such) and never fully understood the logic behind how that fits into input lag. I think there are often some misunderstandings and inappropriate comparisons.

 

For one, despite reaction time being around 250 ms for the average person, I guarantee you that you will be able to tell the difference between a near-zero lag system and a 50 ms lag system. I'm sure if given only the 50 ms lag system that you would get used to it and think it reacts perfectly quickly, but the point stands. There's also a spectrum of impact. Lag may be significant enough to hinder you without being consciously noticeable, so just because you don't feel it doesn't mean you wouldn't benefit from an improvement. Like many things, by the time you notice it, it's already at critical levels of bad - it starts having an effect long before it gets to that point.

 

It's also important to remember that using a computer isn't the same as taking a reaction time test. You're not just reacting to things you see, you're making your own inputs, and so it ends up being more of a constant feedback system, where you make a motion, see the impact, and this then affects how you continue the motion. It's not conscious - you're not processing each of these things separately and sequentially - but it does happen. Think of a PID controller. It's essentially the same idea - any additional latency between readings and action decreases the quality with which you can control the system. If you ever get the opportunity to test out a particularly high latency setup, you can feel this for yourself. Just move the mouse around, or try clicking on things in rapid succession. You'll find yourself either overshooting the target and having to backtrack, or approaching it more slowly than you otherwise could to try and avoid this.

 

Finally, it's a simple case of every little bit helps.  If we can shave off 10 ms here, 10 ms there, before you know it they all add up to quite a lot.

I was using the faster 150 ms reaction time because it errs on the side of benefit for those who think they are elite gamer athletes. Even with a slower reaction time, at best with lag at the start you are only affecting the outcome by a factor of less than 1/30th. At those times it would be very difficult to measure, as the results would all appear to be within the margin of error.

 

It's easy to say every little bit helps, but why does it help? I've always argued that faster computers mean better results in games, and I have always maintained that this is due to lower lag times between input and feedback. However, at what point does that lag time become irrelevant to a limited human? And as I asked earlier, if the lag time savings from this method are the real benefit, then does that mean striving for more than 150 FPS is pointless once you get lag time to a reasonable level?

1 minute ago, mr moose said:

I was using the faster 150 ms reaction time because it errs on the side of benefit for those who think they are elite gamer athletes. Even with a slower reaction time, at best with lag at the start you are only affecting the outcome by a factor of less than 1/30th. At those times it would be very difficult to measure, as the results would all appear to be within the margin of error.

I bring up all that because, again, there's more to gaming (or anything on a PC) than just reactions. There are many situations that completely eliminate any effect of reaction time and make it visible in a "flattened" manner. Here are two examples:

You are not reacting - you simply choose to make an input of your own volition.  You can simultaneously observe (either literally or through feel) your mouse and your cursor (or crosshair, etc.) moving across the screen, starting and stopping at the same moment, or not.

Guitar Hero. This game is notoriously bad on laggy displays, even if the lag is only 2 frames at 60 fps. A good player will hit the notes almost always exactly on the hit line. This only works if the visuals, audio, and game are all in sync (no lag). If there is considerable display latency, as is often the case with modern TVs, it goes like this: you see the note coming and you strum in the correct position. This signal is sent to the game. However, what you're seeing is already two frames old, so the game has already rendered and sent the frames of the note slipping past the line, and those show up on your TV before you finally see the frame that indicates you hit the note. To a skilled player this can be quite distracting, as they will notice it, interpret it as being behind, and subconsciously want to adjust their playing slightly, when in fact they do not need to.

1 minute ago, mr moose said:

It's easy to say every little bit helps, but why does it help? I've always argued that faster computers mean better results in games, and I have always maintained that this is due to lower lag times between input and feedback. However, at what point does that lag time become irrelevant to a limited human? And as I asked earlier, if the lag time savings from this method are the real benefit, then does that mean striving for more than 150 FPS is pointless once you get lag time to a reasonable level?

I think input lag and framerate ideals are separate things, although they definitely impact each other.  I don't know how much is (in the case of frame rate) too much, or (in the case of lag) sufficiently little.

 

I've heard that for the majority of people, framerates above the high 200s become indistinguishable - you can no longer perceive any benefit to going higher. However, I also know that studies have been done that show people can perceive and even retain information from an image flashed for as little as 1/600 of a second. Then there's the flattening example again. Consider moving your mouse very quickly so the cursor shoots across your screen. Did you see it move, continuously, as a solid object? No, it jumped across, existing in only a few separate places, seemingly teleporting. This is reduced at higher framerates, but to truly eliminate it, you would need tens of thousands of FPS.

 

As for input lag, the easiest way to figure out how little is little enough would be to get a bunch of pros and just test them with different setups.  I suspect that it would reveal improvements right up to even the best systems available today, implying that additional reductions would also help.  Only when we have systems with near 0 ms of lag from input to output can we test to the limit and observe where, if ever, the performance plateaus.

 

Certainly higher framerates as a method of input lag reduction have been common and effective for a while now, but obviously there's more in the chain and at some point, going beyond 600, 1000, etc. isn't really helping much because it's only like 5% of the total chain.  Improvements will have to come from somewhere else.  But, that's not to say that increases in framerate won't still be useful for other reasons, as mentioned above.


6 minutes ago, Ryan_Vickers said:

I bring up all that because, again, there's more to gaming (or anything on a PC) than just reactions.

I never said reactions were the only thing; in fact, I pointed out and specifically said twice that there is a lot we don't know. Reaction time is just one aspect. I also said this:

 

11 hours ago, mr moose said:

Which means that if there is an improvement from reducing lag by 10 ms, it is more likely to be because those first few frames we see are sufficiently up to date with what the CPU has calculated as happening in the game, and has nothing to do with human reaction, FPS, etc. Which leads to more questions, i.e. what's better, anti-lag systems or more processing power? And is there even a point in going over 100 Hz if lag is under control?

As I was commenting on the concept as a whole and not trying to narrow it down to one specific reason.

 

6 minutes ago, Ryan_Vickers said:

I've heard that for the majority of people, framerates above the high 200s become indistinguishable - you can no longer perceive any benefit to going higher.  However, I also know that studies have been done that show people can perceive and even retain information from an image flashed for as little as 1/600 of a second. 

 

Those are not frames per second; that is one frame flashed in the same time frame. It is easy to perceive that because of image persistence. Once you put a half dozen similar images before and after the frame in question, no one can identify it. In fact, there is no evidence that people can discern a difference in frame rate after about 80 (as I said in my first post) for motion picture.

 

All the other research showing 500+ FPS relies on very specialized images with hard edges and is not really comparable to games.

 

6 minutes ago, Ryan_Vickers said:

 

Certainly higher framerates as a method of input lag reduction have been common and effective for a while now, but obviously there's more in the chain and at some point, going beyond 600, 1000, etc. isn't really helping much because it's only like 5% of the total chain.  Improvements will have to come from somewhere else.  But, that's not to say that increases in framerate won't still be useful for other reasons, as mentioned above.

 

I think we would need to see some sort of testing for that. I am not satisfied that enough quality data exists to make claims about pushing FPS beyond any particular number just yet. 1000 FPS means one frame every millisecond; that is surely fast enough to start accurately measuring lag time?


14 minutes ago, mr moose said:

I never said reactions were the only thing; in fact, I pointed out and specifically said twice that there is a lot we don't know. Reaction time is just one aspect. I also said this:

As I was commenting on the concept as a whole and not trying to narrow it down to one specific reason.

I know, and I'm thankful for that; I just felt like properly addressing everything I could think of on the topic, because I've seen people before who don't seem to get it. Sorry, I should have made that more clear.

Quote

Those are not frames per second; that is one frame flashed in the same time frame. It is easy to perceive that because of image persistence. Once you put a half dozen similar images before and after the frame in question, no one can identify it.

Yeah, it's just worth pointing out IMO because it shows how there's more to this than just frame rate - the way our eyes factor in, etc.

Quote

In fact, there is no evidence that people can discern a difference in frame rate after about 80 (as I said in my first post) for motion picture.

"Motion picture" - you mean like a movie? That's interesting; I have not heard that before, but I could be convinced to believe it, I guess. However, for gaming that's absolutely not true.

Quote

I think we would need to see some sort of testing for that. I am not satisfied that enough quality data exists to make claims about pushing FPS beyond any particular number just yet. 1000 FPS means one frame every millisecond; that is surely fast enough to start accurately measuring lag time?

Yeah, it really doesn't - there would need to be more tests on what helps, what's enough, etc. What I mean is that everything from your display to your mouse has latency too, and normally we don't think of those, but if the time to render a frame is super low and the screen is showing frames as soon as they're ready, suddenly those components become a larger portion of the total lag. I would expect 1000 fps would be plenty for lag elimination, yes.


The games I play most regularly:

 

Battlefield 1

Stardew Valley

 

Is it worth updating? I usually don't update anything right away, for fear of bugs or intentional nerfing of my hardware (FUCK YOU NVIDIA. GIVE ME BACK MY 10MHZ)


5 minutes ago, Ryan_Vickers said:

"Motion picture" - you mean like a movie?

Like any series of similar images that make up a sequence. Image persistence and visual processing mean that the more frames there are of a similar image, the harder it is for humans to perceive them; they blend into each other.

5 minutes ago, Ryan_Vickers said:

That's interesting; I have not heard that before, but I could be convinced to believe it, I guess. However, for gaming that's absolutely not true.

Yeah, it really doesn't - there would need to be more tests. But yeah, I would expect 1000 fps would be plenty for lag elimination, yes.

 

I am hoping that in the future we see actual research-grade studies done (as opposed to YouTubers and media people).


Has anybody used the scaling or anti-lag features, and do either seem to be working well? In regards to scaling, how big is the performance hit? I'm looking forward to upscaling some games I run at 1600x900 if it works as well as AMD's appeared to.
 

On 8/20/2019 at 12:35 PM, Alex Atkin UK said:

Except that is EXACTLY what DLSS is supposed to be doing. It doesn't matter if some YouTuber compares the two and thinks one is better despite it being a completely different thing. It's their opinion and doesn't necessarily reflect the intent of the function.

 

I don't think it's cool that people are dogpiling on you, but honestly you kinda deserved it. It's not just Hardware Unboxed; IIRC Steve from Gamers Nexus, Wendell from Level1Techs (and/or EposVox), Anthony himself from LTT - I could go on and on listing all the tech YouTubers I know - there wasn't a single one that didn't praise the feature compared to Nvidia's half-baked, performance-tanking DLSS.

Maybe DLSS seemed technically neat, but so was the turbine car, and there is a good reason nobody adopted it.

 

1 hour ago, mr moose said:

Those are not frames per second; that is one frame flashed in the same time frame. It is easy to perceive that because of image persistence. Once you put a half dozen similar images before and after the frame in question, no one can identify it. In fact, there is no evidence that people can discern a difference in frame rate after about 80 (as I said in my first post) for motion picture.


So far as I am aware, when the US Air Force (I believe it was them) did tests about this in the 1980s or so, people could pick out varied flashes at at least about 200-250 Hz, and ghosting blur from flashes (less precisely) at 1000 Hz. I can't source this, but I'm fairly sure the data for that is out there. I suppose that in regards to a perfectly crisp motion picture, 80 fps might be a relevant number, but monitors clearly work differently enough that that same "80 fps" looks far worse on a computer monitor than whatever was used for the study you seem to be referencing.

It would be interesting to compile a list of links to academic studies related to this.

All I know is that, as somebody who's done high-speed sports in my past, I'm always amazed at people who think 75 Hz looks fine, but I guess everybody perceives things a bit differently.


38 minutes ago, mr moose said:

In fact, there is no evidence that people can discern a difference in frame rate after about 80 (as I said in my first post) for motion picture.

My gut feeling is that passive observation and active influence would lead to very different results here. I use a 144 Hz monitor, and just by looking at how the mouse pointer moves you can tell roughly what refresh rate it is set to. Even 144 Hz is not perfectly smooth, and it is easy to see the movement steps between frames. I don't know how much faster it would have to be to appear perfectly smooth.


17 minutes ago, porina said:

My gut feeling is that passive observation and active influence would lead to very different results here. I use a 144 Hz monitor, and just by looking at how the mouse pointer moves you can tell roughly what refresh rate it is set to. Even 144 Hz is not perfectly smooth, and it is easy to see the movement steps between frames. I don't know how much faster it would have to be to appear perfectly smooth.

It's not too hard to calculate. You would need the cursor to move no more than a single pixel each frame, and yet still be able to traverse your monitor at the rate of your mouse (this depends on DPI, the sensor's maximum tracking speed, etc.). Let's estimate you can whip it across in ~30 ms, though. I'm guessing that based on the fact that I can move it such that it appears only in the middle and then on the other side on my 60 Hz monitor. If you have a 1920 x 1080 screen, that would mean it needs to update 1920 times in 30 ms, or in other words, 64,000 fps.
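The back-of-the-envelope math:

```python
width_px = 1920      # horizontal pixels the cursor must traverse
sweep_s = 0.030      # ~30 ms to whip the cursor across (estimate above)
fps_needed = width_px / sweep_s   # one-pixel steps, no positions skipped
print(f"{fps_needed:,.0f} fps")   # -> 64,000 fps
```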

 

This is where the frame rate discussion becomes problematic, because I think we can all agree no one in the history of the world would be able to perceive a monitor at this rate, compare it to one running at, say, 2000 fps, and think the 2000 fps one looks slow. Yet, this simple test could and would reveal the difference to any person. It sort of hijacks and subverts the persistence of vision thing. If an object actually moved across our field of view, that's one thing, but in this example, nothing is actually moving; it's just appearing in a different place in rapid succession. Without enough of those appearances to fill in the gap, there's nothing for persistence to retain - the "blur" effect is in it fading in and out, not actually moving across.


13 hours ago, Ryan_Vickers said:

You'll find yourself either overshooting the target and having to backtrack, or approaching it more slowly than you otherwise could to try and avoid this.

This is me using any of the laptops at work compared to using my desktop at home.

