
Does anti-aliasing add latency (input lag)?

Solved by thekingofmonks:

MSAA samples each pixel multiple times and blends the results, which does add steps to the render pipeline, but the added time is so insignificant that it's not even noticeable.

I want to know if anti-aliasing adds latency on its own.

 

Assume the same frame rate; the only difference is anti-aliasing on or off.

There are different anti-aliasing methods like FXAA, MSAA, and TAA, plus driver-level anti-aliasing that can override the game's own anti-aliasing.


14 minutes ago, GozenUnknown said:

I want to know if anti-aliasing adds latency on its own.

 

Assume the same frame rate; the only difference is anti-aliasing on or off.

There are different anti-aliasing methods like FXAA, MSAA, and TAA, plus driver-level anti-aliasing that can override the game's own anti-aliasing.

It shouldn't. Obviously, if you have a weak GPU and crank anti-aliasing up to something like MSAA x8, there will be a performance hit.


I see.

 

In CS2, they say MSAA x8 anti-aliasing adds +0.89 ms of input delay, but they don't clarify whether it's running at the same frame rate.

 

CS2 is a CPU-limited game; settings like shadows and anti-aliasing will increase CPU usage, which means a lower frame rate.

https://csgo.com/news/96030-input-lag-impact-of-cs2-graphics-settings-discovered#:~:text=Input lag increases by nearly,have significant effect on lags.
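
To put that in numbers (a rough sketch with made-up frame rates, not figures from the article): if a setting lowers the frame rate, the render-side latency cost is roughly the difference in frame time.

```python
# Hedged sketch: converting a frame rate drop into added frame time.
# The 400 -> 300 fps drop below is hypothetical, chosen only to show
# how a figure in the ~0.9 ms range could arise from fps alone.

def frame_time_ms(fps: float) -> float:
    """Frame time in milliseconds for a given frame rate."""
    return 1000.0 / fps

before = frame_time_ms(400)  # 2.50 ms per frame
after = frame_time_ms(300)   # 3.33 ms per frame
print(f"added frame time: {after - before:.2f} ms")  # ~0.83 ms
```

So a small fps drop alone can produce a sub-millisecond "input delay" figure, which is why the frame rate matters when quoting numbers like +0.89 ms.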

 

For example, they show this without showing that the frame rate was the same:

[Image: chart from the article showing the input lag impact of each CS2 graphics setting]


MSAA samples each pixel multiple times and blends the results, which does add steps to the render pipeline, but the added time is so insignificant that it's not even noticeable.
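
To make that concrete, here's a toy sketch (my illustration, not how a real GPU pipeline is written) of what an MSAA resolve conceptually does: average several coverage samples per pixel into one output color.

```python
# Toy MSAA-style resolve: each pixel stores several sub-pixel samples,
# and the resolve step averages them into the final pixel color.

def resolve_pixel(samples: list[tuple[float, float, float]]) -> tuple[float, float, float]:
    """Average sub-pixel RGB samples into one output color."""
    n = len(samples)
    return (
        sum(s[0] for s in samples) / n,
        sum(s[1] for s in samples) / n,
        sum(s[2] for s in samples) / n,
    )

# 4x MSAA on an edge pixel: two samples hit red geometry, two hit white background.
edge_pixel = [(1.0, 0.0, 0.0), (1.0, 0.0, 0.0), (1.0, 1.0, 1.0), (1.0, 1.0, 1.0)]
print(resolve_pixel(edge_pixel))  # (1.0, 0.5, 0.5) -> smoothed edge color
```

The averaging itself is trivial; the real cost is rasterizing and storing the extra samples, which is why the added time per frame stays small.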



It only increases input lag in the sense that it reduces fps, because it's more demanding on your GPU. Lower fps = more input lag.

If you're CPU-limited at very high fps and enabling AA doesn't decrease your fps, there won't be any additional input lag.



Yes, everything adds lag; it's just usually so little for AA that people don't notice or simply don't care...

 

If you add everything up, most games (yes, they even have internal "engine lag", btw) have at least 60 ms of input lag; most have more like 100-130 ms. Then you have monitor lag and controller input lag, so you're typically looking at at least 100-200 ms of "lag". Then you also have network lag (of course), which adds another 15-350 ms.
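
A rough back-of-the-envelope of that chain (illustrative values picked from the ranges above, not measurements):

```python
# Hedged sketch: summing the latency chain with example values from the
# post above, to show how small the AA contribution is in proportion.

latency_chain_ms = {
    "game engine":   60.0,   # "at least 60 ms", often 100-130
    "display":       10.0,   # monitor processing + pixel response (assumed)
    "input device":   4.0,   # e.g. average for a 125 Hz polling mouse
    "anti-aliasing":  0.89,  # the CS2 MSAA x8 figure under discussion
}

total = sum(latency_chain_ms.values())
aa_share = latency_chain_ms["anti-aliasing"] / total * 100
print(f"total: {total:.1f} ms, AA share: {aa_share:.1f}%")  # AA is ~1% of the chain
```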

 

No one really cares about "0.xx" additional milliseconds for a basic feature that makes your game playable in the first place.

 

But yes, it undoubtedly, unequivocally does. Always.

Hope this helps! 🙂



Trying to understand possible mechanisms for what is reported here.

 

The tester says they limited to 300 fps; it doesn't matter where the cap is applied. That works out to a 3.33 ms frame time. End-to-end latency would have to include input, render, display, and everything in between. Render time must stay below the frame time, so a frame can finish earlier or later and be displayed accordingly; then the GPU just waits idle due to the frame cap.

 

Basically, if there weren't a frame cap, there would be a corresponding difference in fps. Does that make sense?
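
A sketch of that reasoning (my formulation of it, with assumed render times): under a frame cap, extra GPU work is invisible until render time exceeds the cap's frame budget.

```python
# Hedged sketch: frame pacing under a frame rate cap. Render times below
# are assumed values for illustration.

def effective_frame_time_ms(render_ms: float, cap_fps: float) -> float:
    """A frame ships no faster than the cap's budget allows."""
    budget = 1000.0 / cap_fps
    return max(render_ms, budget)

print(effective_frame_time_ms(2.0, 300))         # 3.33 -> cap hides the render time
print(effective_frame_time_ms(2.0 + 0.89, 300))  # 3.33 -> +0.89 ms of AA still hidden
print(effective_frame_time_ms(3.0 + 0.89, 300))  # 3.89 -> now it shows as lost fps
```

So with a 300 fps cap, extra render work shouldn't show up in frame time until it exceeds the 3.33 ms budget.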



1 hour ago, porina said:

Trying to understand possible mechanisms for what is reported here.

 

The tester says they limited to 300 fps; it doesn't matter where the cap is applied. That works out to a 3.33 ms frame time. End-to-end latency would have to include input, render, display, and everything in between. Render time must stay below the frame time, so a frame can finish earlier or later and be displayed accordingly; then the GPU just waits idle due to the frame cap.

 

Basically, if there weren't a frame cap, there would be a corresponding difference in fps. Does that make sense?

Actually, this is really annoying, because I *know* where this nonsensical myth comes from: "frame time" isn't "input lag". It adds to input lag, yes, but it's only a very small part of it...

 

Some games actually have official figures for their *internal* input lag, which are, funnily enough, extremely hard to find... For example, Tekken used to boast about having the least internal input lag (Tekken 7 has 63 ms, iirc), and that was actually confirmed too; CS:GO was in second place with something like 70 ms (iirc).

 

But as I said above, that's only *internal* input lag; there are a gazillion other factors that add to overall input lag, not least network lag... So people saying they can "feel" 0.3 ms is just a severe case of the placebo effect, and I wouldn't trust such people with anything, given their obvious lack of self-awareness and common sense (sorry, not sorry).

 

PS: this is all I can find right now; oddly, it's missing Tekken and CS:GO (??), but as I said, there are several sources for that information, sometimes even the developers themselves. The point is, input lag varies widely between games, and "frame times" don't really have much to do with it. That's why they're called frame times and not input lag (not saying it's not a factor, but it's very minor and not deciding at all, as explained above).

 

[Image: Digital Foundry chart of end-to-end input latency across several games]

 

(Source: DF, whom I don't trust at all, but you've gotta take what you can, eh, and it's "reputable" 🙄)

 

 

 

 



1 hour ago, porina said:

Trying to understand possible mechanisms for what is reported here.

How this stuff is measured is indeed a big part of the question. I doubt most of this is tested scientifically in any way; most people don't even know what a lag tester is, and that includes the people who "test" this stuff. They're influencers, not scientists, after all.

 

PS: here's a quote from the source (Reddit 🤷‍♀️) of what the OP posted above:

 

Quote

Input lag measured with Nvidia FrameView, and validated with an external end to end latency tool.

 

"end to end latency tool" yeah, they got nothing really. 

 



11 minutes ago, Mark Kaine said:

Actually, this is really annoying, because I *know* where this nonsensical myth comes from: "frame time" isn't "input lag". It adds to input lag, yes, but it's only a very small part of it...

Frame time is usually interesting because it can be measured without external hardware, whereas end-to-end latency does need that hardware, significantly increasing complexity. For more GPU-bound gaming, frame time can be useful.

 

What initially threw me off in this specific case was the frame rate limit, which essentially throws out frame time as a useful measure. PresentMon's GPU Busy time is probably more interesting here. If the frame rate cap weren't present, GPU Busy and frame time should converge to essentially the same value.
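
If someone wants to check this, a sketch along these lines could compare the two metrics from a PresentMon capture. The column names here ("MsBetweenPresents", "MsGPUActive") are assumptions about the CSV layout; check the actual header of your capture before trusting them.

```python
# Hedged sketch: average frame time vs average GPU-busy time from a
# hypothetical PresentMon CSV export. Column names are assumptions.
import csv

def averages(path: str) -> tuple[float, float]:
    frame_times, gpu_busy = [], []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            frame_times.append(float(row["MsBetweenPresents"]))
            gpu_busy.append(float(row["MsGPUActive"]))
    return (sum(frame_times) / len(frame_times),
            sum(gpu_busy) / len(gpu_busy))

# With a 300 fps cap, average frame time should sit near 3.33 ms whatever
# the settings, while GPU-busy time should move when AA is toggled.
# ft, gb = averages("presentmon_capture.csv")  # hypothetical file
```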

 

11 minutes ago, Mark Kaine said:

(Source: DF, whom I don't trust at all, but you've gotta take what you can, eh, and it's "reputable" 🙄)

I feel they're in the top tier of gaming/PC channels at the moment. They're not perfect by any means, but they're less prone to nonsense conclusions. Alex is great, but John needs a new pair of glasses. Thankfully, Alex does a lot of the PC stuff and John more the console side.

 

8 minutes ago, Mark Kaine said:

How this stuff is measured is indeed a big part of the question. I doubt most of this is tested scientifically in any way; most people don't even know what a lag tester is, and that includes the people who "test" this stuff. They're influencers, not scientists, after all.

If you click through, it says they used an average of 3 measurements. I have no idea if that is enough or not; there may be some extra variance from that.



2 hours ago, porina said:

Frame time is usually interesting because it can be measured without external hardware, whereas end-to-end latency does need that hardware, significantly increasing complexity. For more GPU-bound gaming, frame time can be useful.

I mean, yes, but as said, it's only a very small part of the equation; I don't think people understand that. I also think it's kinda not very interesting because it's directly tied to frame rate, so yeah, it's there and can be measured, but that's about it.

 

2 hours ago, porina said:

I feel they're in the top tier of gaming/PC channels at the moment

Yeah, but just because they're popular doesn't mean they're good, or reliable; they're still influencers first and foremost, so take everything they say with a huge grain of salt (which doesn't mean they can't be right sometimes).

 

2 hours ago, porina said:

If you click through, it says they used an average of 3 measurements. I have no idea if that is enough or not; there may be some extra variance

Look, I don't think they're faking it, just that they have no real scientific approach, so the results are whatever to me, honestly (may be useful, may be not).

 

Also, again, this is only about a tiny fraction of what input lag consists of. I just think people take an absurd interest in these numbers that ultimately don't mean much, e.g. 0.9 ms out of ~300 (realistically). Sure, it adds up, but overall it's still small, and things like server lag are almost entirely out of the user's control.

 

So, interesting? Yes, I mean, sure (even if the numbers are estimates rather than facts). But significant? Not really. V-sync/G-sync are the biggest offenders; just turn those off and you're good, tbh.



3 hours ago, Mark Kaine said:

I also think it's kinda not very interesting because it's directly tied to frame rate, so yeah, it's there and can be measured, but that's about it.

In a GPU-constrained scenario, which is the most common, more fps = lower frame time = lower latency. It's not difficult.

 

3 hours ago, Mark Kaine said:

Yeah, but just because they're popular doesn't mean they're good, or reliable; they're still influencers first and foremost, so take everything they say with a huge grain of salt (which doesn't mean they can't be right sometimes).

I never mentioned popularity; I only said I consider them relatively sensible. I have no idea how their sub count compares to other tech channels. I judge them by their content, and by my standards, they're good.

 

3 hours ago, Mark Kaine said:

Look, I don't think they're faking it, just that they have no real scientific approach, so the results are whatever to me, honestly (may be useful, may be not).

You could say that about >99% of the testing out there. No one has infinite time to test every variable and double-check it. At some point, you have to call it good enough and move on, take the risk that comes with it, and if a user particularly cares about a certain case, they can try to dig deeper. If you want a scientific-paper-grade, peer-reviewed result, you might get it long after the product is relevant.

 

To me, if you were to say results are good to within 3% of the true value, that's plenty good enough. I wouldn't get meaningfully more value if they reduced that to 0.1%.

 

3 hours ago, Mark Kaine said:

Also, again, this is only about a tiny fraction of what input lag consists of. I just think people take an absurd interest in these numbers that ultimately don't mean much, e.g. 0.9 ms out of ~300 (realistically). Sure, it adds up, but overall it's still small, and things like server lag are almost entirely out of the user's control.

If you have 300 ms of end-to-end latency, you have big problems. You can literally measure that with a stopwatch; no special hardware needed. You probably won't hit it unless you're running <30 fps with V-sync on.

 

I take the view that if you want to take control of things, you can still improve what you can. Don't stress about things outside your control.

 

3 hours ago, Mark Kaine said:

V-sync/G-sync are the biggest offenders; just turn those off and you're good, tbh.

G-Sync is the best thing that has happened to latency, for me (and by association, VRR in general). I can't stand tearing, so V-sync off is not an option. V-sync on without VRR does adversely impact latency, especially at low frame rates.



On 3/1/2024 at 12:27 AM, porina said:

more fps = lower frame time = lower latency. It's not difficult.

It apparently is, and it's just a fraction of overall latency; hence, if someone tells me they can "feel" the difference between 200 ms and 201 ms, all this does is make me question their sanity, because literally no one can do that.

 

On 3/1/2024 at 12:27 AM, porina said:

If you have 300 ms of end-to-end latency, you have big problems.

As I have elaborated, with data from "DF" even, 200-300 ms is absolutely the ballpark you have to deal with in *most* games. Yes, there are some with less, as mentioned, but between, say, 250 and 90 ms, that *is* a difference that's noticeable to most people, yes.

 

But the overall observation, that something like "0.9 ms" doesn't matter in the grand scheme at all, is, I think, still very reasonable. Then you have people who actually use V-sync or G-sync, yet say they care about overall latency; it just doesn't make sense. V-sync/G-sync is so much worse than, for example, AA. AA has literally almost no influence on latency; it's not noticeable, while V-sync/G-sync very much is, to the point of being borderline unplayable in some situations.

 

 



38 minutes ago, Mark Kaine said:

As I have elaborated, with data from "DF" even, 200-300 ms is absolutely the ballpark you have to deal with in *most* games. Yes, there are some with less, as mentioned, but between, say, 250 and 90 ms, that *is* a difference that's noticeable to most people, yes.

It's not "most" games. There wasn't a single one at 300 ms in the list you yourself gave earlier. There was a single "up to" 200 ms, and the vast majority were at 133 ms or lower.

 

38 minutes ago, Mark Kaine said:

Then you have people who actually use V-sync or G-sync, yet say they care about overall latency; it just doesn't make sense. V-sync/G-sync is so much worse than, for example, AA. AA has literally almost no influence on latency; it's not noticeable, while V-sync/G-sync very much is, to the point of being borderline unplayable in some situations.

V-sync can have bad latency, especially at lower fps. G-Sync, do you even know what it is? G-Sync (or other VRR tech) is essential for me in gaming. It reduces latency significantly compared to the V-sync case while still preventing tearing. I don't know if it's as good as the V-sync-off case, but it'll be damn close, without the negatives.



35 minutes ago, porina said:

It's not "most" games. There wasn't a single one at 300 ms in the list you yourself gave earlier. There was a single "up to" 200 ms, and the vast majority were at 133 ms or lower.

It really seems you are reading very selectively and conveniently ignoring things I said... You also seem to be unaware that those are "the best" games; there are much worse. Additionally, it doesn't matter if it's 200 or 150 ms; the point is that's something people simply often do not account for when doing their "math". You also conveniently forget network latency, which, as I said, is anywhere between 15 and 300 ms, *factually*. That part isn't even debatable, just plain old well-known physics, yet it's ignored by you. You can't just assume the best-case scenario when discussing these things.

 

I urge you to read what I wrote (again?) and then explain to me precisely why "0.9 ms" for "AA" would matter in any way (which also seems to be a part that you conveniently ignore, even though it's literally the topic).

 

 



23 minutes ago, Mark Kaine said:

It really seems you are reading very selectively and conveniently ignoring things I said

I was blinded by other stuff you wrote that didn't make sense, so I focused on the big things over the small things.

 

23 minutes ago, Mark Kaine said:

... You also seem to be unaware that those are "the best" games; there are much worse.

Where did it say those were the "best"? It looks like a random selection. A link to the original source whenever you use an external reference would be welcome.

 

23 minutes ago, Mark Kaine said:

You also conveniently forget network latency, which, as I said, is anywhere between 15 and 300 ms, *factually*. That part isn't even debatable, just plain old well-known physics, yet it's ignored by you. You can't just assume the best-case scenario when discussing these things.

That's a separate dimension that only applies to online games, and many have mitigations in place. Before you misunderstand: I'm not saying it doesn't matter, or that there is some magic way around physics, but I'd suggest watching the GDC talk on Rocket League about how they largely solved network-latency-related problems in the actual gameplay experience.

 

I think network latency doesn't affect local end-to-end latency anyway. We're talking about pressing a button and seeing something on screen.

 

23 minutes ago, Mark Kaine said:

I urge you to read what I wrote (again?) and then explain to me precisely why "0.9 ms" for "AA" would matter in any way (which also seems to be a part that you conveniently ignore, even though it's literally the topic).

My position was stated earlier. Less graphical computation results in lower frame times and, with appropriate settings, contributes to lower latency. It is up to the user to trade off image quality and performance, however big or small the differences are. Just because a difference is small doesn't mean it is no difference; lots of small improvements can add up to a bigger one. I don't play competitive games, but for those who do, a 1 ms difference still nudges the statistics slightly in your favour.

 

Another example of a small change people make is switching from a basic mouse/keyboard to a gaming one. You could go from a 125 Hz to a 1000 Hz polling rate; the average latency from polling alone drops from 4 ms to 0.5 ms.
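
For anyone wondering where those two numbers come from: on average, an input event waits half a polling interval before the next poll picks it up.

```python
# Average added latency from input polling: half the polling interval.

def avg_polling_latency_ms(polling_hz: float) -> float:
    return (1000.0 / polling_hz) / 2

print(avg_polling_latency_ms(125))   # 4.0 ms
print(avg_polling_latency_ms(1000))  # 0.5 ms
```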


