
Input lag vs framerate: The true competitive edge

mew905

Hello. No idea if LTT checks this out or not, and yes, I saw his Shroud "60Hz vs 144Hz" video, but that mainly focused on the framerate.

It is my firm opinion that it's not the framerate that makes people better players, but the input lag. See, as far as I'm aware, LCDs have an inherent input delay based on their refresh rate: 60Hz gives you ~16ms, 144Hz gives you ~7ms, etc.
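To ground those numbers, here's a quick sketch of the frame time implied by each refresh rate (assuming, as I am here, that the inherent delay is one full refresh period):

```python
# Rough sketch: frame time implied by a display's refresh rate.
# Assumption (mine, not a measurement): the inherent delay equals
# one full refresh period.
REFRESH_RATES_HZ = (60, 144, 240)

for hz in REFRESH_RATES_HZ:
    frame_time_ms = 1000 / hz  # one refresh period in milliseconds
    print(f"{hz:>3} Hz -> ~{frame_time_ms:.1f} ms per frame")

# 60 Hz -> ~16.7 ms, 144 Hz -> ~6.9 ms, 240 Hz -> ~4.2 ms
```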

But there is one kind of monitor that can get you 0ms. CRT monitors.

See, I used to play Counter-Strike: Source competitively (this was before "professionally" was a thing), and after an extended hiatus because of life stuff, I came back to PC gaming only to find that I could never match my previous skill. While I did expect some "rust", I was performing well below expectation. After years of thought and attempts at getting back into it, I've begun to realize that when I stopped, I was using a 21" HP CRT monitor, and now I was using the cheapest LCDs I could get my hands on. Both monitors were essentially run at 60Hz, so that couldn't be it. My computer hardware post-hiatus was far superior to the hardware I had before (the 8800GT rocks!).

So my proposal is some kind of test. Retro gamers and FPS gamers are likely the best candidates for this testing: are CRTs more competitive than top-tier gaming LCDs? Don't misunderstand, there are tons of good reasons we phased out the CRT, but if my hunch is correct, it means "competitive gameplay" just went from an even playing field (all CRTs were 0ms) to essentially pay-to-win (yes, better hardware meant better framerates, but dropping resolution on a CRT wasn't a big deal). Some CRTs are capable of 120+Hz (albeit at hilariously tiny resolutions like 1024x768), so we can use that to test 60Hz and 144Hz LCDs vs CRT.

My theory is that a CRT (fished out of recycling at this point) will provide more of a competitive edge than a $300-$400 top-tier 144Hz gaming monitor.

But some caveats: the framerate must be the same between screens. You can use Radeon Boost or FreeSync or whatever tech gives LCDs an advantage, but with the CRT at an inherently lower resolution, we will need to cap the framerate. I'd rather not use vsync, since that seems to introduce a ton of input delay on its own. Perhaps RivaTuner's framerate limiter? Though ideally we'd run the game with unlocked frames and no vsync. Perhaps resolution scaling (can resolution scaling work with a 4:3 aspect?).
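For why I'm wary of vsync, a back-of-envelope model (my assumption: classic double buffering, where a finished frame can wait up to one full refresh in the queue, while a CPU-side limiter adds no such queue):

```python
# Back-of-envelope model (my assumptions, not measurements): with vsync
# and double buffering, a finished frame can sit in the queue for up to
# one full refresh; a CPU-side limiter (RivaTuner-style) avoids that wait.
def frame_time_ms(refresh_hz: float) -> float:
    return 1000 / refresh_hz

def worst_case_added_delay_ms(refresh_hz: float, vsync: bool) -> float:
    # Limiter: ~0 extra queueing; vsync: up to one extra frame of waiting.
    return frame_time_ms(refresh_hz) if vsync else 0.0

for hz in (60, 144):
    extra = worst_case_added_delay_ms(hz, vsync=True)
    print(f"{hz} Hz: vsync can add up to {extra:.1f} ms vs a frame limiter")
```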

Also, it'll be very difficult finding a way to output an analog signal to a VGA/DVI screen, since... well, the last card to have a VGA output (or a DAC, since DVI-I carries an analog signal too) was a GTX 745, wasn't it? Passing HDMI through a converter seems like it'd introduce latency as well.

 

I'd really like to see this testing done.


CRTs do still have input lag. They definitely do not have 0ms input lag, and there are inconsistencies between various models.


26 minutes ago, tp95112 said:

CRTs do still have input lag. They definitely do not have 0ms input lag, and there are inconsistencies between various models.

I accept it's not actually 0ms input lag, since... physics, but effectively 0.

Do you have any sources for that, though? Something that eliminates mirror/extend lag or external DACs that would add lag so the CRT is the only variable?


1 hour ago, mew905 said:

Hello. No idea if LTT checks this out or not, and yes, I saw his Shroud "60Hz vs 144Hz" video, but that mainly focused on the framerate.

It is my firm opinion that it's not the framerate that makes people better players, but the input lag. See, as far as I'm aware, LCDs have an inherent input delay based on their refresh rate: 60Hz gives you ~16ms, 144Hz gives you ~7ms, etc.

But there is one kind of monitor that can get you 0ms. CRT monitors.

LCDs and CRTs both scan down the screen from top to bottom over one refresh period in the same way. The extra latency you speak of exists for both types of display. CRTs don't have any processing latency, because they don't have to decode the signal; the display elements are driven directly by the signal from the graphics card.

 

It is well known that CRTs have less latency than LCD monitors; I don't think there's any new ground being tread here.


1 minute ago, Glenwing said:

LCDs and CRTs both scan down the screen from top to bottom over one refresh period in the same way. The extra latency you speak of exists for both types of display. CRTs don't have any processing latency, because they don't have to decode the signal; the display elements are driven directly by the signal from the graphics card.

 

It is well known that CRTs have less latency than LCD monitors; I don't think there's any new ground being tread here.

I know it's well known that CRTs have less latency. The question is: is it really the framerate that makes the better player, or is it the input lag? For example, Linus' test used exclusively identical monitors: one at 144Hz, the other at 60Hz.

But would a 60Hz CRT make a gamer as good a performer as (or close the gap with) a 144Hz or even 240Hz LCD? What if you cranked the CRT to 144Hz?


2 minutes ago, mew905 said:

I accept it's not actually 0ms input lag, since... physics, but effectively 0.

Do you have any sources for that, though? Something that eliminates mirror/extend lag or external DACs that would add lag so the CRT is the only variable?

My only source is mainly the Melee community, who still play tournaments exclusively on CRTs. I haven't read up enough to be very technical about it, but if you want, here's some stuff from a guy who knows his stuff. (I know it's CRT TVs, but it should also apply.)

 

https://twitter.com/kadano/status/1113360594743320576

http://kadano.net/SSBM/inputlag/

https://smashboards.com/threads/perfect-setups-tv-monitor-console-capture-device.355292/page-7#post-21307864

 

 


15 minutes ago, tp95112 said:

My only source is mainly the Melee community, who still play tournaments exclusively on CRTs. I haven't read up enough to be very technical about it, but if you want, here's some stuff from a guy who knows his stuff. (I know it's CRT TVs, but it should also apply.)

 

https://twitter.com/kadano/status/1113360594743320576

http://kadano.net/SSBM/inputlag/

https://smashboards.com/threads/perfect-setups-tv-monitor-console-capture-device.355292/page-7#post-21307864

 

 

I can't disagree, and I'm aware CRTs aren't 0ms per se (but measured in tens of microseconds is effectively 0ms; according to your third link, <1ms covers the majority of cases. As for the ones with >1ms of latency... are they on the same hardware? All the technical talk may have lost the testing methods for me). However, check out this post by 3Dfan: https://forums.blurbusters.com/viewtopic.php?t=4177&start=20#p33594

The 8600GT appears to be *much* more responsive than the 980ti on analog signals.

I feel as though those seeing anything more than a few milliseconds of lag are being affected by factors other than the signal itself: slower DACs perhaps, or a slower CPU. It'll be difficult to narrow it down.


The way input latency is calculated can differ depending on the tester. Some will include the frame time, others the pixel response instead, others just the processing delay.

 

At 240Hz the frame time is 4.16ms, meaning each frame is held for 4.16ms. HOWEVER, each frame is displayed from top left to bottom right, and most measurements are taken at the center of the screen, so the fastest possible input delay for that frame is half the frame time. Of course, that's only for that one individual frame. The next frame still has to wait for the full sample-and-hold duration before it is shown. So while you can say that you can, at the fastest, see something on screen in half the frame time, you won't be able to see the reaction to that 'thing' until halfway through the next frame, which is a full frame time away.

 

When measuring input latency using a high-speed camera, an LED board to display inputs, and the monitor displaying a reaction measured at the center of the screen, if the result is above half the frame time, you can attribute that additional time to processing delay and thus call that the input lag.
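A minimal sketch of that measurement, assuming you've already pulled two brightness traces out of the high-speed footage (one for the LED board, one for the screen center); every name here is hypothetical:

```python
# Hypothetical sketch of the camera-based method described above.
# Inputs: per-camera-frame brightness samples for the input LED and
# for the screen-center region.
def onset_index(trace, threshold):
    """Index of the first camera frame where brightness crosses threshold."""
    for i, value in enumerate(trace):
        if value >= threshold:
            return i
    raise ValueError("no onset found in trace")

def measured_latency_ms(led_trace, screen_trace, camera_fps, threshold=0.5):
    # Latency = camera frames between LED lighting up and the screen reacting.
    frames_elapsed = onset_index(screen_trace, threshold) - onset_index(led_trace, threshold)
    return frames_elapsed * 1000 / camera_fps

# e.g. a 1000 fps camera resolves latency in 1 ms steps:
# measured_latency_ms(led, screen, camera_fps=1000)
```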

 

How that number is presented depends on the reviewer: one could include the frame time, another could not.

So in the event the test results in a latency of 6ms on a 240Hz display, you can either write down the input delay as 3.92ms (6 - (4.16/2)) or as 6ms. The 6ms includes the half frame time plus the processing delay; the 3.92ms is just the processing delay.
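To make the split concrete, a small sketch of the arithmetic above:

```python
# Split a center-of-screen latency measurement into the unavoidable
# half-frame scan-out time and the monitor's processing delay.
def processing_delay_ms(measured_ms: float, refresh_hz: float) -> float:
    half_frame_ms = (1000 / refresh_hz) / 2  # scan-out time to screen center
    return measured_ms - half_frame_ms

print(processing_delay_ms(6.0, 240))  # ~3.92 ms of processing delay
```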

 

CRT vs LCD: CRTs don't need to decode the input, as it's direct RGB; there's still some minor delay, but for the most part it's negligible. CRTs also don't use the sample-and-hold technique: CRTs flicker, reducing persistence blur while still having to stick to the frequency they're being run at.

So you're still limited by frame time; you're just not having to deal with the added blur that comes with sample and hold. This difference is one of the big points that makes CRTs better for competitive gaming: they have negligible added input delay and clear motion due to the lack of sample and hold. They also don't suffer from the slow pixel transitions which add blur on LCDs.

 

So now that's out of the way: does monitor frequency matter? Yes, of course it does. A 60Hz monitor has a frame time of 16.66ms, meaning you have to wait a minimum of 8.33ms before seeing something in the center of the screen. But a 60Hz monitor with slow processing is still going to be worse than one with fast processing. The higher the monitor's frequency, the more of an impact additional processing time has. Eventually, however, the amount of time we are talking about becomes so small it's negligible.

 

'Generally' speaking, if the additional processing lag is less than the frame time of the display, it's 'relatively' harmless.



  • 2 weeks later...
On 5/4/2020 at 6:55 AM, mew905 said:

Hello. No idea if LTT checks this out or not, and yes, I saw his Shroud "60Hz vs 144Hz" video, but that mainly focused on the framerate.

It is my firm opinion that it's not the framerate that makes people better players, but the input lag. [snip]

I'd really like to see this testing done.

Ping is king. If you have better ping, all the FPS in the world won't outweigh that.



  • 2 weeks later...

So the question is: which is it? Seeing the other person first, or firing that little bit faster?

I think seeing the other person first would be more important. If you really wanted to reduce input lag, the better option would be to fix the human component and work on your own reaction times.



CRTs have significantly higher motion resolution than LCDs, which is their great advantage (because of scanning vs. sample-and-hold). A 60Hz CRT is like a 480Hz LCD; a 120Hz CRT is like a 1000Hz LCD.
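A rough way to sanity-check those equivalences, assuming (my assumption, not a measured figure) that perceived motion blur scales with how long each frame stays lit, and that a CRT phosphor glows for roughly 1-2ms:

```python
# Rough persistence model (my assumptions): perceived motion blur scales
# with how long each frame stays lit. A sample-and-hold LCD holds the
# image for the whole refresh period; a CRT phosphor glows only briefly.
def equivalent_lcd_hz(persistence_ms: float) -> float:
    # The sample-and-hold refresh rate whose hold time matches this persistence.
    return 1000 / persistence_ms

print(equivalent_lcd_hz(2.0))  # ~500 Hz: a 60 Hz CRT with ~2 ms phosphor glow
print(equivalent_lcd_hz(1.0))  # ~1000 Hz: in the ballpark of the claim above
```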



On 5/4/2020 at 6:55 AM, mew905 said:

Hello. No idea if LTT checks this out or not, and yes, I saw his Shroud "60Hz vs 144Hz" video, but that mainly focused on the framerate.

It is my firm opinion that it's not the framerate that makes people better players, but the input lag. [snip]

I'd really like to see this testing done.

Ping is still king. A 16ms faster response in input lag or a 3ms advantage in Hz matters not when you're on a 50ms better ping.
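A quick sum of the numbers in that claim (a toy latency budget, just restating the post's figures):

```python
# Toy latency budget (restating the numbers above): you give up 16 ms
# of display responsiveness but hold a 50 ms ping advantage.
display_penalty_ms = 16  # slower monitor/input chain than the opponent
ping_advantage_ms = 50   # better network connection than the opponent
net_advantage_ms = ping_advantage_ms - display_penalty_ms
print(net_advantage_ms)  # 34 ms net advantage: the ping term dominates
```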


