Hello. No idea if LTT checks this out or not, and yes, I saw the Shroud "60Hz vs 144Hz" video, but that mainly focused on the framerate.
It is my firm opinion that it's not the framerate that makes people better players, but the input lag. As far as I'm aware, LCDs have an inherent input delay tied to their refresh rate: at 60Hz each frame takes ~16.7ms, at 144Hz ~6.9ms, and so on.
But there is one kind of monitor that can get you effectively 0ms: CRT monitors.
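To put numbers on that refresh-rate delay, here's a quick back-of-the-envelope calculation. This only covers the wait between frame draws (1000 / refresh rate in milliseconds); any processing lag the panel's electronics add comes on top of it:

```python
def frame_time_ms(refresh_hz: float) -> float:
    # Time between refreshes: the minimum delay a frame can wait
    # before the screen updates at this refresh rate.
    return 1000.0 / refresh_hz

for hz in (60, 144, 240):
    print(f"{hz:>3} Hz -> {frame_time_ms(hz):.1f} ms per frame")
# 60 Hz  -> 16.7 ms, 144 Hz -> 6.9 ms, 240 Hz -> 4.2 ms
```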
See, I used to play Counter-Strike: Source competitively (this was before "professionally" was a thing), and after an extended hiatus because of life stuff, I came back to PC gaming only to find that I could never match my previous skill. While I did expect some rust, I was performing well below expectation. After years of thought and attempts at getting back into it, I realized that when I stopped, I was using a 21" HP CRT monitor, and now I was using the cheapest LCDs I could get my hands on. Both monitors were essentially run at 60Hz, so that couldn't be it. My computer hardware post-hiatus was far superior to what I had before (the 8800GT rocks!).
So my proposal is some kind of test. Retro gamers and FPS gamers are likely the best candidates for it: are CRTs more competitive than top-tier gaming LCDs? Don't misunderstand, there are tons of good reasons we phased out the CRT, but if my hunch is correct, "competitive gameplay" just went from an even playing field (all CRTs were ~0ms) to essentially pay-to-win (yes, better hardware meant better framerates, but dropping resolution on a CRT wasn't a big deal). Some CRTs are capable of 120+Hz (albeit at hilariously tiny resolutions like 1024x768), so we can use that to test both 60Hz and 144Hz LCDs against a CRT.
My theory is that a CRT (fished out of recycling at this point) will provide more of a competitive edge than a $300-$400 top-tier 144Hz gaming monitor.
But some caveats: the framerate must be the same between screens. You can use Radeon Boost or FreeSync or whatever tech gives LCDs an advantage, but since the CRT runs at an inherently lower resolution (and would therefore push higher framerates), we will need to cap the framerate. I'd rather not use vsync, since that seems to introduce a ton of input delay on its own. Perhaps RivaTuner's framerate limiter? Though ideally we'd run the game with unlocked frames and no vsync. Perhaps resolution scaling could help equalize things (can resolution scaling even work with a 4:3 aspect ratio?).
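For anyone wondering why a limiter is gentler on input lag than vsync: a limiter just burns off the leftover frame budget before starting the next frame, so input is sampled right before each render instead of the game stalling on a buffer swap. Here's a toy sketch of that idea (the function name and structure are mine, not how RTSS is actually implemented):

```python
import time

def run_capped(render, cap_hz: float, n_frames: int) -> float:
    """Naive external frame limiter sketch: render, then sleep off the
    remainder of the frame budget. Returns the achieved framerate."""
    budget = 1.0 / cap_hz
    start = time.perf_counter()
    for _ in range(n_frames):
        frame_start = time.perf_counter()
        render()  # game samples input and draws here, never blocking on vsync
        leftover = budget - (time.perf_counter() - frame_start)
        if leftover > 0:
            time.sleep(leftover)
    return n_frames / (time.perf_counter() - start)
```

With a cap matched to the CRT's refresh rate, both screens would receive the same number of frames per second, which is the whole point of the comparison.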
It'll also be very difficult finding a way to output an analog signal to a VGA/DVI screen, since... well, the last card to have a VGA output (or a DAC, since DVI-I carries an analog signal too) was a GTX 745, wasn't it? Passing HDMI through a converter seems like it'd introduce latency as well.
I'd really like to see this testing done.