mew905


mew905's Achievements

  1. I can't disagree, and I'm aware CRTs aren't 0ms per se (though latency measured in tens of microseconds is effectively 0ms). According to your third link, <1ms covers the majority of cases; but for the ones above 1ms of latency, were they measured on the same hardware? All the technical talk may have lost the testing methods for me. However, check out this post by 3Dfan: https://forums.blurbusters.com/viewtopic.php?t=4177&start=20#p33594 The 8600GT appears to be *much* more responsive than the 980 Ti on analog signals. I feel as though those seeing anything more than a few milliseconds of lag are being affected by factors other than the signal itself: a slower DAC perhaps, or a slower CPU; it'll be difficult to narrow it down.
  2. I know it's well established that CRTs have less latency. The question is: is it really the framerate that makes the better player, or is it the input lag? For example, in Linus' test it was exclusively identical monitors: one at 144Hz, the other at 60Hz. But would a 60Hz CRT make a gamer perform as well as (or at least close the gap with) a 144Hz or even 240Hz LCD? And what if you cranked the CRT to 144Hz?
  3. I accept it's not actually 0ms of input lag, since... physics, but it's effectively 0. Do you have any sources for that, though? Something that eliminates mirror/extend lag or external DACs that would add lag, so the CRT is the only variable?
  4. Hello. No idea if LTT checks this out or not, and yes, I saw his Shroud "60Hz vs 144Hz" video, but that mainly focused on the framerate. It is my firm opinion that it's not the framerate that makes people better players, but the input lag. As far as I'm aware, LCDs have an inherent input delay tied to their refresh rate: 60Hz gives you ~16ms, 144Hz gives you ~7ms, and so on (see the quick arithmetic sketch after this post). But there is one kind of monitor that can get you 0ms: the CRT.

     I used to play Counter-Strike: Source competitively (this was before "professionally" was a thing), and after an extended hiatus because of life stuff, I came back to PC gaming only to find I could never match my previous skill. While I did expect some rust, I was performing well below expectation. After years of thought and attempts at getting back into it, I realized that when I stopped, I was using a 21" HP CRT monitor, and now I was using the cheapest LCDs I could get my hands on. Both monitors essentially ran at 60Hz, so that couldn't be it, and my post-hiatus hardware was far superior to what I had before (8800GT rocks!).

     So my proposal is some kind of test, and retro gamers and FPS gamers are likely the best candidates for it: are CRTs more competitive than top-tier gaming LCDs? Don't misunderstand, there are plenty of good reasons we phased out the CRT, but if my hunch is correct, "competitive gameplay" went from an even playing field (all CRTs were 0ms) to essentially pay-to-win (yes, better hardware meant better framerates, but dropping resolution on a CRT wasn't a big deal). Some CRTs are capable of 120+Hz (albeit at hilariously tiny resolutions like 1024x768), so we can use that to test 60Hz and 144Hz LCDs against a CRT. My theory is that a CRT (fished out of recycling at this point) will provide more of a competitive edge than a $300-$400 144Hz top-tier gaming monitor.

     Some caveats: the framerate must be the same between screens. You can use Radeon Boost or FreeSync or whatever tech gives LCDs an advantage, but with the CRT at an inherently lower resolution we will need to cap the framerate. I'd rather not use vsync, since that seems to introduce a ton of input delay on its own; perhaps RivaTuner's framerate limiter (a limiter just paces the frames, see the sketch below), though ideally we'd run the game with unlocked frames and no vsync. Perhaps resolution scaling (can resolution scaling work with a 4:3 aspect ratio?). It will also be very difficult to output an analog signal to a VGA/DVI screen, since the last card to have a VGA output (or a DAC, since DVI-I carries an analog signal too) was a GTX 745, wasn't it? Passing HDMI through a converter seems like it would introduce latency as well. I'd really like to see this testing done.
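For reference, the ~16ms and ~7ms figures in post 4 are just the refresh interval: 1000 divided by the refresh rate in Hz, roughly the worst case a finished frame waits before the screen even starts drawing it. A quick sanity check in Python (the function name and the 240Hz row are illustrative additions, not from the posts):

```python
def refresh_interval_ms(refresh_hz: float) -> float:
    """Time between refreshes, in milliseconds."""
    return 1000.0 / refresh_hz

for hz in (60, 144, 240):
    print(f"{hz:>3} Hz -> {refresh_interval_ms(hz):.1f} ms per refresh")

# Prints:
#  60 Hz -> 16.7 ms per refresh
# 144 Hz -> 6.9 ms per refresh
# 240 Hz -> 4.2 ms per refresh
```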
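On the vsync point: a frame limiter only paces the render loop, so frames are never queued up behind the display refresh the way vsync can queue them, which is why it tends to add far less input delay. Below is a minimal sketch of that pacing idea, not RivaTuner's actual implementation; TARGET_FPS, render_frame() and the busy-wait are illustrative choices:

```python
import time

TARGET_FPS = 144
FRAME_TIME = 1.0 / TARGET_FPS

def render_frame() -> None:
    """Stand-in for the game's actual render/present call."""
    pass

def run_capped(num_frames: int = 300) -> float:
    """Render num_frames at no more than TARGET_FPS; return the achieved FPS."""
    start = time.perf_counter()
    deadline = start
    for _ in range(num_frames):
        render_frame()
        deadline += FRAME_TIME
        while time.perf_counter() < deadline:
            pass  # busy-wait; real limiters mix sleep and spin to save CPU
    return num_frames / (time.perf_counter() - start)

if __name__ == "__main__":
    print(f"achieved ~{run_capped():.0f} fps against a {TARGET_FPS} fps cap")
```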