I am REALLY sorry for resurrecting this thread, but I wanted to chime in.
My experience with CRTs in the 90s was that most monitors could do 80+ Hz. At some point in the early 90s I realized you could adjust the refresh rate of monitors in Windows 3.1, and it became even easier with Windows 95. The default was almost always the 60 Hz everyone is familiar with, so I'd always bump mine up to 80 Hz. Every time I was at someone's house I'd show them the neat trick of changing their CRT from 60 to 80 Hz, and they acted as if I'd just handed them the recipe for turning lead into gold. Of course, the refresh rate depended on the resolution the monitor was running, but most people were at 640x480 or 800x600. People always ask what refresh rate old CRTs ran at, but there isn't a single answer, because a CRT can run a range of refresh rates: the lower the resolution, the higher the maximum refresh rate, and it drops as the resolution goes up.
I still game on a Sony GDM-FW900 I've had for 15+ years, bought before their prices became stupid, and I really enjoy the essentially zero input lag. Some people can notice the difference while others can't; I've been playing since the Doom era, and I can definitely notice it. I currently drive the CRT with an EVGA 3080 at 2235x1397 @ 83 Hz. At 2304x1440 I could only manage 79 Hz, so I opted for the slightly lower resolution at 83 Hz (the quick sketch below shows where that trade-off comes from).
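For anyone wondering why refresh rate falls as resolution climbs: a CRT is ultimately limited by how fast it can sweep scanlines, so the more lines per frame, the fewer frames it can draw per second. Here's a rough back-of-the-envelope sketch in Python. The ~121 kHz horizontal limit, the 160 Hz vertical ceiling, and the ~5% vertical blanking overhead are my assumed ballpark figures for the FW900, not exact spec-sheet numbers, so treat the output as an estimate, not gospel.

# Rough sketch: max refresh rate a CRT can hit at a given resolution,
# limited by its maximum horizontal scan rate.
# Assumed (not exact) limits: ~121 kHz horizontal, 160 Hz vertical max,
# and vertical blanking adding roughly 5% extra scanlines per frame.
# (The pixel clock also caps things, but at high resolutions the
# horizontal scan limit is usually what bites first.)

H_SCAN_MAX_KHZ = 121.0   # assumed max horizontal scan rate
V_REFRESH_MAX_HZ = 160.0  # assumed max vertical refresh rate
V_BLANK_FACTOR = 1.05    # assumed total lines ~= visible lines * 1.05

def max_refresh_hz(height):
    total_lines = height * V_BLANK_FACTOR
    return min(H_SCAN_MAX_KHZ * 1000.0 / total_lines, V_REFRESH_MAX_HZ)

for w, h in [(640, 480), (800, 600), (1920, 1200), (2235, 1397), (2304, 1440)]:
    print(f"{w}x{h}: roughly {max_refresh_hz(h):.0f} Hz max")

Plug in your own monitor's limits and it lands pretty close to what a custom resolution utility will actually let you set; for my two modes above it spits out roughly 82 Hz and 80 Hz, which is in the neighborhood of the 83 and 79 I ended up with.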
The early 2000s LCD ads were pretty hilarious. They always mentioned "less eye strain." Yeah, less eye strain cuz no one knew how to adjust their CRT's refresh rate.