If we had to pick an objective metric, "snappiness" is essentially the system's average input latency. That is, if you press the "A" key on a keyboard and expect text to show up on screen, the faster it appears, the snappier the system feels.
Ironically, older systems, like the 8-bit computers of the 1980s, are actually snappier than modern ones. Why? It's a combination of the following:
The OS or system software ran from ROM, so its routines sat at fixed, known addresses and it knew exactly where to go each time to grab data. On top of that, ROM access, at least at the time, was relatively fast.
The components, such as the CPU, memory, sound, and video chips, were simple. Each was mapped to a fixed address in the memory space, which made it easy for software to talk to it directly, and they were often easy to program.
Basic input devices, like keyboards and controllers, sent at most a few bytes of data, and the application received those bytes directly. Contrast that with a USB keyboard or controller: the payload may still be only a few bytes, but it is wrapped in many more bytes of USB protocol overhead, which has to be packaged at the device and unpacked at the OS level before the app ever sees it.
tl;dr: the simpler the system, the less time it spends figuring everything out; each added layer adds latency. You could apply some of the same principles to modern computers, but you can only get so far.