It would be a cool thing if games had adaptive graphical settings: when the frame rate drops, the various settings automatically drop to bring the frame rate back up...
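A minimal sketch of what that adaptive loop might look like (everything here, like QUALITY_LEVELS and the millisecond thresholds, is made up for illustration, not from any real engine):

```python
# Hypothetical adaptive-quality controller: step settings down when frames
# take too long, step them back up when there is headroom.
QUALITY_LEVELS = ["low", "medium", "high", "ultra"]

def adjust_quality(level_index, frame_ms, target_ms=16.7, headroom_ms=12.0):
    """Return a new index into QUALITY_LEVELS based on the last frame time."""
    if frame_ms > target_ms and level_index > 0:
        return level_index - 1   # too slow: drop one quality level
    if frame_ms < headroom_ms and level_index < len(QUALITY_LEVELS) - 1:
        return level_index + 1   # plenty of headroom: raise one level
    return level_index

# e.g. a 25 ms frame while on "high" steps you down to "medium"
```

A real engine would average over many frames and add hysteresis so the settings don't flicker back and forth, but the core idea is just this feedback loop.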
Just a thought, but probably complicated. I also had the idea of rendering at an upscaled resolution and then downsampling to the actual screen width with a good average-pixel-color measure, to get implicit anti-aliasing but also much more. (EDIT: it already exists on Nvidia GTX graphics cards under the name DSR (Dynamic Super Resolution).) You may be stuck at HD, but if the scene is rendered at, say, 4K and then converted down to plain HD, it gives a better image... of course, then a dual GTX 1080 setup is something you want. The thing is, the display can be limited while the frame buffer in the game could already accommodate a higher resolution (i.e. my screen's max resolution is, say, 1920x1080 — why not let me specify a higher in-game resolution, since recalculating the larger image to fit the screen is very fast?).
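The downsampling step itself is cheap. A toy version of the "average pixel color" idea, assuming a grayscale image stored as a list of rows and a plain 2x2 box filter (real DSR uses a smarter Gaussian-style filter):

```python
def downsample_2x(pixels):
    """Average each 2x2 block of a grayscale image into one output pixel."""
    height, width = len(pixels), len(pixels[0])
    out = []
    for y in range(0, height, 2):
        row = []
        for x in range(0, width, 2):
            total = (pixels[y][x] + pixels[y][x + 1]
                     + pixels[y + 1][x] + pixels[y + 1][x + 1])
            row.append(total / 4)  # box-filter average of the 2x2 block
        out.append(row)
    return out

# A 4K (3840x2160) render downsampled this way lands exactly on 1920x1080.
```

Since each output pixel blends four rendered samples, edges that would alias at native resolution come out smoothed — which is exactly the implicit anti-aliasing effect described above.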