If you ever see someone say "A bottlenecks B", remind yourself that this reflects a poor understanding of what bottlenecking means. It's not that simple, unfortunately. It isn't that A bottlenecks B; it's that either A or B is the "weakest link" in that particular application, i.e. one of them is being utilized to the max and is indirectly holding the other back.
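To make the "weakest link" idea concrete, here's a toy model (my own simplification, not from any benchmark): each frame needs some CPU work and some GPU work, and since they largely overlap, the frame rate is set by whichever takes longer.

```python
# Toy model: frame rate is limited by whichever of CPU/GPU takes longer
# per frame. Numbers are made up for illustration.

def fps(cpu_ms, gpu_ms):
    """Frames per second when CPU and GPU work largely in parallel."""
    return 1000.0 / max(cpu_ms, gpu_ms)

# CPU needs 8 ms/frame, GPU only 4 ms/frame -> the CPU is the weakest link.
print(fps(8, 4))   # 125.0 fps, set entirely by the CPU
# Overclocking the GPU (halving its time) changes nothing...
print(fps(8, 2))   # still 125.0
# ...while halving CPU time doubles your fps.
print(fps(4, 4))   # 250.0
```

Neither part "bottlenecks" the other in isolation; it depends entirely on which one is maxed out in that workload.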
In games it generally depends on the resolution and graphics settings, and also whether the game in question is inherently CPU-heavy due to complex AI and whatnot (like Supreme Commander with massive amounts of units).
On very low res/details settings (no vsync, fps uncapped), the CPU is the likely bottleneck, being the limiting factor on the game's framerate. GPU utilization is low, so even if you overclocked the graphics core/memory, it wouldn't affect your fps. Retro titles based on the Quake 3 engine are also good examples of easily CPU-bottlenecked games, even at high res, because graphically they're a piece of cake.
On high res/details (1440p-1600p-4K) the GPU gets taxed more and more until there is no difference between running a 3770K at stock or a 3930K at 5 GHz, because the load on the GPU far outweighs the CPU load. Conversely, in this scenario overclocking your graphics core/memory would actually make a difference.
The thing with modern 1080p gaming is, it's usually halfway. Like Cacao said, GPU utilization isn't great (about 50-70% depending on your anti-aliasing settings and stuff), so a faster CPU could potentially give you some extra fps, up to the point where the GPU becomes the bottleneck instead, since the higher fps you gained from the faster CPU also gradually increased the GPU load.
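That transition can be sketched with the same max() idea (again, invented numbers, not measurements): hold GPU time per frame fixed for a hypothetical 1080p load and vary CPU time to mimic a CPU upgrade or overclock.

```python
# Sketch of the 1080p "halfway" case. GPU_MS is a made-up GPU time per
# frame; we vary CPU time per frame to mimic a faster CPU.

GPU_MS = 10.0  # hypothetical GPU time per frame at 1080p

for cpu_ms in (16.0, 12.0, 10.0, 8.0, 6.0):
    frame_ms = max(cpu_ms, GPU_MS)       # slower of the two sets the pace
    fps = 1000.0 / frame_ms
    gpu_util = GPU_MS / frame_ms         # fraction of each frame the GPU is busy
    print(f"CPU {cpu_ms:4.1f} ms -> {fps:6.1f} fps, GPU util {gpu_util:.0%}")
```

Going from 16 ms to 10 ms of CPU time lifts fps from 62.5 to 100 (and GPU utilization from ~62% to 100%); beyond that, an even faster CPU gains nothing because the GPU is now the weakest link.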
In so many words, this is simply why in GPU reviews you see bigger differences between cards at 1440p or 1600p than at 1080p: GPU utilization is higher. Would a stock 3930K be a bottleneck in an SLI 780 Ti setup? At 1080p, most likely; at 4K, probably not.
tl;dr hope this made sense