How to determine if a CPU bottlenecks the GPU?
i'm gonna type this rant one more time (because this time people actually seem interested in it...):
"bottlenecking" is a fine balance between a piece of software, the different areas of your computer it uses, and how powerful those different areas are.
bottlenecking is something that *always* happens, because there's always one piece of hardware that's the limiting factor. (a perfectly balanced system doesn't exist)
the "bottleneck" that's best known on the net is when your CPU isn't fast enough to let the game use 100% of your GPU: a CPU bottleneck.
when your CPU *is* strong enough to keep the GPU fed, you *technically* have a GPU bottleneck: the GPU is the limiting factor in how fast your game runs. (this is what most gamers are after)
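the practical way to spot which one you have is to log per-core CPU load and GPU load while the game runs (task manager plus a GPU monitor of your choice), then eyeball the numbers. here's a rough sketch of that eyeballing logic in python; the utilization samples are made-up inputs, and the 90%/40% thresholds are my own guesses, nothing official:

```python
# rough heuristic: given sampled per-core CPU loads and GPU load (in percent),
# guess which side is the limiting factor. thresholds are arbitrary guesses.
def classify_bottleneck(per_core_cpu, gpu, busy=90, idle=40):
    if gpu >= busy:
        return "GPU-bound"  # GPU is pegged: the GPU limits the frame rate
    if max(per_core_cpu) >= busy and gpu <= idle:
        # one core maxed while the GPU starves: classic CPU
        # (often single-thread) bottleneck
        return "CPU-bound"
    return "unclear (could be bus, RAM, or a badly threaded game)"

print(classify_bottleneck([35, 30, 25, 40], gpu=99))  # GPU-bound
print(classify_bottleneck([98, 20, 15, 10], gpu=30))  # CPU-bound
```

note the `max()` on the cores: overall CPU usage of 25% can still be a hard CPU bottleneck if the game lives on one thread and that one core is pegged.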
there are also more areas where a computer can bottleneck than CPU and GPU: imagine extracting a 20GB zip archive on a system with 8GB RAM.
the system can't hold the entire archive in RAM for processing, so it has to spend time dynamically loading and unloading parts of it, making the process slower than it could be: a "RAM bottleneck"
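side note: whether you actually hit that RAM wall depends on how the extractor is written. a tool that slurps whole files into memory will choke; one that streams in small chunks barely needs any RAM. a minimal python sketch of the streaming approach (the file name and sizes are just demo values):

```python
import io
import shutil
import zipfile

# build a small zip in memory so the demo is self-contained
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
    zf.writestr("big_file.bin", b"x" * 100_000)

# streaming extraction: copy in 64KB chunks instead of reading the whole
# member into RAM with zf.read(), so memory use stays flat no matter
# how big the archive member is
out = io.BytesIO()
with zipfile.ZipFile(buf) as zf:
    with zf.open("big_file.bin") as src:
        shutil.copyfileobj(src, out, length=64 * 1024)

print(out.getbuffer().nbytes)  # 100000
```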
then we come to the hidden parts of the computer no one really talks about: the different buses in your system:
if you're running an 18-core xeon that's magically OC'ed to 10GHz, and a fury X on steroids, plugged into a PCI-e x4 slot (the reason i specifically say fury X is because AMD allows down to x4 PCI-e, nvidia only down to x8),
there's a big chance that x4 PCI-e slot will not be able to keep up with the traffic between the CPU and GPU, allowing neither to run at 100% load.
there are less extreme places where this happens, mostly in badly coded games (minecraft had it going on for a while...). this is the point where people think "my computer is broken", while in fact it's an issue hidden deep inside the system that's not really measurable through conventional means.
--
if you've made it this far, i'm sure by now you've realised there is no clear-cut way of deciding if something will "bottleneck" or not. but it gets worse:
within one game, there can be scenes that load the CPU more and scenes that load the GPU more. on top of that, no two games have the same balance between CPU load, GPU load, and other loads. (even within one engine, there's a clear difference between HL2 and HL2: lost coast)
you can always "guesstimate" if a system will be balanced for your specific workload, but it's never a clear answer. the only way to know for sure is by testing.
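when you do test, don't just look at average FPS: log per-frame times and check the worst 1% too, because scene-to-scene swings are exactly what averages hide. a tiny python sketch (the frame times are made-up sample data):

```python
# given per-frame render times in milliseconds, report the average FPS
# and the 99th-percentile frame time (the "1% worst" stutter number)
def summarize(frame_ms):
    avg_fps = 1000 / (sum(frame_ms) / len(frame_ms))
    ordered = sorted(frame_ms)
    idx = min(len(ordered) - 1, int(len(ordered) * 0.99))
    return round(avg_fps, 1), ordered[idx]

frames = [16] * 98 + [33, 50]  # mostly smooth, two big stutters
print(summarize(frames))  # average looks fine, the 1% worst does not
```

a game averaging 60FPS with 50ms spikes every few seconds feels far worse than a steady 45FPS, and only the percentile number shows that.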
i'd also like to add that there's no single "balanced system" that is ideal for everything. i spent a while with an i7 4770 and a GT640.
more often than not the 4770 was the "bottleneck" in my system, rather than the GT640, even in games.
i'd also like to add that benchmarks aren't a clear answer to whether a game will run fine on specific hardware, because benchmarks are generally performed with all settings maxed out, to create a "fair comparison".
in a lot of cases there are one or two settings you can turn down to make a game MUCH more playable on a system with a weak GPU, or sometimes even a weak CPU.
to give you an example: my GT640 ran league of legends at a clean 60FPS (1440p), while barely looking different from "all the way maxed out", which ran at a fairly horrible 20FPS.