
Guide to BottleNecking

RedJamaX

"Bottle-Necking" - Explanation and Benchmarks

 

A note about the focus of this explanation:
This is not a technical deep-dive, but rather a real-world approach that should help people decide whether they should focus on a CPU upgrade, a GPU upgrade, or both.  I am focusing on the issue of bottlenecks to answer the question in general and to dispel some of the myths.  This is not meant to address differences caused by any particular driver, API, or a game being poorly optimized in one way or another.  Also, VR is not part of this topic… if VR is your goal, then an upgrade to a 1070/1080 makes perfect sense.

 

So... why did I write this lengthy explanation and review?  Well, I think most people with experience will agree that we are tired of hearing this question, and tired of seeing people give bad advice.  Two of the most common questions are (almost verbatim):

  1. Will my Core i5-2500k CPU bottleneck my new GTX 1070 / 1080 graphics card?
    1. Short Answer:  Yes
  2. Do I “need” a new Skylake i5-6600k or i7-6700k CPU?
    1. Short Answer:  Not Necessarily

The long answer is, well, keep reading...

 

First things first… What is “bottle-necking”?
    A bottleneck is when one component of a multi-component system runs measurably slower than the others and drags down the overall performance of the system.  In terms of traffic, you can think of a highway construction zone as the bottleneck.  Several lanes full of cars are suddenly reduced to fewer lanes, causing all of the cars to slow down.  Even after the lanes open back up, the flow of cars is still restricted to whatever rate they were able to pass through the construction zone.
In regards to computers, the most common bottleneck for most people is their hard drive, because among all of the components in the system, a hard drive is typically the slowest device in terms of reading and writing data.  In regards to video games, bottlenecks are usually associated with your CPU.  A graphics card is typically not labeled a bottleneck when it's the component causing your bad game performance, because it's at the end of the line in terms of data flow for gaming; we just call that a slow graphics card.  (Technically, it’s a bottleneck.)
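If you like, the "slowest link sets the pace" idea can be sketched in a few lines of code.  This is purely a toy model — the component names and transfer rates below are made-up illustrations, not real hardware measurements:

```python
# Toy model: overall system throughput is capped by the slowest stage
# in the data path.  Rates are illustrative only.
stage_rates = {
    "hard drive": 120,   # MB/s -- typically the slowest link
    "RAM": 12000,        # MB/s
    "CPU": 8000,         # MB/s it can process
    "GPU": 20000,        # MB/s it can consume
}

# The bottleneck is simply the stage with the lowest rate,
# and the whole pipeline runs no faster than that stage.
bottleneck = min(stage_rates, key=stage_rates.get)
effective_rate = stage_rates[bottleneck]

print(bottleneck, effective_rate)  # -> hard drive 120
```

No matter how fast the other stages are, the chain moves data at the hard drive's rate — which is exactly the construction-zone analogy above.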

 

Several factors come into play here.  
•    What is your current system configuration?  (CPU, GPU, RAM, Motherboard, PSU)
•    What games are you playing?  
•    What resolution and graphics settings are you using?
•    What resolution and graphics settings do you “want” to use?

 

So many times I see people on forums everywhere telling people they NEED to upgrade their 2500k because otherwise it's pointless to get the GTX 1070 or 1080.  "The 2500k will bottleneck the 1070, so you need to get a new 6700k" ... or something to that effect, implying that an upgrade from a GTX 4xx, 5xx, 6xx, or 7xx is absolutely useless without a new CPU and that there will be no performance gain.  Let’s put this to rest, shall we?

 

Rule of Thumb:
•    Higher Resolution and/or Graphics Settings = Lower CPU Bottleneck
•    Higher Resolution and/or Graphics Settings = Lower Framerates
•    Lower Resolution and/or Graphics Settings = Higher CPU Bottleneck
•    Lower Resolution and/or Graphics Settings = Higher Framerates
Exceptions to the rule:
•    CPU heavy games (Civilization)
•    GPU heavy games (Tomb Raider)

(I separated the CPU bottleneck effect and framerate because they are not necessarily joined in relation to your graphics settings)
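One way to see why the rule of thumb works is to model frame rate as the minimum of two caps: how fast the CPU can prepare frames and how fast the GPU can render them.  The numbers below are invented purely to illustrate the rule, not measured from any real system:

```python
def fps(cpu_cap, gpu_cap):
    """Achieved frame rate is limited by whichever component is slower."""
    return min(cpu_cap, gpu_cap)

# Hypothetical caps: the CPU can prepare ~90 frames/s regardless of settings,
# while the GPU's cap drops as resolution and detail level rise.
cpu_cap = 90
gpu_cap_medium = 200   # light GPU load: the CPU is the bottleneck
gpu_cap_ultra = 60     # heavy GPU load: the GPU is the bottleneck

print(fps(cpu_cap, gpu_cap_medium))  # 90 -> capped by the CPU
print(fps(cpu_cap, gpu_cap_ultra))   # 60 -> capped by the GPU
```

Raising the settings lowers the GPU's cap until it dips below the CPU's, which is why higher settings mean lower framerates but also a smaller CPU bottleneck.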

 

The next question is, “does it matter”?  Specifically, does it matter to you?  Basically, what are you hoping to accomplish in terms of performance vs game settings?  

 

Games:
If you are just playing games such as Counter-Strike, League of Legends, or World of WarCraft… those games are not particularly demanding by today’s standards, and they definitely don’t “need” a GTX 1070 or 1080, but if you have a G-Sync display and want to max out the refresh rate, then those newer GPUs could be helpful.  Games like Crysis 3, The Witcher 3, and Far Cry Primal, on the other hand, are very demanding in terms of CPU and GPU performance, and will definitely benefit from a newer GPU at almost any resolution.

 

Resolution:
    Most of the claims are that a GTX 1070 or GTX 1080 is overkill for gaming at 1080p, or that you “must” upgrade to the 6700K if that’s your plan.  Again, this depends on your goal and your games.  If you want to hit that max G-Sync refresh rate at 1080p while playing The Witcher 3, then yes, a Skylake 6700k will help you get “closer” to that goal.  But if you’re playing at 4K resolution, a 2500k with a nice overclock is going to feed those new GPUs just fine.  Sure, a 2500k will have to work harder, but at really high resolutions most of the work is up to the GPU.

 

Graphics Settings:
    Do you care about the level of eye candy?  Do you just install and play?  Or do you go into the graphics settings and jack everything up to ULTRA, then see if the gameplay experience is reasonable?  Higher graphics settings in demanding games, just like higher resolutions, put a lot more stress on the GPU, even at 1080p.  You can find several videos of The Witcher 3 or Crysis 3 running on a 6700K / GTX 1080 at Ultra settings with the 6700K sitting around 50% utilization.  Even IF there are areas where the CPU hits 100% (indicating a CPU bottleneck), for games as demanding as The Witcher 3, that will not be the average.  Here again, even a 2500K with a decent overclock can produce very playable frame rates.

Personally, if I had to choose… I prefer Ultra Settings at 1080p with high frame rates, vs 1440p or 4K with lower details and lower frame rates.  More texture details, post processing effects, and higher frames-per-second (for me anyway) create a much more immersive environment… as opposed to sacrificing those details or frame rates for higher resolutions.


ON TO THE TESTING:

Test Systems:
Core2Duo E8400 @3.00GHz – 8GB DDR2 800MHz  (Intel G33 motherboard)
Core2Quad Q9500 @2.83GHz – 8GB DDR2 800MHz  (Intel G33 motherboard)
Core i5-2500 @3.4GHz – 16GB DDR3 1333MHz  (ASUS Z68 motherboard)
Core i5-2500K @4.8GHz – 16GB DDR3 2133MHz  (ASUS Z68 motherboard)

 

About the CPUs…
    First, this is what I have available… Second, I think this variation provides a good demonstration of bottlenecking and the performance gained from a CPU or a GPU upgrade.  The E8400 offers the chance to really show how a graphics card upgrade is still an upgrade, even with a CPU that is a significant bottleneck.  That said, if you are still gaming on an E8400 and trying to play the latest titles… it’s time to plan a CPU platform upgrade.  For the Core2Quad, I would rather have had a Q9650, or at least a Xeon X5450 on a 771-mod with 1333MHz RAM… but the Q9500 only cost me $20 and I can make that money back.  As for the i5-2500 running at 3.4GHz, and not 3.7GHz…  I’ve cleared this up before, as have others.  The reported “max turbo speed of 3.7GHz” on the Intel website only applies when a single core is being utilized.  For each additional core in use, the max turbo speed drops another 100MHz, so the effective max turbo speed with all four cores loaded is only 3.4GHz.  The Core i5-2500K @ 4.8GHz is what I run in my gaming PC right now, paired with my GTX 980 Ti.
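That per-core turbo behavior is simple arithmetic — 100MHz comes off the maximum turbo speed for each additional active core.  A quick sketch (this simplifies Sandy Bridge turbo binning down to exactly that rule):

```python
def effective_turbo_ghz(max_single_core_ghz, active_cores):
    """Max turbo drops 100MHz (0.1GHz) for each core active beyond the first."""
    return round(max_single_core_ghz - 0.1 * (active_cores - 1), 1)

# i5-2500: 3.7GHz single-core turbo...
print(effective_turbo_ghz(3.7, 1))  # 3.7 -> one core loaded
print(effective_turbo_ghz(3.7, 4))  # 3.4 -> all four cores loaded,
                                    #        the speed used in testing
```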

 

Video Cards: Factory clock speeds
Gigabyte GTX960 2GB (Mini-ITX Model)
EVGA GTX 980Ti 6GB (ACX 2.0)

 

About the GPUs…
    Again, this is what I have available (that’s relevant to the discussion)… and I think this pairing shows a good average of where a lot of people are now, and where a lot of people are considering upgrading to.  This works especially well since the 980 Ti and the 1070 have very similar performance.  You can look up these graphics cards easily; they both have a slight factory overclock per their specs, but I did not push them any further, and all clocks are default.  The NVidia control panel was also left at default settings for all testing.

 

The Tests:
Games:  Frames Per Second

  • Hitman Absolution
    • Medium:  Medium Preset, No AA
    • Ultra:  Ultra Preset, AA set to 8x
  • Batman Arkham Origins
    • Medium:  Normal (vs DX11), No AA, No PhysX
    • Ultra:  DX11 Enhanced, Max AA (TXAA High??), PhysX on High
  • Shadow of Mordor
    • Medium:  Medium Preset, No AA
    • Ultra:  Ultra Preset, Max AA setting, Max AO setting (they are not maxed by default)
  • Watch Dogs
    • Medium:  Medium Preset, Textures Medium, No AA
    • Ultra:  All Settings Maxed, Temporal SMAA

Benchmarks:  Frames Per Second & Score

  • Unigine Heaven
  • Unigine Valley
  • 3DMark Cloud Gate
  • 3DMark Sky Diver
  • 3DMark Fire Strike

 

Testing Methods:
    All tests were performed at 1080p resolution.  Two sets of tests were performed for the games and the Unigine benchmarks: one with a “Medium Settings” configuration and one with an “Ultra Settings” configuration.  I think the different 3DMark tests provide good examples of Medium, High, and Ultra settings respectively.  For the “Medium” settings tests, I loaded the pre-configured “Medium” detail setting and then disabled any anti-aliasing.  For the “Ultra” settings tests, I maxed all graphics settings.  Slight game-specific changes between the tests can be found above.
    For Hitman, Batman, and Shadow of Mordor, the built-in benchmarks were used.  For Watch Dogs, I used FRAPS to run a 5-minute fps benchmark, with each test consisting of the same route from the same starting point: running, driving through the city, stopping in the country, swimming, and boating.  Unigine just ran the standard tests.  And finally, for 3DMark, I ran the normal tests, but I also ran the FRAPS fps benchmark during the demo for each test, recording the minimum, average, and maximum frame rates.

Results:
    I recorded the results in a spreadsheet and created graphs to make comparison easier.  Each test has two sets of results and two graphs, one for Medium and one for Ultra.  These show the effect of a processor upgrade based on the graphics settings (first four results), as well as the effect of a GPU upgrade (last four).  Different games and different settings will show different results from the possible upgrades shown here.  I’ll go through each set of results with a short review of where the bottleneck is and the effects of a CPU or GPU upgrade.
    Min, Max and Avg…  I included all of these, but Avg Frame Rate is typically the best measure for comparison because Min and Max can be affected by “peaks” and “valleys” that don’t really reflect overall performance.  We will see that a lot.
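To see why a single peak makes Max FPS misleading, here is a small example with made-up frame-rate samples — one brief spike inflates the max while the average barely moves:

```python
# Hypothetical per-second fps samples: a steady 60fps run, and the same run
# with a single one-second spike to 140fps.
steady = [60] * 60
spiked = [60] * 59 + [140]

def summarize(samples):
    """Return (min, avg, max) for a list of fps samples."""
    return min(samples), round(sum(samples) / len(samples), 1), max(samples)

print(summarize(steady))  # (60, 60.0, 60)
print(summarize(spiked))  # max leaps to 140, but the average barely moves
```

The same logic applies in reverse to Min FPS and momentary dips, which is why Avg is the steadier basis for comparison.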

 

Hitman Absolution
    At medium settings, the CPU upgrade path makes the most sense, and it's easy to see the benefit of a faster quad-core processor.  The max fps has a higher peak with the E8400, but min and avg saw a substantial increase, even with the lower clock speed of the Q9500.  This will be common in many results, so I won’t mention it each time: some games like more cores, others just benefit from raw processing speed.  In each test we can clearly see that at Medium settings the performance is limited by CPU power, meaning the bottleneck is the CPU.  This is clearly visible because as CPU power and speed increase, the results scale up nearly identically.  With the 980Ti and the two i5 processors, the Max Frame Rate might suggest the GPU is bottlenecking a bit here, because the values are so close together and not scaling with the CPU performance difference, but this is most likely one of those cases where the Max FPS peaked very high for one reason or another.  This is why Avg Frame Rate is the more important measure.

Hitman Medium.JPG

 

    For the Ultra settings test, we see a very different story.  It is clear that a GPU upgrade is the way to go if you want better performance with Ultra-level details.  There is barely any performance gain from the E8400 to the i5… and none at all from the 1.4GHz increase offered by the 2500k.  This clearly shows that the bottleneck in the first four results is the GTX 960, as it is the limiting factor for all of those configurations.  Once we upgrade to the 980Ti, the performance gain is higher even with the E8400 dual-core.  The 980Ti scales nicely from the E8400 to the i5-2500, and then we see something else that shows a processor upgrade is not always required, depending on your preferred gaming settings.  The results from the 2500k are better, but they do not scale up with the 30% clock speed increase compared to the regular 2500.  This indicates that the CPU and the GPU are trading the bottleneck back and forth.  If the CPU were the only limiting factor, the results should scale up with the CPU performance, but 30% more processor speed only gives us 10% faster performance.
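That "doesn't scale with the clock" reasoning can be made concrete: if the CPU were the only limiting factor, fps should rise roughly in proportion to clock speed, and any shortfall hints at a shared bottleneck.  A rough check, using the approximate 30%-clock / 10%-fps figures from above (the ratios are illustrative, not the exact benchmark numbers):

```python
def scaling_efficiency(clock_ratio, fps_ratio):
    """Fraction of a clock-speed increase that showed up as extra fps
    (1.0 = fps scaled perfectly with clock, 0.0 = no gain at all)."""
    return (fps_ratio - 1.0) / (clock_ratio - 1.0)

# ~30% more clock speed yielded only ~10% more fps:
eff = scaling_efficiency(1.30, 1.10)
print(round(eff, 2))  # 0.33 -> only a third of the extra clock became fps,
                      # so something besides the CPU is also limiting performance
```

A value near 1.0 would point at a pure CPU bottleneck; a value near 0.0 would point at a pure GPU bottleneck; values in between suggest the two are trading off.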

Hitman Ultra.JPG

 

Batman: Arkham Origins:
    At medium settings for Batman we see nice scaling from a CPU upgrade, and another oddly high Max FPS peak with the 960 and 2500k.  But the 980Ti results show clearly that the CPU is the limiting factor (bottleneck).  You’ll notice that the performance increase from the 2500 to the 2500k is actually more than the 30% difference in CPU clock speed.  This is most likely due to one more aspect I haven’t discussed yet, which is system memory speed.  The system RAM with the 2500 is only at 1333MHz, but with the 2500k it’s running at 2133MHz.  I chose those configurations because I wanted to represent average configurations that people might have and compare them to upgrade considerations.  More on that in the conclusion of this guide.

Batman Medium.JPG

 

    For the Ultra settings in Batman, we again see that the 960 is the primary bottleneck in the first four results, some evidence suggesting a GPU bottleneck with the 980Ti-2500k configuration, and once more that upgrading the GPU is clearly the better option for Ultra settings, even with the dual-core E8400.  Even though the Min FPS of the E8400-980Ti drops below the 2500k-960, the average is still noticeably faster and offers twice the performance gain if you must choose between upgrading the CPU or the GPU but cannot get both.

Batman Ultra.JPG

 

 

Shadow of Mordor:
    For Shadow of Mordor, the Medium results do not show as clear a scaling benefit as Hitman and Batman.  This is evidence that the game itself is very GPU dependent, meaning that regardless of which CPU you have and which settings you are running, a slower GPU will result in slower performance; thus the GPU is a bottleneck even at lower settings.  The CPU upgrade from the E8400 to the 2500 (paired with the 960) is still faster than only upgrading the GPU to the 980Ti (paired with the E8400), but not nearly as beneficial as in the previous games.  With games that are heavily GPU dependent, we also typically see that raw CPU speed matters more than core count (I believe Tomb Raider is similar).  Performance actually drops from the E8400 to the Q9500 due to the slower clock speed.  This is because in a GPU-dependent game, the CPU mostly just needs to feed data to the GPU rather than calculate much of anything itself, so the higher-clocked dual-core helps more.

Mordor Medium.JPG

 

    Ultra settings on Mordor show results similar to what we’ve seen so far: the 960 is clearly the bottleneck in the first four results, and a GPU upgrade would be more beneficial.  We also see that the GPU is clearly the limiting factor between the 2500 and the 2500k, as the Min and Avg FPS don’t budge.  Upgrading to a 6700k with the 980Ti would likely not help very much in this situation.

Mordor Ultra.JPG

 

 

Watch Dogs:
    At Medium settings, Watch Dogs clearly benefits from a faster CPU.

WatchDogs Medium.JPG

 

At Ultra settings, the GTX 960 proves to be the bottleneck once again.  However, this game is so CPU dependent that just upgrading the CPU to the 2500 is significantly better than upgrading to the 980Ti GPU (if you have to choose one or the other).  And the scaling on the 980Ti results suggests that an upgrade to the 6600k or 6700k would yield even better results.  (Crysis 3 results would be similar)

WatchDogs Ultra.JPG

 

Unigine: Heaven
    Unigine Heaven is the first benchmark in our testing, and we can see that the GTX 960 is a limiting factor in both sets of tests.  Though there is a little improvement gained from the CPU, it doesn’t really amount to much.  Moving to the 980Ti is a much more cost-effective improvement across the board.  Interestingly, we can see how the CPU influences performance as well, but it scales differently than in the other tests we performed earlier.  The gains appear to follow the “series” of CPU rather than the raw processing power and speed.  This is most likely an indication that the Unigine benchmark takes advantage of different features of the processor rather than just its speed.  The 6600k / 6700k may very well bump this score up quite a bit; I’m not sure, but maybe somebody else could answer that question.  The scoring for the Unigine tests is directly related to the fps performance, so those graphs reflect the same differences in performance.

Unigin-H-fps Medium.JPG

Unigin-H-fps Ultra.JPG

Unigin-H-Score Medium.JPG

Unigin-H-Score Ultra.JPG

 

Unigine: Valley
    Unigine Valley takes more advantage of CPU processing power, which is reflected in the results.  On Medium, the increased CPU power definitely helps a lot more, and with the 2500k you can easily see that the CPU is probably a bit of a bottleneck, given that the scores are so close between the 960 and the 980Ti.  However, in the Ultra detail test it’s clear that the GTX 960 is the bottleneck, and even the E8400 is able to feed data fast enough to the 960.  Once again, if you have to choose between the two, a GPU upgrade is the clear winner if you want to play with high detail levels.

Unigin-V-fps Medium.JPG

Unigin-V-fps Ultra.JPG

Unigin-V-Score Medium.JPG

Unigin-V-Score Ultra.JPG

 

3D Mark
    For the 3DMark tests I used FRAPS to get the fps benchmark on ONLY the demo, and then allowed the tests to run through to completion to get the overall score.  I left the defaults and ran all three tests, which provide a good range of detail from Medium, High, and Ultra levels with Cloud Gate, Sky Diver, and Fire Strike, respectively.  The fps results were recorded because the overall score places quite a bit of weight on the CPU Physics score, which is not a fair weighting when considering what detail level somebody may be running on a lower-end rig.

 

    Cloud Gate – The fps results of the demo show the 960 being matched up well with the Core2-series processors, but clearly a bottleneck when paired with the Core i5, and a CPU upgrade at this detail level is clearly the better option.  However, there is much less than a 30% gain between the 2500 and the 2500k with the 980Ti, which suggests that even the non-K CPU is keeping up fairly well.  The overall scoring shows a significant increase for an upgraded CPU; again, this is due to the weight of the CPU performance in the scoring.

3DM-CG-fps.JPG

3DM-CG-score.JPG


    Sky Diver – For the “high-settings” equivalent, the fps results from the demo show that while a CPU upgrade will help the 960 perform better (CPU being the bottleneck), it’s still clear that a GPU upgrade alone will result in better performance.  The overall score shows once again that the CPU plays a significant role in the results.

3DM-SD-fps.JPG

3DM-SD-score.JPG


    Fire Strike – As with the other Ultra-equivalent detail tests, the GTX 960 is clearly the bottleneck, and a GPU upgrade makes more sense in a bang-4-buck assessment.  Yes, I am aware this is not the “Ultra” Fire Strike test, but that’s not the purpose here.  Even the overall score reflects this same trend at this level of detail for 1080p gaming.  At this level, even the 980Ti is a bottleneck, easily shown by the exact same fps results between the 2500 and the 2500k.

3DM-FS-fps.JPG

3DM-FS-score.JPG

 

Conclusion
    So… Bottle-Necks and choosing a CPU upgrade or a GPU upgrade (if you can’t afford both)….
The tests I performed here do not represent everybody’s situation, but rather a mainstream average for graphics… and definitely the lower end of the spectrum for CPUs.  Though, in the quad-core spectrum, Skylake is the first CPU since Sandy Bridge that really raises the performance bar high enough to consider spending the money on an upgrade… speaking in terms of “core-for-core” and “clock-for-clock”.
    If you are running Medium-High detail levels in your games and you are happy with that, you might want to consider a CPU upgrade first.  This will increase your overall system performance and give you a nice bump in gaming performance at 1080p (possibly even more at lower resolutions like 1600 or 1440x900).  However, if you want to run games at High-Ultra detail levels, your best bet is to opt for the GPU upgrade, as in many cases it might not just match, but surpass, the performance gain from a significant upgrade in CPU power.
That said… if you are running a dual-core, ANYTHING is an upgrade… that should be your focus.  Even a slight step up to a dual-core with Hyper-Threading will help out here.  More and more games are taking advantage of more processor cores, and that trend will only continue.  A new game optimized for more cores will probably benefit much more from a significant CPU upgrade, as long as your GPU is good enough to keep up.

 

Again, every situation is different based on the criteria I mentioned at the beginning, but hopefully this provides you with more of an idea of where bottlenecking can exist and what factors can cause it.

 

Nice guide! Will definitely help out a couple of people on the forum


3 minutes ago, Julian5 said:

Nice guide! Will definitely help out a couple of people on the forum

Thanks... I really wish I could afford a new 6700k system right now... and a GTX 1080... since that is the biggest question people ask... but I have enough AAA titles in my Steam account from the LAST TWO sales to keep me going until next year anyway...


One grain of salt: the minimums, which represent dips and stutter that can really affect the overall experience.


6 minutes ago, Megahurt said:

One grain of salt: the minimums, which represent dips and stutter that can really affect the overall experience.

Agreed!  I thought I had mentioned that in one of the intro paragraphs; I'll go back and look... and also that the maximums are only "peaks"..
