Graphics card VS CPU 'bottlenecking' in games

Kalm_Traveler

Wanted to get a discussion going to figure out the best ways to determine where the bottleneck lies on a given setup. 

 

This came to mind because last year, when I upgraded from one GTX 1080 to two 1080s in SLI, then again to a pair of 1080 Tis, I checked Witcher 3 FPS (the only modern game I've played recently, since WoW only sort of counts) with the same maxed-out settings and everything else in my setup remaining more or less the same.

 

My screen all this time has been a 3440 x 1440 100Hz G-Sync Asus display and the CPU has been the same 6900K @ 4.5 GHz, but the RAM was changed from 8 x 8GB 2666 CL16 during the 1080 and 1080 Ti testing to 4 x 16GB 3200 CL14 for the Titan V testing now, so that could play a role as well.

 

Basically, as I recall, with the SLI 1080s Witcher 3 would be around 40-60 FPS, with a single 1080 Ti it was about the same, and with SLI 1080 Tis it was around 67-80 FPS. I finally got around to playing it again last night with the Titan V (technically both cards are installed and the 2nd one is being used for PhysX automatically by the Nvidia Control Panel) and now Witcher 3 is running between about 95-110 FPS.
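For a rough sense of scale, here's the back-of-the-envelope math on those ranges (just the midpoints of the FPS figures above, nothing actually measured):

# Back-of-the-envelope comparison using the midpoints of the FPS ranges quoted above.
# These are rough recollections, not benchmark numbers.
midpoints = {
    "SLI 1080 (and single 1080 Ti)": (40 + 60) / 2,   # ~50 FPS
    "SLI 1080 Ti": (67 + 80) / 2,                      # ~73.5 FPS
    "Titan V": (95 + 110) / 2,                         # ~102.5 FPS
}
baseline = midpoints["SLI 1080 Ti"]
for config, fps in midpoints.items():
    print(f"{config}: ~{fps:.0f} FPS ({(fps / baseline - 1) * 100:+.0f}% vs SLI 1080 Ti)")

So going by the midpoints, the Titan V is roughly a 40% jump over the SLI 1080 Tis from a single rendering card.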

 

-------------

 

The reason this is strange to me is that when I looked at CPU and GPU utilization last year with the 1080 Tis, it appeared that 2 or 3 CPU cores were pegged at 99% utilization when running Witcher 3, while both graphics cards were around 80% utilization. This led me to believe that my CPU was the bottleneck at the time, but all I've done is install a bit faster RAM and better graphics cards (not even in SLI), and I've seen a huge performance increase.

 

I didn't check CPU utilization last night, but the FPS increase alone seems to make it pretty obvious that in my previous testing with the 1080 Tis the 6900K was not actually the bottlenecking part, right?
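If anyone wants to repeat the utilization check, this is roughly how I'd log it over a play session. A minimal sketch only, assuming psutil is installed and nvidia-smi is on the PATH; the one-second interval and 60-sample count are arbitrary placeholders, not what I actually used:

# Minimal sketch: sample per-core CPU load and GPU load side by side while the game runs.
# Assumes psutil is installed and nvidia-smi is on the PATH; interval/count are placeholders.
import subprocess
import psutil

def gpu_utilization():
    # nvidia-smi prints one value per installed GPU, e.g. "80" -> 80% load
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=utilization.gpu", "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    )
    return [int(v) for v in out.stdout.split()]

for _ in range(60):  # roughly a minute of data at 1 Hz
    cores = psutil.cpu_percent(interval=1.0, percpu=True)  # blocks for the sample interval
    gpus = gpu_utilization()
    print(f"busiest core: {max(cores):5.1f}%   GPU load: {gpus}")

Watching the busiest core next to GPU load over a whole minute of gameplay gives a much better picture than a single glance at a monitoring overlay.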

HEDT: i9 10980XE @ 4.9 GHz, 64GB @ 3600MHz CL14 G.Skill Trident-Z DDR4, 2x Nvidia Titan RTX NVLink SLI, Corsair AX1600i, Samsung 960 Pro 2TB OS/apps, Samsung 850 EVO 4TB media, LG 38GL950G-B monitor, Drop CTRL keyboard, Decus Respec mouse

Laptop: Razer Blade Pro 2019 9750H model, 32GB @ 3200MHz CL18 G.Skill Ripjaws DDR4, 2x Samsung 960 Pro 1TB RAID0, repasted with Thermal Grizzly Kryonaut
Gaming Rig: i9 9900KS @ 5.2GHz, 32GB @ 4000MHz CL17 G.Skill Trident-Z DDR4, EVGA RTX 2080 Ti Kingpin, Corsair HX1200, Samsung 970 EVO Plus 2TB, Asus PG348Q monitor, Corsair K70 LUX RGB keyboard, Corsair Ironclaw mouse
HTPC: i7 7700 (delidded + LM), 16GB @ 2666MHz CL15 Corsair Vengeance LPX DDR4, MSI GeForce GTX 1070 Gaming X, Corsair SFX 600, Samsung 850 Pro 512GB, Samsung Q55R TV, Filco Majestouch Convertible 2 TKL keyboard, Logitech G403 wireless mouse


The Titan V being a completely different architecture with different drivers could certainly have been a factor.

There's a time and place for everything! But not now. - Professor Oak

i7 2600K 4.3GHz  -  GTX 1060 3GB  - ASUS P8Z68-V - 16GB DDR3-1600 CL9 - EIZO 1080p 120Hz VA

Intel Skulltrail: 2x Core 2 Quad QX9775 - Intel D5400XS - 16GB FB DDR2-800 CL5 Quad Channel

EVGA SR-2 Classified - 2x Xeon X5675 4.2GHz - 24GB DDR3-1830 C10 Triple Channel

Intel Skulltrail #2: 2x Xeon E5472  - Intel D5400XS - 16GB FB DDR2-667 CL5 Quad Channel

11 minutes ago, Kalm_Traveler said:

but all I've done is install a bit faster RAM,

Memory frequency plays a much bigger role these days; the performance gap between 2133MHz and 3200MHz is huge in the latest, more modern titles. Also, yes, 100% usage is just the easiest way to spot a CPU bottleneck, not the only way. I think single-core-wise the i7 6900K can indeed hold back a dual 1080 Ti setup somewhat. Also, a single card configuration will always work better than a dual setup.

 

Performance scaling will simply always be better on single-card configurations.
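To make that rule of thumb a bit more concrete, something like the check below is what I mean. Only a rough sketch, and the 95% / 90% thresholds are arbitrary numbers I picked, not hard rules:

# Rough rule-of-thumb classifier for one sample of per-core CPU load and per-GPU load.
# A core pegged near 100% while the GPU still has headroom usually points at a CPU limit.
def classify(core_loads, gpu_loads, cpu_thresh=95.0, gpu_thresh=90.0):
    busiest_core = max(core_loads)
    busiest_gpu = max(gpu_loads)
    if busiest_core >= cpu_thresh and busiest_gpu < gpu_thresh:
        return "likely CPU-bound"
    if busiest_gpu >= gpu_thresh and busiest_core < cpu_thresh:
        return "likely GPU-bound"
    return "unclear (frame cap, engine limit, or poor SLI scaling)"

# The numbers you described (2-3 cores near 99%, both GPUs around 80%) land on the CPU side:
print(classify([99.0, 99.0, 98.0, 40.0, 25.0, 10.0], [80.0, 80.0]))  # -> likely CPU-bound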

Personal Desktop:

CPU: Intel Core i7 10700K @ 5GHz |~| Cooling: bq! Dark Rock Pro 4 |~| MOBO: Gigabyte Z490 UD ATX |~| RAM: 16GB DDR4 3333MHz CL16 G.Skill Trident Z |~| GPU: RX 6900XT Sapphire Nitro+ |~| PSU: Corsair TX650M 80Plus Gold |~| Boot: SSD WD Green M.2 2280 240GB |~| Storage: 1x 3TB HDD 7200rpm Seagate Barracuda + SanDisk Ultra 3D 1TB |~| Case: Fractal Design Meshify C Mini |~| Display: Toshiba UL7A 4K/60Hz |~| OS: Windows 10 Pro.

Luna, the temporary Desktop:

CPU: AMD R9 7950XT  |~| Cooling: bq! Dark Rock 4 Pro |~| MOBO: Gigabyte Aorus Master |~| RAM: 32G Kingston HyperX |~| GPU: AMD Radeon RX 7900XTX (Reference) |~| PSU: Corsair HX1000 80+ Platinum |~| Windows Boot Drive: 2x 512GB (1TB total) Plextor SATA SSD (RAID0 volume) |~| Linux Boot Drive: 500GB Kingston A2000 |~| Storage: 4TB WD Black HDD |~| Case: Cooler Master Silencio S600 |~| Display 1 (leftmost): Eizo (unknown model) 1920x1080 IPS @ 60Hz|~| Display 2 (center): BenQ ZOWIE XL2540 1920x1080 TN @ 240Hz |~| Display 3 (rightmost): Wacom Cintiq Pro 24 3840x2160 IPS @ 60Hz 10-bit |~| OS: Windows 10 Pro (games / art) + Linux (distro: NixOS; programming and daily driver)

Not many have done testing with RAM speeds in Witcher 3, but in Overwatch, going from 1600MHz to 3000MHz RAM can net you 100+ FPS, everything else in the system being the same.

QUOTE ME IN A REPLY SO I CAN SEE THE NOTIFICATION!

When there is no danger of failure there is no pleasure in success.


5 minutes ago, Princess Cadence said:

Memory frequency plays a much bigger role these days; the performance gap between 2133MHz and 3200MHz is huge in the latest, more modern titles. Also, yes, 100% usage is just the easiest way to spot a CPU bottleneck, not the only way. I think single-core-wise the i7 6900K can indeed hold back a dual 1080 Ti setup somewhat. Also, a single card configuration will always work better than a dual setup.

 

Performance scaling will simply always be better on single-card configurations.

 

I understand that RAM speed will have an impact, but 2666 to 3200 is only about half the jump you mentioned (a 534MHz increase versus the 1067MHz difference between 2133 and 3200).

 

From the older testing it is obvious that, all else being equal, Witcher 3 'works better' with two of the same card in SLI than with one, as the FPS increased a bit with SLI 1080s over a single 1080, and with SLI 1080 Tis over a single 1080 Ti.

 

The reason I'm confused is that it looked like I was hitting a CPU bottleneck, since the SLI 1080 Tis were only at around 80% utilization while I was seeing several CPU cores pegged at ~100%.

 

At that point, several forums had suggested that I needed to get a 7700K (at the time) if I wanted to see a stable 100 FPS in Witcher 3 with everything maxed at 3440x1440, but obviously that has proven not to be the case.

 

Is there a more accurate way to figure out where a performance bottleneck is?

I'm probably going to switch to SLI Titan Xps (unless the new 1180/2080/whatever it is bests the Titan Xp), since the handful of games I play all benefit from SLI and I'm pretty sure two Titan Xps on water will outdo what I'm seeing from a single Titan V in those games. At the same time, this is making me want to make sure this build is as balanced as possible, i.e. there's not one glaring part that is way behind the rest of the system in overall performance.


15 minutes ago, Overl0rd said:

The Titan V being a completely different architecture with different drivers could certainly have been a factor.

For sure, it's a completely different part, though to be fair the Titan V runs on the GeForce driver stack just like all the GeForce-branded cards.

