Low FPS with low GPU Usage

SoggyHippo

So I have a GTX 1070 Ti, an i5 9600K, and a Dell U2412M. In Fortnite my frames max out at about 140 FPS, but the framerate always drops and I don't know why. GPU usage is at about 40%-65%. I think it may be because of the monitor, since it's 1920x1200 at 60Hz. The game is at max settings, and CPU usage is 70%-100%. Please help.

Gaming PC: Intel Core i5 9600k ; MSI Aero GTX 1070 TI ; MSI Z390-A PRO ; G.Skill Trident Z 16GB 2400MHz ; Corsair RM750X ; Corsair Crystal 570x ; Samsung 850 EVO 250GB ; PNY SSD ; Corsair H75 AIO

 

Laptop: 2017 MacBook Pro 13" 256GB SSD

 

 


Just now, SoggyHippo said:

So I have a GTX 1070 Ti, an i5 9600K, and a Dell U2412M. In Fortnite my frames max out at about 140 FPS, but the framerate always drops and I don't know why. GPU usage is at about 40%-65%. I think it may be because of the monitor, since it's 1920x1200 at 60Hz. The game is at max settings, and CPU usage is 70%-100%. Please help.

By default vsync is off, so it doesn't matter that you have a 60Hz monitor. Your CPU usage should not hit 100 percent; perhaps 60 percent, but not 100. Something is fishy. I recommend running DDU (Display Driver Uninstaller) in safe mode, cleaning and restarting, then back in normal mode installing the latest WHQL drivers, and letting us know how Fortnite runs. By default the drivers are set to application-controlled for vsync, so make sure you don't have vsync on in Fortnite. Let us know what happens. Good luck!
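To double-check that the reinstall took, nvidia-smi (which ships with the NVIDIA driver) can report the installed version. A minimal Python sketch, assuming nvidia-smi is on your PATH:

import subprocess

# nvidia-smi is installed alongside the NVIDIA driver; ask it for the
# driver version and GPU name it currently sees.
result = subprocess.run(
    ["nvidia-smi", "--query-gpu=driver_version,name", "--format=csv,noheader"],
    capture_output=True, text=True, check=True,
)
print(result.stdout.strip())  # prints "<driver_version>, <gpu name>"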

Asus Sabertooth x79 / 4930k @ 4500 @ 1.408v / Gigabyte WF 2080 RTX / Corsair VG 64GB @ 1866 & AX1600i & H115i Pro @ 2x Noctua NF-A14 / Carbide 330r Blackout

Scarlett 2i2 Audio Interface / KRK Rokits 10" / Sennheiser HD 650 / Logitech G Pro Wireless Mouse & G915 Linear & G935 & C920 / SL 88 Grand / Cakewalk / NF-A14 Int P12 Ex
AOC 40" 4k Curved / LG 55" OLED C9 120hz / LaCie Porsche Design 2TB & 500GB / Samsung 950 Pro 500GB / 850 Pro 500GB / Crucial m4 500GB / Asus M.2 Card


3 minutes ago, ELECTRISK said:

Because your CPU is garbage and is bottlenecking your GPU.

Upgrade CPU. 

This is wildly incorrect; the 9600K is actually a really decent CPU and will by no means bottleneck a 1070 Ti. Heck, even an 8600K won't hold a 1070 Ti back (ask @LukeSavenije for further information).

Quote or tag me( @Crunchy Dragon) if you want me to see your reply

If a post solved your problem/answered your question, please consider marking it as "solved"

Community Standards // Join Floatplane!


OP, what game are you playing where you're seeing this? Sike, in Fortnite.

 

Okay, how does it perform in other games?

Community Standards || Tech News Posting Guidelines

---======================================================================---

CPU: R5 3600 || GPU: RTX 3070|| Memory: 32GB @ 3200 || Cooler: Scythe Big Shuriken || PSU: 650W EVGA GM || Case: NR200P


1 minute ago, Crunchy Dragon said:

This is wildly incorrect; the 9600K is actually a really decent CPU and will by no means bottleneck a 1070 Ti. Heck, even an 8600K won't hold a 1070 Ti back (ask @LukeSavenije for further information).

Just because I lied about it bottlenecking the 1070 Ti doesn't make it not garbage. Intel has released CPUs since then, so it is garbage.


12 minutes ago, SoggyHippo said:

So I have a GTX 1070 TI, I5 9600k, and a Dell U2412M. In Fortnite, I get frames maxing at about 140 fps but it always drops and I don't why. The GPU usage is at about 40%-65%. I think it may be because of the monitor because it is at 1200x1920 at 60Hz. Also the game is at max settings. And the cpu usage is 70%-100%.  Please help.

What is the GPU temp?

PC: CPU: i5-9600k - CPU Cooler: be quiet! Dark Rock Pro 4 - GPU: Sapphire Radeon RX 5700 XT 8GB GDDR6 - Motherboard: ASRock - Z370 Extreme4 - RAM: Team - T-Force Delta RGB 16 GB DDR4-3000 - PSU: Corsair - TXM Gold 550 W 80+ Gold Certified Semi-Modular ATX Power Supply - Case: Thermaltake - Core G21 TG


Just now, ELECTRISK said:

Just because I lied about it bottlenecking the 1070 Ti doesn't make it not garbage. Intel has released CPUs since then, so it is garbage.

Why are you lying in the first place?

 

Also, just because there's something newer doesn't mean the old stuff is automatically trash. I daily-drive a GTX 780. My phone is a Galaxy S3. Even my CPU is a Ryzen 5 1600. Look up some actual benchmarks before making statements like that.

 

Intel's lack of innovation is such that even a 2600K isn't that much worse than an 8700K, which just makes your point look a little silly, if I'm honest. The 2600K came out in 2011.



3 minutes ago, ELECTRISK said:

I'm literally not. I don't deserve to get banned just because he has garbage hardware.

Tell us why specifically; get into the down and dirty of why the 9600K is bad.



1 minute ago, Crunchy Dragon said:

Why are you lying in the first place?

 

Also, just because there's something newer doesn't mean the old stuff is automatically trash. I daily-drive a GTX 780. My phone is a Galaxy S3. Even my CPU is a Ryzen 5 1600. Look up some actual benchmarks before making statements like that.

 

Intel's lack of innovation is such that even a 2600K isn't that much worse than an 8700K, which just makes your point look a little silly, if I'm honest. The 2600K came out in 2011.

I lied to encourage an unnecessary upgrade.

It's not my fault you have a GTX 780 and a Galaxy S3.

Find one person with a 2600K on their spec list who's bragging about it. Bragging rights make new hardware worth it.


2 minutes ago, lmeneses said:

What is the GPU temp?

The GPU temp is 127 degrees Fahrenheit. 



1 minute ago, Slottr said:

Tell me why specifically; get into the down and dirty of why the 9600K is bad.

It's not the newest processor released by Intel. 


OP, if you can, run MSI Afterburner with the on-screen display while playing, set to show your CPU usage %, temps, and clock speed, as well as GPU usage %, temps, and clock speed. RAM usage might help as well.
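If Afterburner isn't handy, roughly the same counters can be logged from a short script. A minimal sketch, assuming the third-party psutil and GPUtil packages are installed (pip install psutil gputil); Afterburner's overlay is still the more complete option since it also shows clock speeds:

import psutil   # CPU and RAM counters
import GPUtil   # NVIDIA GPU counters, read via nvidia-smi

# Print one line of usage stats per second while the game runs.
while True:
    cpu = psutil.cpu_percent(interval=1)   # % load averaged across all cores
    ram = psutil.virtual_memory().percent  # % of system RAM in use
    gpu = GPUtil.getGPUs()[0]              # first detected GPU
    print(f"CPU {cpu:5.1f}%  RAM {ram:5.1f}%  "
          f"GPU load {gpu.load * 100:5.1f}%  GPU temp {gpu.temperature:.0f}C")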

What graphical settings in game are you using?
 

17 minutes ago, SoggyHippo said:

In Fortnite my frames max out at about 140 FPS, but the framerate always drops and I don't know why.

Is it just a short, momentary drop in frames that then recovers back to 140 FPS, or does the FPS gradually get lower and lower the longer you play?
When it does drop, what does it drop to?
Do the drops in FPS coincide with action happening on screen, like intense fights with other players, or entering a large city?
 

Quote

G.Skill Trident Z 16GB 2400MHz

Is that 2x8GB sticks or a single 16GB stick? (A single stick runs in single-channel mode, which cuts memory bandwidth and can make CPU-limited framerates worse.)

PS. Don't feed the April Fool trolls.

CPU: Intel i7 6700k  | Motherboard: Gigabyte Z170x Gaming 5 | RAM: 2x16GB 3000MHz Corsair Vengeance LPX | GPU: Gigabyte Aorus GTX 1080ti | PSU: Corsair RM750x (2018) | Case: BeQuiet SilentBase 800 | Cooler: Arctic Freezer 34 eSports | SSD: Samsung 970 Evo 500GB + Samsung 840 500GB + Crucial MX500 2TB | Monitor: Acer Predator XB271HU + Samsung BX2450


Just now, ELECTRISK said:

I lied to encourage an unnecessary upgrade.

Why? That's clearly not the issue here.

1 minute ago, ELECTRISK said:

It's not my fault you have a GTX 780 and a Galaxy S3.

I'm not saying it is, just that I still use it and it's not "garbage" just because there are newer products.

1 minute ago, ELECTRISK said:

Bragging rights make new hardware worth it. 

Hardware is really only worth it when the price you pay matches the performance you get from it.



3 minutes ago, SoggyHippo said:

The GPU temp is 127 degrees Fahrenheit. 

WOAH NELLY 

 

WOAH I CAN'T READ



8 hours ago, SoggyHippo said:

The GPU temp is 127 degrees Fahrenheit. 

That's what it runs at in game?

 

8 hours ago, Slottr said:

WOAH NELLY 

That's only 52 degrees Celsius.
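The conversion, for anyone following along, is C = (F - 32) * 5/9, so 127°F comes out to about 52.8°C:

# Fahrenheit to Celsius: subtract 32, then scale by 5/9.
def f_to_c(f: float) -> float:
    return (f - 32) * 5 / 9

print(f"{f_to_c(127):.1f} C")  # 52.8 C -- a perfectly normal load temperature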



1 minute ago, Spotty said:

OP, if you can, run MSI Afterburner with the on-screen display while playing, set to show your CPU usage %, temps, and clock speed, as well as GPU usage %, temps, and clock speed. RAM usage might help as well.

What graphical settings in game are you using?
 

Is it just a short, momentary drop in frames that then recovers back to 140 FPS, or does the FPS gradually get lower and lower the longer you play?
 

Is that 2x8GB sticks or a single 16GB stick?

PS. Don't feed the April Fool trolls.

The frames drop a lot; what I was saying is that the maximum I get is 140 FPS, but it never stays there for long. It's two 8GB sticks.



1 minute ago, lmeneses said:

That's only 52 degrees Celsius.

Holy, I read that wrong. I think you see what I did lol



2 minutes ago, SoggyHippo said:

The frames drop a lot; what I was saying is that the maximum I get is 140 FPS, but it never stays there for long. It's two 8GB sticks.

Try turning details down one notch, and maybe the same with AA.



1 minute ago, Crunchy Dragon said:

Why? That's clearly not the issue here.

 

It's in his best interest to upgrade, and without deception he wouldn't do what's best for him.

2 minutes ago, Crunchy Dragon said:

I'm not saying it is, just that I still use it and it's not "garbage" just because there are newer products.

Actually, it is garbage. The point stands more for the S3 than the GTX 780, because it's still practical to run a 780 in 2019. But all trolling aside, why on earth are you using the S3?

 

5 minutes ago, Crunchy Dragon said:

Hardware is really only worth it when the price you pay matches the performance you get from it.

That's just your inferior opinion. 


3 minutes ago, Spotty said:

OP, if you can, run MSI Afterburner with the on-screen display while playing, set to show your CPU usage %, temps, and clock speed, as well as GPU usage %, temps, and clock speed. RAM usage might help as well.

What graphical settings in game are you using?
 

Is it just a short, momentary drop in frames that then recovers back to 140 FPS, or does the FPS gradually get lower and lower the longer you play?
When it does drop, what does it drop to?
Do the drops in FPS coincide with action happening on screen, like intense fights with other players, or entering a large city?
 

Is that 2x8GB sticks or a single 16GB stick?

PS. Don't feed the April Fool trolls.

I was using all low settings except view distance. I just tested with max settings and I'm getting 10-100 FPS depending on where I look and how I move. GPU usage is 80-90%, CPU is 100%, RAM is 46%.



9 minutes ago, lmeneses said:

That's what it runs at in game?

Yes

 



3 minutes ago, SoggyHippo said:

Yes

 

Interesting. Do you have any power limits on your GPU? And what is your CPU temp?



Just now, lmeneses said:

Interesting. Do you have any power limits on your GPU? And what is your CPU temp?

CPU temp is 120 degrees Fahrenheit. And what do you mean by power limits? It has one 8-pin connector from a Corsair RM750X.
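The power limit being asked about is the board power cap enforced by the driver, not the PSU cabling. A quick way to check the current draw against that cap, sketched with nvidia-smi called from Python (assuming an NVIDIA driver install with nvidia-smi on the PATH):

import subprocess

# Report current board power draw next to the enforced power limit.
out = subprocess.run(
    ["nvidia-smi", "--query-gpu=power.draw,power.limit", "--format=csv"],
    capture_output=True, text=True, check=True,
)
print(out.stdout)  # two columns: power.draw [W], power.limit [W]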



6 minutes ago, SoggyHippo said:

CPU temp is 120 degrees Fahrenheit. And what do you mean by power limits? It has one 8-pin connector from a Corsair RM750X.

I just want to point out that neither your CPU nor your GPU will be reaching 100% in Fortnite.

 

What framerate are you targeting?



8 hours ago, Turtle Rig said:

By default vsync is off, so it doesn't matter that you have a 60Hz monitor. Your CPU usage should not hit 100 percent; perhaps 60 percent, but not 100. Something is fishy. I recommend running DDU (Display Driver Uninstaller) in safe mode, cleaning and restarting, then back in normal mode installing the latest WHQL drivers, and letting us know how Fortnite runs. By default the drivers are set to application-controlled for vsync, so make sure you don't have vsync on in Fortnite. Let us know what happens. Good luck!

I ran DDU and restarted my PC. I attempted to install NVIDIA display driver v419.67 and it failed the first time; the second time it worked. Now in Fortnite I am getting 60-160 FPS with max settings. (I had vsync off the whole time.)

 

7 hours ago, Crunchy Dragon said:

I just want to point out that neither your CPU nor your GPU will be reaching 100% in Fortnite.

 

What framerate are you targeting?

Sorry for the late reply. I would have thought that one would reach 100%. So that means something else is bottlenecking both? Also, I was aiming for a pretty much solid 120 FPS.


