Lower FPS on 4080 than expected

qjohn23
PC Specs at Bottom:
 
Hi everyone! So I recently upgraded from a 2060 to an MSI 4080 Gaming X Trio. I'm aiming to game at 1080p, and I knew before buying that my CPU (i9-9900k) would be the bigger bottleneck at that resolution; the research I did beforehand suggested roughly a 10-15% performance hit.

But compared to the many 1080p benchmarks I've watched, I'm getting almost half the frames in games like Cyberpunk and RDR2. RDR2 with everything on high-ultra sits at 75-80 FPS, and Cyberpunk gets the same. With all settings on low in Warzone 2.0 I'm only getting 90 FPS, and even turning on DLSS barely helps. I have the latest NVIDIA drivers installed, and my refresh rate is set to 240Hz in the NVIDIA Control Panel. I'm really hoping I'm just overlooking something simple, but I don't know what else to try. Any help is appreciated!
 
PC Specs:
MSI 4080 Gaming X Trio
i9-9900k
ASUS ROG STRIX Z390-E Motherboard
32 GB DDR4
EVGA 750W GQ 80+ Gold

Honestly speaking, buying a 4080 to play at 1080p was pretty dumb. It's going to have some major issues.

Generally you need to do a complete display driver removal with DDU (Display Driver Uninstaller), and once you've cleanly wiped the old driver, try again.

Also, playing on ultra/high with certain settings will always limit your FPS. Some of those settings are pretty trash for performance.


Definitely a CPU bottleneck. The 9900k is barely on par with a 12400, and in gaming the 12400 would beat it. I happen to have a 12400, and it sometimes bottlenecks my 6800 XT at 1440p; imagine a 9900k with a 4080 at 1080p...

I'd say you'd need a 13600k or 13700k to max out a 4080 at 1080p.

The 4080 is a 4K card anyway. If you plan to play at 1080p or 1440p, go with AMD; their GPUs somehow do better at lower resolutions but worse at 4K.

Not an expert, just bored at work. Please quote me or mention me if you would like me to see your reply. **may edit my posts a few times after posting**

CPU: Intel i5-12400

GPU: Asus TUF RX 6800 XT OC

Mobo: MSI PRO B760M-A WIFI DDR4 (previously Asus Prime B660M-A D4 WIFI)

RAM: Team Delta TUF Alliance 2x8GB DDR4 3200MHz CL16

SSD: Team MP33 1TB

PSU: MSI MPG A850GF

Case: Phanteks Eclipse P360A

Cooler: ID-Cooling SE-234 ARGB

OS: Windows 11 Pro

Pcpartpicker: https://pcpartpicker.com/list/wnxDfv
Displays: Samsung Odyssey G5 S32AG50 32" 1440p 165hz | AOC 27G2E 27" 1080p 144hz

Laptop: ROG Strix Scar III G531GU Intel i5-9300H GTX 1660Ti Mobile| OS: Windows 10 Home


19 minutes ago, Shimejii said:

Honestly speaking, buying a 4080 to play at 1080p was pretty dumb. It's going to have some major issues.

Generally you need to do a complete display driver removal with DDU (Display Driver Uninstaller), and once you've cleanly wiped the old driver, try again.

Also, playing on ultra/high with certain settings will always limit your FPS. Some of those settings are pretty trash for performance.

Yeah I was aware before buying it that it isn't a 1080p card. I wanted an upgrade and had the opportunity to purchase one. 

 

And I'll run DDU right now. I'll let you know if that helps. Thanks!

 

Are there any settings in particular that are not worth the performance decrease?


18 minutes ago, qjohn23 said:

Yeah I was aware before buying it that it isn't a 1080p card. I wanted an upgrade and had the opportunity to purchase one. 

 

And I'll run DDU right now. I'll let you know if that helps. Thanks!

 

Are there any settings in particular that are not worth the performance decrease?

You should be able to use Display Driver Uninstaller (DDU) for it.

Generally it comes down to what you're okay with losing, and sadly every game is going to be different. So it's hard to say which settings; I'd recommend looking into each game and seeing what others post about it.


39 minutes ago, Shimejii said:

You should be able to use Display Driver Uninstaller (DDU) for it.

Generally it comes down to what you're okay with losing, and sadly every game is going to be different. So it's hard to say which settings; I'd recommend looking into each game and seeing what others post about it.

Okay, great!

And one last question: I thought that for a CPU bottleneck to occur, usage should be around 98-99%. But my CPU hovers around 75% and I don't have a frame limiter. Why do you think that may be happening?

I even lowered the settings in the same games to all medium and got the same frame rates.


4 hours ago, qjohn23 said:

Okay, great!

And one last question: I thought that for a CPU bottleneck to occur, usage should be around 98-99%. But my CPU hovers around 75% and I don't have a frame limiter. Why do you think that may be happening?

I even lowered the settings in the same games to all medium and got the same frame rates.

Lowering settings will make your CPU work even harder. In most games, if your CPU is at 75% it's effectively maxed out, since you aren't using every core. Try raising your settings to shift some of the work off the CPU; if you use Afterburner, you want to see your GPU usage around 98-99% and your CPU usage go down.
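If you'd rather log the numbers than eyeball an overlay, here's a minimal sketch of that check in Python (my own illustration, not a tool from this thread; it assumes the `psutil` package is installed and that `nvidia-smi`, which ships with the NVIDIA driver, is on your PATH). Run it in a second window while the game is going: a GPU pinned near 98-99% means you're GPU-bound; a GPU well below that points at the CPU.

```python
# Hypothetical helper: log GPU vs CPU utilization once a second.
# Assumes: pip install psutil, and nvidia-smi available on PATH.
import subprocess

import psutil


def gpu_utilization_percent() -> int:
    """Ask nvidia-smi for the GPU's current utilization (%)."""
    out = subprocess.check_output(
        ["nvidia-smi",
         "--query-gpu=utilization.gpu",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    return int(out.strip().splitlines()[0])  # first GPU only


while True:
    # cpu_percent(interval=1.0) blocks for one second and returns the
    # average utilization across all logical cores over that window.
    cpu = psutil.cpu_percent(interval=1.0)
    gpu = gpu_utilization_percent()
    print(f"GPU {gpu:3d}%  |  CPU {cpu:5.1f}%")
```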


Because games won't ever use all the cores, there's no way the CPU would reach 100% usage. Just because usage isn't 100% doesn't mean the CPU is doing fine.

Games rarely use more than 4 cores; only very recent games are starting to use more than that. So you'll see that 2 or 4 of the CPU cores show much higher usage than the rest.

Games favor single-core performance because of this, and by today's standards the 9900k is weak; even a 12100F has faster single-core performance.

I guarantee you that without a CPU upgrade, or rendering the game at 4K or higher, the bottleneck will always be there.
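To see this for yourself, here's a small sketch (again just an illustration, assuming the `psutil` package; any per-core monitor like HWiNFO shows the same thing) that prints each core's load next to the average. If one or two cores sit near 100% while the average reads ~75%, the game's main thread is saturated and the CPU is the bottleneck even though the headline number looks fine.

```python
# Illustration: aggregate CPU % can hide a single-thread bottleneck.
# Assumes: pip install psutil. Run while the game is running.
import psutil

# Sample every logical core over a one-second window.
per_core = psutil.cpu_percent(interval=1.0, percpu=True)
average = sum(per_core) / len(per_core)

for i, load in enumerate(per_core):
    flag = "  <-- saturated" if load > 95 else ""
    print(f"core {i:2d}: {load:5.1f}%{flag}")

# e.g. loads of [100, 97, 80, 75, 60, 55, 50, 45, ...] average out to
# ~70%, yet the game is fully limited by the two busy threads.
print(f"average: {average:5.1f}%  (the number the overall gauge shows)")
```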

 



6 hours ago, rijzen said:

Lowering settings will make your CPU work even harder. In most games, if your CPU is at 75% it's effectively maxed out, since you aren't using every core. Try raising your settings to shift some of the work off the CPU; if you use Afterburner, you want to see your GPU usage around 98-99% and your CPU usage go down.

 

6 hours ago, Dukesilver27- said:

Because games won't ever use all the cores, there's no way the CPU would reach 100% usage. Just because usage isn't 100% doesn't mean the CPU is doing fine.

Games rarely use more than 4 cores; only very recent games are starting to use more than that. So you'll see that 2 or 4 of the CPU cores show much higher usage than the rest.

Games favor single-core performance because of this, and by today's standards the 9900k is weak; even a 12100F has faster single-core performance.

I guarantee you that without a CPU upgrade, or rendering the game at 4K or higher, the bottleneck will always be there.

 

Okay, good to know! When I was playing RDR2 on maxed settings, I saw that all 8 cores were at 57-73% load. It was the same situation in Cyberpunk. Doesn't that mean all of the cores are being used by the game?

[Attached screenshot: RDR2 CPU cores.png]


14 hours ago, Shimejii said:

Honestly speaking, buying a 4080 to play at 1080p was pretty dumb. It's going to have some major issues.

I don't really agree... I love playing with DSR, so 1080p technically wouldn't be 1080p, and if 1080p were an issue, everyone could just do that... problem solved!?

 

PS: I strongly believe OP has a CPU bottleneck regardless of resolution... don't most CPUs bottleneck a 4080 anyway?

 

 

The direction tells you... the direction

-Scott Manley, 2021

 

Software used: Corsair Link (Anime Edition), MSI Afterburner, OpenRGB, Lively Wallpaper, OBS Studio, Shutter Encoder, Avidemux, FSResizer, Audacity, VLC, WMP, GIMP, HWiNFO64, Paint, 3D Paint, GitHub Desktop, Superposition, Prime95, Aida64, GPUZ, CPUZ, Generic Logviewer


Yes, they are, but the load isn't spread evenly, which is an issue for nearly all games: the engine can't distribute the workload evenly across all cores, and that hinders performance. The CPU's cache size affects performance as well.

Anyway, RDR2 is a pretty well-optimized game; if your GPU can't reach ~100% utilization there, the CPU is the issue.

You can try overclocking it, but I doubt performance would increase much, maybe 1-3%.

The 4080 is faster than even a 3090 Ti; people get bottlenecked on 3080s at 1080p, even with 12th-gen CPUs.

Trust me or don't, but my opinion is still that the CPU is bottlenecking the GPU.



Just stepping in here to clear up some of the nonsense.

 

Bottleneck or no bottleneck, a CPU will only produce X frames per second, regardless of whether the GPU is underutilized or is the bottleneck.

What I'm reading is that the OP went from an RTX 2060 to a 4080 and is now getting roughly half the frames that benchmarks show; that is how the OP described it.

So if the CPU were the bottleneck, it should be giving the same frames it did with the RTX 2060 in CPU-bound scenarios.

 

There is an issue elsewhere.
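To put that reasoning in toy-model form (made-up numbers, purely to illustrate the logic, not a benchmark): the frame rate you observe is roughly the lower of what the CPU and the GPU can each deliver on their own, so a GPU upgrade should never push FPS below the old card's CPU-bound figure.

```python
# Toy model of a bottleneck: observed FPS is capped by the slower side.
# All numbers here are invented for illustration only.
def observed_fps(cpu_cap: float, gpu_cap: float) -> float:
    """Whichever component is slower sets the frame rate."""
    return min(cpu_cap, gpu_cap)


cpu_cap = 110.0  # hypothetical: frames/s a 9900k can prepare at 1080p

print(observed_fps(cpu_cap, gpu_cap=60.0))   # 2060-class GPU -> 60.0 (GPU-bound)
print(observed_fps(cpu_cap, gpu_cap=240.0))  # 4080-class GPU -> 110.0 (CPU-bound)
# The 4080 result should be *higher* than the 2060's, never half of it,
# so a plain CPU bottleneck alone doesn't explain the OP's numbers.
```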

5800X @ 4720 MHz fixed OC | 6900 XT -75 mV, 2600 MHz | 1440p 165 Hz

Full rig here: https://uk.pcpartpicker.com/list/xvJF2m  

 


So I am also having issues with lower-than-expected FPS on my new build. I thought I would easily get 200+ FPS at 1080p/144Hz in Warzone 2, but I am only averaging about 160. What gives? Some say I'm bottlenecked because of my RAM, but I can't find a real consensus.

 

My specs are:

  • CPU: Intel Core i7-12700KF, 12 cores, 3.60 GHz

  • GPU: GeForce RTX 4080

  • RAM: Team T-FORCE Delta RGB, 32 GB DDR4-3200, XMP 2.0 enabled

  • Storage: Western Digital Blue SN570 NVMe M.2 SSD, 1 TB

  • PSU: XPG Core Reactor, 850W, 80+ Gold

