Dear Forum,
I have no real idea where to post this or ask it, so I decided to ask it here. Right off the bat, I'm not really familiar with stuff like this, so I'm hoping you guys can help me out!
Two weeks ago I installed a new GPU, an RTX 2080, replacing my old GTX 970.
My pc specs are:
- i7-6700(non-k)
- RTX 2080
- 16GB RAM
- H110M PRO-D (really standard motherboard, I know...)
- 600W PSU
The big problem I'm having is something you've all probably heard of: an underperforming GPU. I'm getting lower FPS on low settings than on high.
In games like Battlefield 1, 4 & 5, and in singleplayer games where I always prefer the highest settings, my GPU usage is 90-100% (with ~80% CPU usage), which is great, and temps are fine too (GPU ~80°C max).
But in games like Rainbow Six Siege & Fortnite, GPU usage on high settings (everything maxed) is only around 40% with 30% CPU usage and kinda okay FPS, while on low settings it gets as bad as 20% GPU and 40% CPU.
Temps are fine but FPS isn't; it's actually higher on high settings than on low because of the usage. I have a 180Hz monitor which I like to take advantage of even in demanding areas of a game, but playing at 80-100 FPS there isn't good (it feels laggy).
What I've tried: a fresh Windows 10 reinstall with all drivers up to date, overclocking the GPU, and running different benchmarks (everything looks fine there, just not in games).
This might be an obvious CPU bottleneck to you guys, but like I said, I have no clue! I'll answer any questions you have.
Thanks for helping!