
Lower fps than expected on new build

Guef

Hello, I assembled my PC about two weeks ago.

The thing is, I'm getting lower average fps than I'd expect compared to YouTube benchmarks of the same games.
I searched for tests with the processor that should be my bottleneck, and they're getting around 90 fps in Cyberpunk 2077 with a 3060.
All my tests were done at 1080p.

I've tested Spider-Man Remastered and Cyberpunk 2077. In Spider-Man I barely notice a change when switching settings like crowd density and LOD. I always get low fps when swinging rapidly through the city (between 40 and 50).

This is with everything on max settings except ray tracing, which is on High (no DLSS; I tried DLSS but didn't see a huge improvement).

In Cyberpunk the story is the same: I'm getting low 40s near crowds, and in some other zones even with crowd density on low (RT on High, DLSS Quality).


Specs:

Motherboard: MSI B660M Mortar DDR4 (BIOS version E7D42IMS.110, date: 12/13/2021)

CPU: Intel i5-12400F

RAM: DDR4 Patriot Viper RGB 16GB (1 stick)

PSU: Seasonic Focus GX-1000 1000W

GPU: EVGA RTX 3080 Ti XC3 Ultra Gaming

Storage: Samsung 980 PRO

Case: Corsair 4000D Airflow

Case fans: 2 stock + 2 140mm Noctua NF-A14 PWM chromax.black.swap

CPU cooler: Noctua NH-U12A

 

I ran Cinebench; my scores: Single Core 1725, Multi Core 12428.
I ran the Heaven Benchmark; score at 1080p max settings: 5878, min fps 34.1, max fps 487.2.

Custom Heaven Benchmark test near 4K (3374x1071), max settings: score 3803, min fps 33.1, max fps 315.1.

I think the benchmark scores are normal.
Max CPU temps reach 60 °C

Max GPU temps reach 80 °C

 

Am I just expecting the wrong performance from my hardware, or is something off?
Does having just one RAM stick instead of two in dual channel make a huge difference in performance?

Can an old BIOS version be the cause of low FPS?

Any suggestions or additional tests I should run?


A 12400 definitely bottlenecks a 3080 Ti at 1080p, and one stick of RAM?

You need two sticks of RAM running in dual channel for the best performance, so you're bottlenecking your bottleneck.
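
For a rough sense of why (a back-of-envelope sketch, assuming DDR4-3200; these are theoretical peak numbers, not what games actually see):

# Each DDR4 channel is 64 bits (8 bytes) wide.
transfers_per_sec = 3200e6           # DDR4-3200 = 3200 MT/s
bytes_per_transfer = 8               # 64-bit channel width

single = transfers_per_sec * bytes_per_transfer / 1e9   # GB/s, one stick
dual = single * 2                                       # GB/s, two sticks

print(f"single channel: {single:.1f} GB/s")   # 25.6 GB/s
print(f"dual channel:   {dual:.1f} GB/s")     # 51.2 GB/s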

 

Not only that, 3080s and up run worse at 1080p with a lower-end CPU; 3080s and up are 4K GPUs. At the very least play at 1440p.

If you need your own confirmation, enable DSR and play the games in 4K; your GPU usage will go up and FPS will also go up.
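
If you'd rather watch the numbers than eyeball an overlay, here's a minimal sketch that polls GPU utilization once a second while you play (assumes Python and the Nvidia driver's nvidia-smi tool on your PATH; low GPU usage at 1080p that jumps at 4K confirms the CPU bottleneck):

import subprocess, time

# Print GPU utilization once per second; Ctrl+C to stop.
while True:
    result = subprocess.run(
        ["nvidia-smi", "--query-gpu=utilization.gpu", "--format=csv,noheader"],
        capture_output=True, text=True,
    )
    print(result.stdout.strip())   # e.g. "62 %"
    time.sleep(1)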

Not an expert, just bored at work. Please quote me or mention me if you would like me to see your reply. **may edit my posts a few times after posting**

CPU: Intel i5-12400

GPU: Asus TUF RX 6800 XT OC

Mobo: Asus Prime B660M-A D4 WIFI MSI PRO B760M-A WIFI DDR4

RAM: Team Delta TUF Alliance 2x8GB DDR4 3200MHz CL16

SSD: Team MP33 1TB

PSU: MSI MPG A850GF

Case: Phanteks Eclipse P360A

Cooler: ID-Cooling SE-234 ARGB

OS: Windows 11 Pro

Pcpartpicker: https://pcpartpicker.com/list/wnxDfv
Displays: Samsung Odyssey G5 S32AG50 32" 1440p 165hz | AOC 27G2E 27" 1080p 144hz

Laptop: ROG Strix Scar III G531GU Intel i5-9300H GTX 1660Ti Mobile| OS: Windows 10 Home


20 minutes ago, Guef said:

90 fps in Cyberpunk 2077 with a 3060

I doubt that very much. Your processor is surely the weak link of the chain here, but running modern processors with only one stick of RAM is also costly in terms of FPS. Depending on the game, you give up 10 to 35% of FPS by halving the memory bandwidth. Modern CPUs were meant to operate in dual-channel mode. They will do fine on single sticks as well, but that is a fallback solution, not what they were designed for. I'd really recommend getting another stick of matching memory or replacing both with a faster kit. Then, if your budget allows for it, get a 13th-gen processor (you'll want to update to the latest BIOS before upgrading, though). That's not a must, however. The game is still not that well optimized, so you will have to play around with the settings to get where you want to be. Some people want nice textures and don't mind playing games at 30-45 fps. I would rather lose some fidelity and play at higher fps.
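
To put that 10-35% range against the numbers in this thread (a rough sketch; the 45 fps input is just a hypothetical midpoint of the 40-50 fps you reported while swinging):

# Estimate dual-channel fps from an observed single-channel figure,
# assuming single channel costs 10-35% of fps.
observed_single = 45.0   # hypothetical fps on one stick

for penalty in (0.10, 0.35):
    estimated_dual = observed_single / (1 - penalty)
    print(f"{penalty:.0%} penalty -> ~{estimated_dual:.0f} fps dual channel")
# 10% -> ~50 fps, 35% -> ~69 fps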

 

Also, be very careful with videos of other people playing the game and their claims of high fps in a certain area. Either they are lying about the settings or the hardware, or both. In some instances those videos were made to sell an overpriced graphics card. I've seen videos claiming you can run Battlefield V on Intel 4500 graphics at over 60 FPS. Just a lot of BS and clickbait.


I suspect that with a single stick of RAM on a 12400 at 1080p, you won't get any better results with any GPU above a 3060, as you're heavily CPU and RAM bandwidth bottlenecked.

Get another stick of RAM and play at 1440p, which is the sweet spot for your build.

System : AMD R9 5900X / Gigabyte X570 AORUS PRO/ 2x16GB Corsair Vengeance 3600CL18 ASUS TUF Gaming AMD Radeon RX 7900 XTX OC Edition GPU/ Phanteks P600S case /  Eisbaer 280mm AIO (with 2xArctic P14 fans) / 2TB Crucial T500  NVme + 2TB WD SN850 NVme + 4TB Toshiba X300 HDD drives/ Corsair RM850x PSU/  Alienware AW3420DW 34" 120Hz 3440x1440p monitor / Logitech G915TKL keyboard (wireless) / Logitech G PRO X Superlight mouse / Audeze Maxwell headphones


Some clarifications:

- I knew when picking the parts that my GPU would be suited to running games at 1440p with everything maxed + RT, and that's the final goal, but at the moment I don't have the monitor (I'm stuck with 1080p for a while).

- I know the single stick of RAM has some effect on performance, but can it have such a big effect? (From what I thought it would be 10 or 20% at best; I currently have XMP enabled and my RAM is running at 3200 MHz.) I will get a second stick as soon as possible, but I'm afraid this might not be the solution.

 

8 hours ago, Dukesilver27- said:

Not only that, 3080s and up run worse at 1080p with a lower-end CPU; 3080s and up are 4K GPUs. At the very least play at 1440p.

If you need your own confirmation, enable DSR and play the games in 4K; your GPU usage will go up and FPS will also go up.

I tested using FSR to upscale to 2K and to the max allowed, which is 2880x1800. Swinging around Times Square, my fps varied between 30 and 50 at all three resolutions (2880x1800 was about 5 fps lower than the others).
It doesn't make sense to me that I would get lower fps at 1080p than at 2K or 4K, and most benchmarks I've seen support that. Take for example the Hardware Unboxed video testing a wide range of CPUs with dual-channel DDR4-3600 and a 3090 Ti: the 12400F at 1080p High with ray tracing enabled gets 87 fps average and 1% lows of 66 fps.

When I disable ray tracing I even get dips below 60, to 50 or so, when swinging, instead of the 100+ fps I should be getting.

 

My concern comes from seeing multiple benchmarks from different sources that show higher fps in general, like 30-50 fps higher than I'm getting.

Is my expectation of running these games with everything maxed (including RT) at 1080p 60 fps unrealistic?


That is a very bad configuration for the money spent here: one stick of RAM and a 3080 Ti working on an i5. Huge bottlenecking here.

  • CPU
    i7 12700KF
  • Motherboard
    MSI Tomahawk Z690 WIFI DDR4
  • RAM
    16GB Vengeance Pro
  • GPU
    RTX 3070 Ti
  • Case
    NZXT H700i
  • Storage
    Crucial P2 500GB M.2
    RAID 0 16TB server
  • PSU
    Corsair RM1000X
  • Display(s)
    ASUS PG259QN 360Hz
    ASUS PG278QE 165Hz
  • Cooling
    NZXT Kraken X62
  • Keyboard
    Corsair K95P Brown
    Ducky One 2 SF Silver
  • Mouse
    Razer Viper Ultimate
    Logitech Pro Superlight X
  • Sound
    DT 990 PRO

7 minutes ago, Fl0yd- said:

That is a very bad configuration for the money spent here: one stick of RAM and a 3080 Ti working on an i5. Huge bottlenecking here.

Hi, the thing is I already bought it.

The idea is to upgrade everything but the GPU in the future.

What I'm asking here is whether I overlooked something, or if someone has an idea of what could cause such low fps compared to what most people get (most people here say the single stick could make a difference).


4 hours ago, Guef said:

I tested using FSR to upscale to 2K and to the max allowed, which is 2880x1800. Swinging around Times Square, my fps varied between 30 and 50 at all three resolutions (2880x1800 was about 5 fps lower than the others).

It doesn't make sense to me that I would get lower fps at 1080p than at 2K or 4K, and most benchmarks I've seen support that. Take for example the Hardware Unboxed video testing a wide range of CPUs with dual-channel DDR4-3600 and a 3090 Ti: the 12400F at 1080p High with ray tracing enabled gets 87 fps average and 1% lows of 66 fps.

When I disable ray tracing I even get dips below 60, to 50 or so, when swinging, instead of the 100+ fps I should be getting.

Not FSR, DSR. It's Nvidia's tech: it lets you render the game at a higher resolution than your monitor supports, and there's no limit to it; you could even run games at 8K.

HU definitely used dual channel, and would it make sense that a GPU that powerful could only do 87 fps at 1080p? There's a CPU bottleneck there; they used a 3090 Ti to eliminate any chance of a GPU bottleneck. That's how benchmarkers test CPUs.

If you search this forum, you'll see a lot of people complaining about their 3080s not performing as expected, and the only thing they have in common is low-to-mid-end CPUs, or old high-end CPUs that are now mid-end.

Not an expert, just bored at work. Please quote me or mention me if you would like me to see your reply. **may edit my posts a few times after posting**

CPU: Intel i5-12400

GPU: Asus TUF RX 6800 XT OC

Mobo: Asus Prime B660M-A D4 WIFI MSI PRO B760M-A WIFI DDR4

RAM: Team Delta TUF Alliance 2x8GB DDR4 3200MHz CL16

SSD: Team MP33 1TB

PSU: MSI MPG A850GF

Case: Phanteks Eclipse P360A

Cooler: ID-Cooling SE-234 ARGB

OS: Windows 11 Pro

Pcpartpicker: https://pcpartpicker.com/list/wnxDfv
Displays: Samsung Odyssey G5 S32AG50 32" 1440p 165hz | AOC 27G2E 27" 1080p 144hz

Laptop: ROG Strix Scar III G531GU Intel i5-9300H GTX 1660Ti Mobile| OS: Windows 10 Home


57 minutes ago, Dukesilver27- said:

Not FSR, DSR. It's Nvidia's tech: it lets you render the game at a higher resolution than your monitor supports, and there's no limit to it; you could even run games at 8K.

HU definitely used dual channel, and would it make sense that a GPU that powerful could only do 87 fps at 1080p? There's a CPU bottleneck there; they used a 3090 Ti to eliminate any chance of a GPU bottleneck. That's how benchmarkers test CPUs.

If you search this forum, you'll see a lot of people complaining about their 3080s not performing as expected, and the only thing they have in common is low-to-mid-end CPUs, or old high-end CPUs that are now mid-end.

It was DSR, sorry, I misspelled it. I didn't do it directly from GeForce Experience though; I'll try that.

Yeah, I know he used a 3090 Ti specifically to test CPU performance.

I gave that video as an example of one of the benchmarks I saw in which the 12400F didn't perform that badly. Or would you say the 20-30 fps difference is caused by the GPU in a CPU-bottleneck scenario like that one?

I don't know; I expect to run both of them maxed at at least 60 fps at 1080p, even with one stick. I've seen plenty of videos benchmarking one stick vs two in dual channel, and the difference is usually not that big, but it's game dependent.
I'll post an update once I get hold of another stick of RAM.


The issue was running a single stick of RAM. I just tested Spider-Man around Times Square, and performance increased by 40% on both average fps and 1% lows.
Cyberpunk's built-in benchmark saw an increase of 40% on 1% lows and 25% on average fps.

Running around the city, through encounters and such, I saw an increase of almost 50% on 1% lows (it was around 35 there before; now it's around 58).
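
(For anyone checking these figures, this is how the percentages are computed; the example values below are hypothetical, not my exact readings:)

def pct_gain(before, after):
    # Percent fps increase relative to the old value.
    return (after - before) / before * 100

print(f"{pct_gain(40, 58):.0f}%")   # going from 40 to 58 fps is a 45% gain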

 

Cinebench and the Heaven Benchmark didn't give much better results, but I'm now content with my performance; it's around what I expected.

Cyberpunk benchmark:

Average: 97 fps

1% lows: 77 fps

Cyberpunk free roam:

Average: 60-80 fps

1% lows: 57-65 fps

Spider-Man, rapid swings around Times Square:

Average: 74 fps

1% lows: 61 fps

