
Help me figure out my bottleneck? (Cyberpunk 2077)

Vice Fielder
Solved by SkilledRebuilds

So, I've been trying to figure out why my Cyberpunk 2077 performance is so bad. Let's start with my specs:

 

MB: Gigabyte H410M V2

CPU: i5-10400F

GPU: RTX 3060 Ti

RAM: 16GB 2400MHz CL14, single channel (the second stick will be arriving later this week)

SSD: WD 1TB NVMe M.2 Green (something, can't recall exactly)

 

So, at 1080p on Ultra I've been getting 80 FPS, but in combat I think it stays around 40 and feels sluggish.

 

The burning question is: will the second stick alleviate the low FPS, or is the CPU just too slow?

 

Perhaps it's the 2400MHz RAM?

 

Or is this simply the expected performance for an RTX 3060 Ti?

 

Should I consider looking for an i5-10600K? (Getting a new CPU is already a stretch; going for a CPU + MB would be out of the question.)

 

Thanks in advance

 

PS: temps are well under control. The CPU doesn't go above 65°C and the GPU doesn't go above 70°C with a slight Afterburner OC.

 

Also, the RivaTuner monitor shows 99-100% GPU usage and 86% CPU (though Task Manager states 100% CPU).

 

Activating DLSS or lowering any settings doesn't add any FPS at all, but it does dramatically drop GPU usage.
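(For anyone who wants to double-check numbers like these outside of RivaTuner, here's a minimal sketch; it assumes an NVIDIA card with nvidia-smi on the PATH and the psutil Python package installed, and just logs CPU and GPU utilization side by side while the game runs:)

```python
# Sketch: log CPU vs GPU utilization once per second while the game runs.
# Assumes an NVIDIA GPU with nvidia-smi on the PATH and the psutil package installed.
import subprocess

import psutil


def gpu_utilization_percent() -> float:
    """Return current GPU utilization (%) as reported by nvidia-smi."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=utilization.gpu",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    return float(out.strip().splitlines()[0])


if __name__ == "__main__":
    for _ in range(120):  # roughly two minutes of samples
        cpu = psutil.cpu_percent(interval=1.0)   # averaged over the 1 s wait
        gpu = gpu_utilization_percent()
        # GPU% falling while FPS stays flat points at a CPU/RAM-side limit.
        print(f"CPU {cpu:5.1f}%   GPU {gpu:5.1f}%")
```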

 


Thanks for providing a decent amount of info to go on...

 

99% chance it's your RAM setup (from experience).

The single channel especially. Even at 2400MHz, dual channel (2 sticks) would improve things by a decent amount, but you should be chasing 2 sticks of 3200-3800MHz+ for the best mainstream performance levels.
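As a rough back-of-the-envelope illustration (theoretical peak bandwidth only, not what games actually see), each DDR4 channel moves 8 bytes per transfer, so going from one stick to two is basically a doubling:

```python
# Rough theoretical peak DDR4 bandwidth: transfers/s x 8 bytes per channel x channels.
# Illustrative only; real-world, in-game gains are much smaller than these peaks.
def peak_bandwidth_gb_s(mt_per_s: int, channels: int) -> float:
    return mt_per_s * 8 * channels / 1000  # GB/s

print(peak_bandwidth_gb_s(2400, 1))  # 1x DDR4-2400, single channel -> 19.2 GB/s
print(peak_bandwidth_gb_s(2400, 2))  # 2x DDR4-2400, dual channel   -> 38.4 GB/s
print(peak_bandwidth_gb_s(2933, 2))  # 2x DDR4-2933, dual channel   -> ~46.9 GB/s
```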

Maximums - Asus Z97-K /w i5 4690 Bclk @106.9Mhz * x39 = 4.17Ghz, 8GB of 2600Mhz DDR3,.. Gigabyte GTX970 G1-Gaming @ 1550Mhz

 


Firstly, you'll gain a lot by going from single channel to dual channel. Just don't expect miracles; the game is simply hard on hardware and demands a lot from it.

 

You can tune your in-game settings to get a few more frames with almost no loss in visual quality:

 

https://www.eurogamer.net/articles/digitalfoundry-2020-cyberpunk-2077-pc-best-settings

 

 

The CPU you have is enough, even for a somewhat stronger card. No need to upgrade it.

 

I completed the game on a Ryzen 5 3600 with 16GB of RAM and a Radeon 5700 XT at 1440p, and it averaged around 50 FPS.

So your performance seems fine. Now I have a Ryzen 5900X, 32GB of RAM and the same 5700 XT, and I'll probably replay the game this summer.

M.S.C.E. (M.Sc. Computer Engineering), IT specialist in a hospital, 30+ years of gaming, 20+ years of computer enthusiasm, Geek, Trekkie, anime fan

  • Main PC: AMD Ryzen 7 5800X3D - EK AIO 360 D-RGB - Arctic Cooling MX-4 - Asus Prime X570-P - 4x8GB DDR4 3200 HyperX Fury CL16 - Sapphire AMD Radeon 6950XT Nitro+ - 1TB Kingston Fury Renegade - 2TB Kingston Fury Renegade - 512GB ADATA SU800 - 960GB Kingston A400 - Seasonic PX-850 850W  - custom black ATX and EPS cables - Fractal Design Define R5 Blackout - Windows 11 x64 23H2 - 3 Arctic Cooling P14 PWM PST - 5 Arctic Cooling P12 PWM PST
  • Peripherals: LG 32GK650F - Dell P2319h - Logitech G Pro X Superlight with Tiger Ice - HyperX Alloy Origins Core (TKL) - EndGame Gear MPC890 - Genius HF 1250B - Akliam PD4 - Sennheiser HD 560s - Simgot EM6L - Truthear Zero - QKZ x HBB - 7Hz Salnotes Zero - Logitech C270 - Behringer PS400 - BM700  - Colormunki Smile - Speedlink Torid - Jysk Stenderup - LG 24x External DVD writer - Konig smart card reader
  • Laptop: Acer E5–575G-386R 15.6" 1080p (i3 6100U + 12GB DDR4 (4GB+8GB) + GeForce 940MX + 256GB nVME) Win 10 Pro x64 22H2 - Logitech G305 + AAA Lithium battery
  • Networking: Asus TUF Gaming AX6000 - Arcadyan ISP router - 35/5 Mbps vDSL
  • TV and gadgets: TCL 50EP680 50" 4K LED + Sharp HT-SB100 75W RMS soundbar - Samsung Galaxy Tab A8 10.1" - OnePlus 9 256GB - Olymous Cameda C-160 - GameBoy Color 
  • Streaming/Server/Storage PC: AMD Ryzen 5 3600 - LC-Power LC-CC-120 - MSI B450 Tomahawk Max - 2x4GB ADATA 2666 DDR4 - 120GB Kingston V300 - Toshiba DT01ACA100 1TB - Toshiba DT01ACA200 2TB - 2x WD Green 2TB - Sapphire Pulse AMD Radeon R9 380X - 550W EVGA G3 SuperNova - Chieftec Giga DF-01B - White Shark Spartan X keyboard - Roccat Kone Pure Military Desert strike - Logitech S-220 - Philips 226L
  • Livingroom PC (dad uses): AMD FX 8300 - Arctic Freezer 64 - Asus M5A97 R2.0 Evo - 2x4GB DDR3 1833 Kingston - MSI Radeon HD 7770 1GB OC - 120GB Adata SSD - 500W Fractal Design Essence - DVD-RW - Samsung SM 2253BW - Logitech G710+ - wireless vertical mouse - MS 2.0 speakers

I thought RAM speed didn't matter much with Intel.

Single channel might be the issue.

Activating DLSS should add FPS.

I'm willing to swim against the current.


2 hours ago, leclod said:

I thought RAM speed didn't matter much with Intel.

Single channel might be the issue.

Activating DLSS should add FPS.

RAM speed matters on every architecture, but how much it matters still depends on the use case.

3200C14 vs 4200C16 in old Warzone, in the CPU-bound areas of the game, gave me average improvements of 15-20 FPS, and up to 40 FPS in extreme cases when I was using 4400C17 in Nakatomi Plaza.

 

Side by Side 3200C14 vs 4200C16 - https://youtu.be/x_lAm_8Xkyo?list=PLpooJB_1216s_N9CkK2weTUnlOvznPYoJ

Notice the GPU load...

Maximums - Asus Z97-K /w i5 4690 Bclk @106.9Mhz * x39 = 4.17Ghz, 8GB of 2600Mhz DDR3,.. Gigabyte GTX970 G1-Gaming @ 1550Mhz

 


8 hours ago, SkilledRebuilds said:

Thanks for providing a decent amount of info to go on...

 

99% chance it's your RAM setup (from experience).

The single channel especially. Even at 2400MHz, dual channel (2 sticks) would improve things by a decent amount, but you should be chasing 2 sticks of 3200-3800MHz+ for the best mainstream performance levels.

I went with 2400MHz following a friend's recommendation. But then again, checking this motherboard's compatibility, the highest I could go would be 16GB @ 2933MHz, or 32GB at a slower clock.

 

I had to buy a prebuilt here in Brazil, because otherwise it's impossible to get a decent GPU (the whole prebuilt was basically the full price of that RTX 3060 Ti). Anyway, I was coming from a GTX 680, haha; I was long overdue for an upgrade, which ended up being a whole new PC.

 

Which is just to say that the motherboard was definitely not something I'd have picked on my own.

 


8 hours ago, 191x7 said:

Firstly, you'll gain a lot by going from single channel to dual channel. Just don't expect miracles; the game is simply hard on hardware and demands a lot from it.

 

You can tune your in-game settings to get a few more frames with almost no loss in visual quality:

 

https://www.eurogamer.net/articles/digitalfoundry-2020-cyberpunk-2077-pc-best-settings

 

 

The CPU you have is enough, even for a somewhat stronger card. No need to upgrade it.

 

I completed the game on a Ryzen 5 3600 with 16GB of RAM and a Radeon 5700 XT at 1440p, and it averaged around 50 FPS.

So your performance seems fine. Now I have a Ryzen 5900X, 32GB of RAM and the same 5700 XT, and I'll probably replay the game this summer.

Well, I'll fiddle some more with the settings, though like I said, lowering things has just been dropping GPU usage.


11 minutes ago, Vice Fielder said:

Well, I'll fiddle some more with the settings, though like I said, lowering things has just been dropping GPU usage.

With a RAM (speed/timing) bottleneck, you'd be better off using higher detail settings and a lower overall average FPS; that gives you a more stable, less fluctuating frame rate without the larger drops while playing. 2-4 sticks of 2933MHz (dual channel) would still bring large benefits in frame consistency / less variability in gameplay over a single 2400MHz stick, and it would cost less than having to splurge on 3200-3800MHz sticks (if your mobo supported them and they were an option). Two sticks of 2933MHz is where I'd start my planned upgrade.
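To put a number on "frame consistency": the average alone hides the drops, so something like the 1% lows from a frametime capture is the figure to watch. A minimal sketch, assuming you already have a list of frametimes in milliseconds (e.g. exported from RTSS or CapFrameX):

```python
# Sketch: average FPS vs 1% low FPS from a list of frametimes in milliseconds.
# Assumes the frametimes were captured elsewhere (e.g. an RTSS/CapFrameX export).
def summarize(frametimes_ms: list[float]) -> tuple[float, float]:
    avg_fps = 1000.0 / (sum(frametimes_ms) / len(frametimes_ms))
    # 1% low: average FPS across the slowest 1% of frames (the stutter you feel).
    worst = sorted(frametimes_ms, reverse=True)
    n = max(1, len(worst) // 100)
    one_percent_low_fps = 1000.0 / (sum(worst[:n]) / n)
    return avg_fps, one_percent_low_fps

# Hypothetical capture: mostly ~12.5 ms frames (80 FPS) with a few ~28 ms spikes.
sample = [12.5] * 990 + [28.0] * 10
avg, low = summarize(sample)
print(f"avg {avg:.0f} FPS, 1% low {low:.0f} FPS")  # avg ~79 FPS, 1% low ~36 FPS
```

Faster dual-channel RAM tends to show up in that second number far more than in the average.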

 

Maximums - Asus Z97-K /w i5 4690 Bclk @106.9Mhz * x39 = 4.17Ghz, 8GB of 2600Mhz DDR3,.. Gigabyte GTX970 G1-Gaming @ 1550Mhz

 


5 hours ago, SkilledRebuilds said:

With a RAM (speed/timing) bottleneck, you'd be better off using higher detail settings and a lower overall average FPS; that gives you a more stable, less fluctuating frame rate without the larger drops while playing. 2-4 sticks of 2933MHz (dual channel) would still bring large benefits in frame consistency / less variability in gameplay over a single 2400MHz stick, and it would cost less than having to splurge on 3200-3800MHz sticks (if your mobo supported them and they were an option). Two sticks of 2933MHz is where I'd start my planned upgrade.

 

Thx man. The second stick ended up arriving earlier rather than later.

 

So, you already knew this would help, but I figured I'd report how it went.

 

Before, I was getting 80 FPS, averaging 40-something, and dipping down to 35 in combat, which was really screwing me over. No matter how much I fiddled with the settings, it would only lower GPU usage and not improve FPS, especially when it mattered.

 

Now, with the same settings as before, I'm getting about the same 80 FPS, but my dips aren't as low. Also, changing settings actually improves things now. With DLSS set to Quality I can keep over 100 FPS most of the time; GPU usage does drop a bit, to around 80% instead of down to something like 60%.

 

Anyway, ray tracing still isn't viable, but I honestly can't notice it that much anyway.

 

Thanks again. Glad I didn't break the bank by impulsively going for a new CPU.


2 minutes ago, Vice Fielder said:

Thx man. The second stick ended up arriving earlier rather than later.

 

So, you already knew this would help, but I figured I'd report how it went.

 

Before, I was getting 80 FPS, averaging 40-something, and dipping down to 35 in combat, which was really screwing me over. No matter how much I fiddled with the settings, it would only lower GPU usage and not improve FPS, especially when it mattered.

 

Now, with the same settings as before, I'm getting about the same 80 FPS, but my dips aren't as low. Also, changing settings actually improves things now. With DLSS set to Quality I can keep over 100 FPS most of the time; GPU usage does drop a bit, to around 80% instead of down to something like 60%.

 

Anyway, ray tracing still isn't viable, but I honestly can't notice it that much anyway.

 

Thanks again. Glad I didn't break the bank by impulsively going for a new CPU.

Don't tell me your monitor is 60Hz?

If it is, you won't really see anything above 60 FPS.

 


52 minutes ago, Robchil said:

Don't tell me your monitor is 60Hz?

If it is, you won't really see anything above 60 FPS.

 

144Hz.

 

Though I do usually notice lower framerates; it's more that they feel very uncomfortable for my eyes and usually give me headaches.

 

It's buttery smooth now though.

