
Need help figuring out Fallout 4 FPS issue (and GC issue).

DevilishBooster

Here is my current rig: http://pcpartpicker.com/p/DcMR7P

I finally got to start playing Fallout 4 on Thanksgiving day. I was pleasantly surprised that it defaulted to Ultra, and while playing it held a solid 60 fps (with V-Sync enabled, I might add) almost all the time, with very rare dips to as low as 45 fps, and it played that way until a few days ago. Suddenly it's fluctuating between 15-40 fps (usually holding closer to 25 fps), and it will randomly spike up to 60 fps, but it doesn't seem affected by anything in the game. It will happen in the open world or inside a building, and the random spike up to 60 fps will happen when I'm in the middle of a big fire-fight, so I know the low frame rate isn't because of a lot of things happening at once.

I've been monitoring my graphics cards with HWMonitor today, and it isn't tied to my GCs not being able to keep up, because my 280 will only read about 65% load on average, sometimes going up to around 75%, and it only reached 99% once, while I was in my inventory wearing power armor. The GC also averaged around 67 C and never went above 72 (when it was at 99%), so I know it's not thermal throttling.

The only changes I've made to the game were to edit the .ini file so that the FOV was 100 (sooooooo much better!), to remove the depth of field effect that made it a PITA when ADS, and to remove the fps cap.
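
For anyone who wants the same tweaks, here's roughly what those edits amount to, as a quick Python sketch. The key names (fDefaultWorldFOV, fDefault1stPersonFOV, iPresentInterval) and the prefs path are the commonly cited ones and I'm going from memory, so verify them against your own Fallout4.ini / Fallout4Prefs.ini and back the files up first; I left the depth-of-field key out because I'm not sure of its exact name.

from configparser import ConfigParser
from pathlib import Path

# Usual prefs location on Windows; adjust if your Documents folder lives elsewhere.
prefs_path = Path.home() / "Documents" / "My Games" / "Fallout4" / "Fallout4Prefs.ini"

cp = ConfigParser(interpolation=None)   # Bethesda .ini files don't use interpolation
cp.optionxform = str                    # keep the original key capitalisation
cp.read(prefs_path, encoding="utf-8-sig")

if not cp.has_section("Display"):
    cp.add_section("Display")

cp.set("Display", "fDefaultWorldFOV", "100")      # world FOV
cp.set("Display", "fDefault1stPersonFOV", "100")  # viewmodel FOV
cp.set("Display", "iPresentInterval", "0")        # 0 = remove the fps cap / V-Sync

with open(prefs_path, "w", encoding="utf-8") as f:
    cp.write(f, space_around_delimiters=False)    # keep the key=value style the game writes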

 

The other weird thing is that HWMonitor is only registering one of my 280s.

[Screenshot attached: HWMonitor readout]

 

I do have the new Radeon Crimson driver. Under "Additional Settings..." in the "Display" tab (where it just opens a CCC window with more options, because it totally makes sense to keep CCC integrated instead of incorporating the additional settings into Crimson), I do have CrossFire enabled and set to be active for DX 9/10/11 and OpenGL applications with no application profile. When the FPS issue first came up I tried resetting the Fallout 4 settings to default and I tried turning off CrossFire, but that didn't help. I also made sure to turn off V-Sync in Crimson. I'm not sure where to go from here. I read in a Reddit thread that people would enable the Crysis 3 CrossFire profile and that it would help a little, but I can't find that option anywhere. Is that an old CCC option? (I added the second card right when Crimson came out, so I don't know the CCC CrossFire settings.)

 

I'm not sure if I left out any pertinent info, so if you need more details just ask and I'll let you know. I really want to get this fixed, because the game is not really any fun when the FPS goes so low.

END OF LINE

-- Project Deep Freeze Build Log --

Quote me so that I always know when you reply, feel free to snip if the quote is long. May your FPS be high and your temperatures low.


Archangel (Desktop) CPU: i5 4590 | GPU: Asus R9 280 3GB | RAM: HyperX Beast 2x4GB | PSU: SeaSonic S12G 750W | Mobo: GA-H97m-HD3 | Case: CM Silencio 650 | Storage: 1 TB WD Red
Celestial (Laptop 1) CPU: i7 4720HQ | GPU: GTX 860M 4GB | RAM: 2x4GB SK Hynix DDR3 | Storage: 250GB 850 EVO | Model: Lenovo Y50-70
Seraph (Laptop 2) CPU: i7 6700HQ | GPU: GTX 970M 3GB | RAM: 2x8GB DDR4 | Storage: 256GB Samsung 951 + 1TB Toshiba HDD | Model: Asus GL502VT

Windows 10 is now MSX! - http://linustechtips.com/main/topic/440190-can-we-start-calling-windows-10/page-6


That game relies a lot on per-core CPU performance, which isn't great on the FX-8350, and I don't think it scales well with CrossFire (if it even supports it at all). All in all, your machine is close to the "worst perf per dollar" you could throw at a game like Fallout 4, unfortunately... I'm not surprised it isn't running well on such a machine; it would most likely run better on a Core i3 machine with a GTX 760.

That said, your machine is geared for games like Battlefield or Star Wars Battlefront or whatever; it will do great there, since those are AMD-optimized games suited to machines like yours.

| CPU: Core i7-8700K @ 4.89ghz - 1.21v  Motherboard: Asus ROG STRIX Z370-E GAMING  CPU Cooler: Corsair H100i V2 |
| GPU: MSI RTX 3080Ti Ventus 3X OC  RAM: 32GB T-Force Delta RGB 3066mhz |
| Displays: Acer Predator XB270HU 1440p Gsync 144hz IPS Gaming monitor | Oculus Quest 2 VR


That game relies a lot on per-core CPU performance, which isn't great on the FX-8350, and I don't think it scales well with CrossFire (if it even supports it at all). All in all, your machine is close to the "worst perf per dollar" you could throw at a game like Fallout 4.

I know that it's a CPU-intensive game, and it definitely uses all my cores, but if it really were the CPU bottlenecking, then how would it ever have run at a solid 60 fps to begin with?



set shadow distance to medium

 

-snip-


I know that it's a CPU-intensive game, and it definitely uses all my cores, but if it really were the CPU bottlenecking, then how would it ever have run at a solid 60 fps to begin with?

The performance is all over the place in that game. I've seen it pinned at 144 FPS at times on my machine (see sig) and I've seen it struggling for 60 FPS in other areas as well... it's trash; if you really enjoy it, then you'll just have to deal with poor performance at times.

set shadow distance to medium

^^ also this and disable god rays.



The performance is all over the place in that game. I've seen it pinned at 144 FPS at times on my machine (see sig) and I've seen it struggling for 60 FPS in other areas as well... it's trash; if you really enjoy it, then you'll just have to deal with poor performance at times.

^^ also this and disable god rays.

Well, obviously something has changed that's causing the issue to happen constantly. What effect do God Rays actually have on performance?



Well, obviously something has changed that's causing the issue to happen constantly. What effect do God Rays actually have on performance?

Like on AMD cards? God rays tank the framerate; AMD cards aren't meant to drive advanced GameWorks features in games. And as I said, it depends on location. I was getting 120+ FPS until I reached Lexington, where I get around 65 FPS average with lows of 50 FPS, and that's on VERY powerful hardware meant to run that game on max settings. It's VERY poorly coded on a very old game engine, that's why!



Nothing to do with Nvidia or AMD, God that's getting old lol.

The Crimson beta driver will run Fallout 4 fairly well; just watch out for the Bethesda beta patch, which will disable a lot of mods and cause FPS dips.

God rays do NOTHING to your performance; that's a red herring thrown out by fans.

Shadow distance set to medium will allow for much better FPS, but it will have the game looking the way it does on consoles. It will also have shadows popping in right in front of you, which can break immersion.

Fallout 4 is a steaming pile of crap coded by blind monkeys. Run the Crimson beta and the original game on Steam, along with your choice of texture mods from the Nexus that help performance while making the game look better, and you can make it more palatable, but don't expect it to be perfect.

If anyone asks you never saw me.


As always, wait for the community patch to fix Bethesda's incompetence at optimizing their own game.

| Intel i7-3770@4.2Ghz | Asus Z77-V | Zotac 980 Ti Amp! Omega | DDR3 1800mhz 4GB x4 | 300GB Intel DC S3500 SSD | 512GB Plextor M5 Pro | 2x 1TB WD Blue HDD |
 | Enermax NAXN82+ 650W 80Plus Bronze | Fiio E07K | Grado SR80i | Cooler Master XB HAF EVO | Logitech G27 | Logitech G600 | CM Storm Quickfire TK | DualShock 4 |


Nothing to do with Nvidia or AMD, God that's getting old lol.

God rays do NOTHING to your performance; that's a red herring thrown out by fans.

lol...that made me laugh...HARD! :)

http://www.gamersnexus.net/game-bench/2180-fallout-4-volumetric-lighting-benchmark-and-disable

The 390X still trails the 970 at 1080p when both devices disable godrays – probably an optimization issue in the drivers or game – but we see a measurable performance gain for each device by toggling god rays. The 390X moves from 71FPS to 80FPS average and increases its 0.1% bottom line to 44 from 36 by disabling godrays. The average FPS gain is about 12%. The 970 moves from 81 to 87 FPS – a smaller 7% gain, but still noticeable. The 970 retains its lead.

At 1440p, the 390X still runs ahead just from the additional pixel workload bogging down the 970.

What we found, though, is that even disabling many of the volumetric lighting effects does not inherently grant AMD a performance advantage. It closes the gap, but doesn't eliminate it.

 

http://www.overclock3d.net/reviews/gpu_displays/fallout_4_-_amd_vs_nvidia_performance_review/8

http://www.overclock3d.net/reviews/gpu_displays/fallout_4_-_amd_vs_nvidia_performance_review/9
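
Quick sanity check on the percentages in that quote, since it's just arithmetic on the numbers Gamers Nexus reports:

# Average FPS with god rays on vs. off, straight from the quoted Gamers Nexus results.
results = {"R9 390X": (71, 80), "GTX 970": (81, 87)}

for card, (rays_on, rays_off) in results.items():
    gain = (rays_off - rays_on) / rays_on * 100
    print(f"{card}: {rays_on} -> {rays_off} FPS, about {gain:.1f}% faster with god rays disabled")

# Prints roughly 12.7% for the 390X and 7.4% for the 970, matching the
# "about 12%" and "7%" figures in the article.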



set shadow distance to medium

-snip-

As always, wait for the community patch to fix Bethesda's incompetence at optimizing their own game.

-snip-

Ok, so I "figured it out".

I remembered a little while ago that after my first time playing it I went into AMD's Raptr utility and told it to optimize the game, so I went in and did that again. I didn't write down exactly which settings it changed when it optimized, but I'm back to 60 fps most of the time with dips to around 45. I did an experiment, and in the areas where I was seeing the lowest FPS the last couple of days (12-17 fps), I'm now only seeing dips to the low 40s. Whatever AMD has in their Raptr utility for optimizing games seems to be working. I'll let you guys know if anything else happens.



lol...that made me laugh...HARD! :)

-snip-

Or, just a thought: listen to the guy who has two AMD cards and plays the game, lol. I can save in a spot with an FPS dip. Setting god rays to the lowest or highest setting does absolutely nothing; setting shadow distance to medium fixes the dip every single time.



Or, just a thought: listen to the guy who has two AMD cards and plays the game, lol. I can save in a spot with an FPS dip. Setting god rays to the lowest or highest setting does absolutely nothing; setting shadow distance to medium fixes the dip every single time.

I just checked in Raptr, and it looks like the only change I can find in the graphics settings is that the shadow distance is now at medium. Lol

Everything else is still at Ultra. I'm definitely happy to be back up at 60fps.
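
If anyone wants to make that one change by hand instead of through Raptr, this is my best guess at what the presets actually write; the fShadowDistance key and the values are from memory, so verify them against what the launcher puts in Fallout4Prefs.ini.

# Approximate fShadowDistance values for each preset (under [Display] in
# Fallout4Prefs.ini); treat these as ballpark numbers, not gospel.
shadow_distance = {"medium": 3000, "high": 14000, "ultra": 20000}

for preset, value in shadow_distance.items():
    print(f"{preset}: fShadowDistance={value}")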



Or, just a thought: listen to the guy who has two AMD cards and plays the game, lol. I can save in a spot with an FPS dip. Setting god rays to the lowest or highest setting does absolutely nothing; setting shadow distance to medium fixes the dip every single time.

Makes sense.

Hooah for GameWorks, then :)



I just checked in Raptr, and it looks like the only change I can find in the graphics settings is that the shadow distance is now at medium. Lol

Everything else is still at Ultra. I'm definitely happy to be back up at 60fps.

Everyone jumps on god rays because it's the Nvidia feature. I run mods that are way more demanding than god rays without an issue. Shadow distance jumps the number of draw calls, and THAT has an impact on FPS. An i7 will help, but it won't save you. I've talked to people with 970s and they suffer from it too; it's just shit programming on Bethesda's part.
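
To put rough numbers on that, here's a back-of-envelope toy model of mine (not how the renderer actually counts draw calls, and the preset distances are approximate): the shadowed region grows with the square of the shadow distance, so there's far more shadow-casting stuff for the CPU to push through draw calls at ultra than at medium.

# Toy model: the area covered by shadows (and, very roughly, the CPU-side work
# of building the shadow maps) scales with the square of the shadow distance.
presets = {"medium": 3000, "high": 14000, "ultra": 20000}  # approximate fShadowDistance values

baseline = presets["medium"]
for name, dist in presets.items():
    print(f"{name}: ~{(dist / baseline) ** 2:.0f}x the shadowed area of medium")

# medium ~1x, high ~22x, ultra ~44x, which is why dropping to medium frees up
# the CPU even when the GPU isn't anywhere near maxed out.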



Makes sense.

Hooah for GameWorks, then :)

Yeah, this is all Bethesda employing blind monkeys as programmers. Good thing we have people modding the game who know what the hell they're doing. There are already a few mods that help with FPS while improving the look of the game. The weather mod that came out yesterday is really good.



Shadow distance jumps the number of draw calls, and THAT has an impact on FPS. An i7 will help, but it won't save you. I've talked to people with 970s and they suffer from it too; it's just shit programming on Bethesda's part.

Yeah, it can be an issue (GPU usage jumps), but I still stay above 60 fps (with V-Sync off, of course, lol Bethesda). Then again, I have a 4790K so it's not as bad; the only thing to dip me so far was an explosion chain caused by super mutant suiciders and cars.

https://linustechtips.com/main/topic/631048-psu-tier-list-updated/ Tier Breakdown (My understanding)--1 Godly, 2 Great, 3 Good, 4 Average, 5 Meh, 6 Bad, 7 Awful

 


Yeah, it can be an issue (GPU usage jumps), but I still stay above 60 fps (with V-Sync off, of course, lol Bethesda). Then again, I have a 4790K so it's not as bad; the only thing to dip me so far was an explosion chain caused by super mutant suiciders and cars.

The 4790K vastly outperforms the 4690K in Fallout. I'm going to a 4790K next month for that reason.

Not to play Fallout 4, but in case it's a trend.



The 4790K vastly outperforms the 4690K in Fallout. I'm going to a 4790K next month for that reason.

Not to play Fallout 4, but in case it's a trend.

Idk if it's a trend or not, though DX12 will allow for better multi-threading support in the future, which may make it a more common thing. Who knows, the games industry may crash for all I know.


 


Not to play Fallout 4, but in case it's a trend.

It is, just look at benchmarks! It has always been the case, and with more recent games it has just become more obvious... the i5 used to push 120 FPS and the i7, say, 145 FPS... but lately some games have become more and more demanding, and that threshold has dropped to the i5 getting 65 FPS and the i7 getting 95... so people started to notice. (These numbers are out of my ass, BTW, but if you check benchmarks for recent modern games you'll see what I mean!)



Both major console manufacturers have unlocked the 7th core, so my money is on games needing more threads. Literally as I'm buying an i7, lol.



Ditch Fallout 4 until it gets fixed and play Fallout 5 instead.

 

Or, just a thought: listen to the guy who has two AMD cards and plays the game, lol. I can save in a spot with an FPS dip. Setting god rays to the lowest or highest setting does absolutely nothing; setting shadow distance to medium fixes the dip every single time.

That's because the performance is already bottlenecked by the draw calls from the shadow distance. Try to find a save spot with optimal GPU usage plus god rays, then play with the god ray setting.



Ditch Fallout 4 until it gets fixed and play Fallout 5 instead.

https://www.youtube.com/watch?v=xG3HM3ZGwMo

That's because the performance is already bottlenecked by the draw calls from the shadow distance. Try to find a save spot with optimal GPU usage plus god rays, then play with the god ray setting.

I run god rays on ultra and never have a dip in FPS that's not related to draw calls. The funny thing is, even with 13 mods going I can't stress my 390 with Fallout 4; I'm either maxed out at 60 fps or the CPU is crying in the corner.

I tried to run it at 1440p, since then I might see a difference with some of the other settings. The game bleeds onto my side monitor when I try, though. If I could run CrossFire I'd spend some time trying to figure out why.



I run god rays on ultra and never have a dip in FPS that's not related to draw calls. The funny thing is, even with 13 mods going I can't stress my 390 with Fallout 4; I'm either maxed out at 60 fps or the CPU is crying in the corner.

You probably already disabled god rays in the .ini file; that's why the graphics setting had no effect.

 

Even the greatest single GPU to date takes a huge performance loss with god rays on Ultra.

[Chart: Fallout 4 god rays quality performance]

 

[Image: Fallout 4 god rays comparison]


