
First Fallout 4 performance results

SteveGrabowski0

@Kloaked posted this one in the Xbone thread, so quoting:

 


So Nvidia posted a performance guide for Fallout 4, but it was taken down? I found a cached copy, though: http://webcache.googleusercontent.com/search?q=cache:http://www.geforce.com/whats-new/guides/fallout-4-graphics-performance-and-tweaking-guide

 

As I thought, God Rays affect it the most; using 4x tessellation should solve the performance issues for AMD users.
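For anyone who wants to try this before a patch lands, God Rays can reportedly be adjusted on the fly from the in-game console (tilde key). I'm writing these from memory of the cached guide, so double-check them there:

    gr quality 0    (God Rays off entirely)
    gr quality 1    (Low preset; lower tessellation factors, the biggest win on AMD cards)

Change it in a heavy outdoor scene with an FPS counter running and you can A/B the cost of the setting directly.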

-------

Current Rig

-------


Except the number of draw calls in a real game almost never reaches the numbers shown in a draw-call test. It's like saying Fire Strike is an accurate measurement of how intensive a game is. It's completely unrealistic and should never be compared to real-world scenarios.

As a matter of fact, how many times have Linus and Luke said that synthetic benchmarks are completely unreliable as a guide to real-world performance? Like, what, every benchmark video?

 

Secondly, every single game that has had GameWorks in it in the past half year has run horribly on all hardware that isn't Nvidia's 9xx series. But every other game without GameWorks has been reasonably optimized for both sides. Strange, isn't it?

 

That's not entirely true.

Some games require even more draw calls than you can get with an Nvidia GPU; Batman: Arkham Knight, for example. People blame GameWorks, but it's a victim of the D3D11 API. Devs are currently struggling with CPU-intensive games because they've hit the limitations of DX11. If you look at what happens when the game stutters, GPU usage drops, which indicates a CPU bottleneck. There's no reason for a high-end CPU like the 5930K to bottleneck this game, unless it's the (single) rendering thread that's bottlenecking. Devs decided to display more stuff on screen than a single CPU core can handle.
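Here's a back-of-the-envelope model of that render-thread wall; every number is an assumption picked purely to show the shape of the problem:

    # Single-threaded D3D11 submission: the render thread caps FPS
    # no matter how fast the GPU is. All numbers are illustrative guesses.
    cost_per_call_us = 40       # assumed CPU cost of one draw call (microseconds)
    calls_per_frame = 1000      # assumed draw calls in a busy scene

    cpu_frame_time_ms = calls_per_frame * cost_per_call_us / 1000.0
    print(1000.0 / cpu_frame_time_ms)   # -> 25.0 FPS ceiling from the CPU alone

Double the scene's draw calls and that ceiling halves, whatever GPU is installed; that's exactly the stutter-with-dropping-GPU-usage pattern described above.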

 

Benchmark tests can be indicators of real-world performance, but you have to understand in which situations and scenarios. Graphics benchmarks like Fire Strike are meant to push your GPU to its limits and peg it at 100%. Games usually won't do that 100% of the time, but depending on the game and what's going on in it, you will encounter scenes that are as demanding as Fire Strike, or close to it. I'd never seen temperatures above 70 °C on my GTX 970 until I played The Witcher 3; the other games I play are simply not that demanding. But even The Witcher 3 isn't always that demanding. BF4 only gets there when there's a huge explosion in front of you; in The Witcher 3 it's when you go into woods with a lot of foliage.

Fire Strike is not to be used as an indicator of how many FPS you will get in games, but it does show how your GPU handles heavy loads like its own.

After all, we don't really use Fire Strike as an indicator of game performance; we use it to compare different GPUs.

 

The API Overhead test measures the API and its efficiency. Since it's the CPU that issues draw calls, and the test isn't multi-threaded, single-core performance does matter, but that's not the point; once again, it's the difference that matters. Take any CPU you want and the numbers will be lower or higher accordingly, but you have to test both an AMD and an Nvidia card with the same CPU and system. The API Overhead test barely taxes the GPU at all; it even renders at 720p. So if you get more draw calls with an Nvidia GPU in a CPU-bound scenario, it's clear that Nvidia's drivers have higher draw-call throughput.
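A toy illustration of why the comparison only holds within one testbed; the scores below are hypothetical, not real API Overhead results:

    # Same CPU, RAM and OS; only the GPU and driver change.
    amd_calls_per_sec = 1_100_000   # hypothetical DX11 single-thread score
    nv_calls_per_sec = 1_450_000    # hypothetical score on the identical system

    diff = (nv_calls_per_sec - amd_calls_per_sec) / amd_calls_per_sec
    print(f"{diff:.0%} higher draw-call throughput")   # -> 32%

Swap in a faster CPU and both raw numbers rise, but the relative gap is what tells you about the drivers.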

 

And about all GameWorks games being broken: yes, the logical conclusion may seem to be that it's GameWorks' fault. But it's been shown time and time again that the real culprits were rushed releases, tessellation, and driver overhead. I think Nvidia made really bad choices and partnered with bad companies. AMD was either smart or lucky to mostly pick partners who didn't rush their releases. BF4, for example, was a mess, and it's a Gaming Evolved title.

i7 9700K @ 5 GHz, ASUS DUAL RTX 3070 (OC), Gigabyte Z390 Gaming SLI, 2x8 HyperX Predator 3200 MHz


That's not entirely true. Some games require even more draw calls than you can get with an Nvidia GPU. [snipped: full post quoted above]

At this point I'm convinced you're a troll.

CPU i7 6700 Cooling Cryorig H7 Motherboard MSI H110i Pro AC RAM Kingston HyperX Fury 16GB DDR4 2133 GPU Pulse RX 5700 XT Case Fractal Design Define Mini C Storage Trascend SSD370S 256GB + WD Black 320GB + Sandisk Ultra II 480GB + WD Blue 1TB PSU EVGA GS 550 Display Nixeus Vue24B FreeSync 144 Hz Monitor (VESA mounted) Keyboard Aorus K3 Mechanical Keyboard Mouse Logitech G402 OS Windows 10 Home 64 bit


so how much would my 6970 get?  :P

"If a Lobster is a fish because it moves by jumping, then a kangaroo is a bird" - Admiral Paulo de Castro Moreira da Silva

"There is nothing more difficult than fixing something that isn't all the way broken yet." - Author Unknown


Intel Core i7-3960X @ 4.6 GHz - Asus P9X79WS/IPMI - 12GB DDR3-1600 quad-channel - EVGA GTX 1080ti SC - Fractal Design Define R5 - 500GB Crucial MX200 - NH-D15 - Logitech G710+ - Mionix Naos 7000 - Sennheiser PC350 w/Topping VX-1


"Game works does not affect performance negatively on other vender's cards!" -Nvidia

- Fallout 4: a 390X performs significantly worse than a 970

 

I smell bullshit. Bethesda, fix your game, and help stop this fucking vendor bias.

Updated 2021 Desktop || 3700x || Asus x570 Tuf Gaming || 32gb Predator 3200mhz || 2080s XC Ultra || MSI 1440p144hz || DT990 + HD660 || GoXLR + ifi Zen Can || Avermedia Livestreamer 513 ||

New Home Dedicated Game Server || Xeon E5 2630Lv3 || 16gb 2333mhz ddr4 ECC || 2tb Sata SSD || 8tb Nas HDD || Radeon 6450 1g display adapter ||


"Game works does not affect performance negatively on other vender's cards!" -Nvidia

- Fallout 4, 390X performs significantly worse than a 970

 

 I smell bullshit bethesda, fix you're game, and help stop this fucking vender bias.

Anyone else notice the release of the Fallout 4 edition 970 for $385 US at Best Buy? Is Radeon asleep at the wheel? I've already had to prepare myself for the possibility of returning Fallout 4, the game I built my PC to run. Oh, what a fanboy they'll lose in me.

If anyone asks you never saw me.


Anyone else notice the release of the Fallout 4 edition 970 for $385 US at Best Buy? [snipped: quoted above]

Earlier you talked about a new driver AMD released recently and your CrossFire setup.

Driver update not good enough?

This whole release was pretty sketchy from the start, given the minimum requirements.

A 7870 is on a whole other level from a 550 Ti.

"If you ain't first, you're last"


Hey, the 970 performs decently at 1440p. But the 970 beating the R9 390 at 1080p? Must be the NVIDIA driver boosting performance.


And about all GameWorks games being broken: yes, the logical conclusion may seem to be that it's GameWorks' fault. [snipped: quoted above]

Thing is, it's not just about GameWorks features. Most of these are Nvidia "The Way It's Meant to Be Played" titles, and NVIDIA boasts about working closely with the game devs. So when things happen, such as Fallout 4 not supporting enough resolutions, the FPS lock, or games being over-tessellated, it just looks bad on NVIDIA, because they are supposed to be working closely with those devs.

This is about more than AMD vs Nvidia. This is about best practices in game development, period.


Earlier you talked about a new driver AMD released recently and your CrossFire setup. Driver update not good enough? [snipped: quoted above]

Guess not; I'm pretty salty about it. Way too late to go Nvidia now, which really sucks.

If anyone asks you never saw me.


Guess not; I'm pretty salty about it. Way too late to go Nvidia now, which really sucks.

I'm not planning on upgrading until Arctic Islands/Pascal, so I still have time.

Jesus, I hope something can be done on current AMD cards, or else my next card will be Pascal.

Even if this issue were resolved by Arctic Islands, I would have lost faith in AMD's products by then.

"If you ain't first, you're last"


At this point I'm convinced you're a troll.

 

I'm convinced you're a troll. You have posted zero evidence to back up your claims.

i7 9700K @ 5 GHz, ASUS DUAL RTX 3070 (OC), Gigabyte Z390 Gaming SLI, 2x8 HyperX Predator 3200 MHz


Anyone else notice the release of the Fallout 4 edition 970 for $385 US at Best Buy? [snipped: quoted above]

 

I see it for $370. That's ridiculous, since it doesn't even come with the game. And because I'm a Bethesda fanboy, I'll probably go drop $15 on eBay for the case badge if it has Vault Boy on it. :D


Anyone else notice the release of the Fallout 4 edition 970 for $385 US at Best Buy? [snipped: quoted above]

 

Fuck me, it comes with a backplate with Vault Boy on it. I don't think I'm dumb enough to go SLI just to get a Vault Boy backplate, though.

 

[image: 70EXiQd.jpg]

 

EDIT: Fuck that, I think I could paint it myself; I'm pretty good with an airbrush, and there are so few colors. I'll just buy a regular backplate and see if I can pull that off.


Wonder if it will play nice with CrossFire...

My Systems:

Main - Work + Gaming:


Woodland Raven: Ryzen 2700X // AMD Wraith RGB // Asus Prime X570-P // G.Skill 2x 8GB 3600MHz DDR4 // Radeon RX Vega 56 // Crucial P1 NVMe 1TB M.2 SSD // Deepcool DQ650-M // chassis build in progress // Windows 10 // Thrustmaster TMX + G27 pedals & shifter

F@H Rig:


FX-8350 // Deepcool Neptwin // MSI 970 Gaming // AData 2x 4GB 1600 DDR3 // 2x Gigabyte RX-570 4G's // Samsung 840 120GB SSD // Cooler Master V650 // Windows 10

 

HTPC:


SNES PC (HTPC): i3-4150 @3.5 // Gigabyte GA-H87N-Wifi // G.Skill 2x 4GB DDR3 1600 // Asus Dual GTX 1050Ti 4GB OC // AData SP600 128GB SSD // Pico 160XT PSU // Custom SNES Enclosure // 55" LG LED 1080p TV  // Logitech wireless touchpad-keyboard // Windows 10 // Build Log

Laptops:


MY DAILY: Lenovo ThinkPad T410 // 14" 1440x900 // i5-540M 2.5GHz Dual-Core HT // Intel HD iGPU + Quadro NVS 3100M 512MB dGPU // 2x4GB DDR3L 1066 // Mushkin Triactor 480GB SSD // Windows 10

 

WIFE'S: Dell Latitude E5450 // 14" 1366x768 // i5-5300U 2.3GHz Dual-Core HT // Intel HD5500 // 2x4GB RAM DDR3L 1600 // 500GB 7200 HDD // Linux Mint 19.3 Cinnamon

 

EXPERIMENTAL: Pinebook // 11.6" 1080p // Manjaro KDE (ARM)

NAS:


Home NAS: Pentium G4400 @3.3 // Gigabyte GA-Z170-HD3 // 2x 4GB DDR4 2400 // Intel HD Graphics // Kingston A400 120GB SSD // 3x Seagate Barracuda 2TB 7200 HDDs in RAID-Z // Cooler Master Silent Pro M 1000w PSU // Antec Performance Plus 1080AMG // FreeNAS OS

 


YES THIS IS SOME GREAT NVIDIA TECH RIGHT HERE.

[image: fallout-4-god-rays-quality-performance.p]

 

That looks to be very tessellation heavy...

i5 2400 | ASUS RTX 4090 TUF OC | Seasonic 1200W Prime Gold | WD Green 120gb | WD Blue 1tb | some ram | a random case

 


Nah, never mind; the backplate that fits my 970 has way too many holes in the wrong places to paint that on.

 

[image: 100-BP-0972-B9_XL_8.jpg]


That looks to be very tessellation heavy...

 

Yep, called it. In fact, everyone did days ago, when Bethesda published that article and mentioned tessellation. It's just ONE setting that kills performance. Instead of hair, now it's God Rays... so, in a metaphorical way, kinda hair again.

-------

Current Rig

-------


That looks to be very tessellation heavy...

I'd have hoped the 980 Ti would stay well above 60 FPS even on Ultra, since these aren't Witcher 3-quality visuals...

Without Gamewreck:

 

The 390X gained 37% average FPS and 55% minimum FPS.
The 980 gained 22% average and 44% minimum FPS.

 

[image: 3mAIUyW.png]

An old quote:

 

"Gameworks hit performance for both vendors but it hit more on AMD hardwares." - crazy mofo

| Intel i7-3770@4.2Ghz | Asus Z77-V | Zotac 980 Ti Amp! Omega | DDR3 1800mhz 4GB x4 | 300GB Intel DC S3500 SSD | 512GB Plextor M5 Pro | 2x 1TB WD Blue HDD |
 | Enermax NAXN82+ 650W 80Plus Bronze | Fiio E07K | Grado SR80i | Cooler Master XB HAF EVO | Logitech G27 | Logitech G600 | CM Storm Quickfire TK | DualShock 4 |


Without Gamewreck: the 390X gained 37% average FPS and 55% minimum FPS; the 980 gained 22% average and 44% minimum FPS. [snipped: chart and quote above]

 

The percentages look bad, but look at the absolute numbers as well: Nvidia gained 22.4 FPS while AMD gained 22.6. Within any reasonable margin of error, they gained the same amount of FPS.

 

For fun: Nvidia gained 16 FPS on the minimum while AMD gained 15; again, close enough to call the same.
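Working backwards from those figures (assuming the percentages and the absolute gains describe the same benchmark runs), the with-GameWorks baselines fall out like this:

    # Back out each card's baseline FPS from the quoted gains.
    gains = {
        "390X avg": (22.6, 0.37),   # (FPS gained, fractional gain)
        "980 avg":  (22.4, 0.22),
        "390X min": (15.0, 0.55),
        "980 min":  (16.0, 0.44),
    }
    for name, (fps, pct) in gains.items():
        base = fps / pct
        print(f"{name}: {base:.0f} -> {base + fps:.0f} FPS")
    # 390X avg: 61 -> 84    980 avg: 102 -> 124
    # 390X min: 27 -> 42    980 min:  36 -> 52

Both cards recover roughly the same absolute framerate; the scarier AMD percentage is mostly an artifact of the lower AMD baseline.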


Played for about 8 hours last night. Here's the tl;dr version:

 

1680x1050, everything maxed out, rock-solid 30 FPS (VSync is ON). I'll take a look at the framerate with VSync OFF later today (after I get home).

 

Zero crashes, and minimal load times (which are few and far between anyway).

 

More than happy with the performance :)

Remember kids, the only difference between screwing around and science is writing it down. - Adam Savage

 

PHOΞNIX Ryzen 5 1600 @ 3.75GHz | Corsair LPX 16Gb DDR4 @ 2933 | MSI B350 Tomahawk | Sapphire RX 480 Nitro+ 8Gb | Intel 535 120Gb | Western Digital WD5000AAKS x2 | Cooler Master HAF XB Evo | Corsair H80 + Corsair SP120 | Cooler Master 120mm AF | Corsair SP120 | Icy Box IB-172SK-B | OCZ CX500W | Acer GF246 24" + AOC <some model> 21.5" | Steelseries Apex 350 | Steelseries Diablo 3 | Steelseries Syberia RAW Prism | Corsair HS-1 | Akai AM-A1

D.VA coming soon™ xoxo

Sapphire Acer Aspire 1410 Celeron 743 | 3Gb DDR2-667 | 120Gb HDD | Windows 10 Home x32

Vault Tec Celeron 420 | 2Gb DDR2-667 | Storage pending | Open Media Vault

gh0st Asus K50IJ T3100 | 2Gb DDR2-667 | 40Gb HDD | Ubuntu 17.04

Diskord Apple MacBook A1181 Mid-2007 Core2Duo T7400 @2.16GHz | 4Gb DDR2-667 | 120Gb HDD | Windows 10 Pro x32

Firebird//Phoeniix FX-4320 | Gigabyte 990X-Gaming SLI | Asus GTS 450 | 16Gb DDR3-1600 | 2x Intel 535 250Gb | 4x 10Tb Western Digital Red | 600W Segotep custom refurb unit | Windows 10 Pro x64 // offisite backup and dad's PC

 

Saint Olms Apple iPhone 6 16Gb Gold

Archon Microsoft Lumia 640 LTE

Gulliver Nokia Lumia 1320

Werkfern Nokia Lumia 520

Hydromancer Acer Liquid Z220


No, they haven't. Have you been living under a rock? Why the hell do you think AMD performs badly in CPU-bound games? This is why:

 

AMD: [API Overhead screenshot missing]

Nvidia: [API Overhead screenshot missing]

That's a 32% difference.

 

The test LITERALLY says you cannot compare the numbers between systems OR vendors. Your entire point is moot.

 

YES THIS IS SOME GREAT NVIDIA TECH RIGHT HERE.

[image: fallout-4-god-rays-quality-performance.p]

 

Yup. @Kloaked made a sarcastic comment that it was NVidia's fault the game runs like crap; ironically, he wasn't wrong. Sure, the game engine might not be very efficient or good per se, but this is the most taxing setting in the game, officially made in cooperation with NVidia. Odd to see NVidia delete their own performance guide; I wonder what they let slip that they shouldn't have.

 

Without Gamewreck: the 390X gained 37% average FPS and 55% minimum FPS; the 980 gained 22% average and 44% minimum FPS. [snipped: quoted above]

 

It's astounding that people still cannot see it. NVidia tech like GameWorks, God Rays, etc. hurts performance for everyone, but especially AMD. With the source code unavailable to AMD, they will always suffer in these games, and the blame lands on AMD's drivers. It's getting old fast, and people should know better by now.

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


It's astounding that people still cannot see it. NVidia tech like GameWorks, God Rays, etc. hurts performance for everyone, but especially AMD.

A lot of NVIDIA users look at TWIMTBP titles on launch day and feel good about things like 15% better performance than AMD. They don't stop to think about how much absolute performance was sacrificed to achieve that goal, i.e., whether the game was built in the optimal way, with the best possible performance in general.
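To put made-up numbers on that (purely illustrative, not benchmarks):

    # Hypothetical launch-day FPS with the heavy vendor effects enabled:
    nv_shipped, amd_shipped = 60, 52
    print((nv_shipped - amd_shipped) / amd_shipped)   # ~0.15, the 15% "win"

    # Hypothetical FPS the same game might hit if it were built without
    # the extra tessellation load in the first place:
    nv_lean, amd_lean = 85, 82

Chasing a 15% relative lead can cost both camps 25+ absolute FPS, and that trade-off is invisible in launch-day comparison bars.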

