Red Dead Redemption 2 PC benchmarks - move over Crysis; UPDATED

Humbug
2 hours ago, CTR640 said:

Excuse me, but why are you implying I spent $1000 on it TWO years ago? No, I didn't. I waited for the right moment for it to drop under €700, and it did: €680, and that was last year. I'd had my 780 since April 2014, and in September 2018 it got upgraded. I am very well aware the 1080 Ti will be weak in the future. I expected R* to optimise RDR2 as they did with GTAV, but maybe most of us were naive.

 

And now, I don't play Fallout 76, but I heard it's bad. But I only play open-world games, in this case only GTA, and some older games too. I'll see when I get RDR2.

Because you're saying that you'll stick to console instead of playing this game on PC with much better hardware. The One X is the only console that even achieves 30fps at a flat 4K, and it does so by using techniques like interlacing the frames to reach a "4K" resolution. On the PS4 Pro in 4K, it's highly noticeable. Just put the settings on high and enjoy it in 1080p or 1440p in its entire glory. The 1080 Ti is not going to be powerful forever, for every game. RDR2 is obviously ahead of its time, much like GTA V was. GTA V favored the new 900 series over the 700 series at the time.

 

The 780 was great for 1080p at mixed very high/high settings at the time. But at 1440p, you weren't going to get above 60fps unless you had the 980 or a Titan. It was highly obvious from benchmarks at release in 2015.

 

The game became much better optimized over time, just like any other game. These are day-one launch results, and the game is amazing.

*Insert Name* R̶y̶z̶e̶n̶ Intel Build!  https://linustechtips.com/main/topic/748542-insert-name-r̶y̶z̶e̶n̶-intel-build/

Case: NZXT S340 Elite Matte White Motherboard: Gigabyte AORUS Z270X Gaming 5 CPU: Intel Core i7 7700K GPU: ASUS STRIX OC GTX 1080 RAM: Corsair Ballistix Sport LT 2400mhz Cooler: Enermax ETS-T40F-BK PSU: Corsair CX750M SSD: PNY CS1311 120GB HDD: Seagate Momentum 2.5" 7200RPM 500GB

 


https://kotaku.com/grand-theft-auto-v-benchmarked-pushing-pc-graphics-to-1698670906

 

That is from back in 2015. The results look almost exactly like the results we have now. Notice the similarities? The 900 series was better than the 700 series; today, the 2000 series is just better than the 1000 series.

*Insert Name* R̶y̶z̶e̶n̶ Intel Build!  https://linustechtips.com/main/topic/748542-insert-name-r̶y̶z̶e̶n̶-intel-build/

Case: NZXT S340 Elite Matte White Motherboard: Gigabyte AORUS Z270X Gaming 5 CPU: Intel Core i7 7700K GPU: ASUS STRIX OC GTX 1080 RAM: Corsair Ballistix Sport LT 2400mhz Cooler: Enermax ETS-T40F-BK PSU: Corsair CX750M SSD: PNY CS1311 120GB HDD: Seagate Momentum 2.5" 7200RPM 500GB

 


On 11/7/2019 at 11:31 PM, Brehohn said:

Because you're saying that you'll stick to console instead of playing this game on PC with much better hardware. The One X is the only console that even achieves 30fps at a flat 4K, and it does so by using techniques like interlacing the frames to reach a "4K" resolution. On the PS4 Pro in 4K, it's highly noticeable. Just put the settings on high and enjoy it in 1080p or 1440p in its entire glory. The 1080 Ti is not going to be powerful forever, for every game. RDR2 is obviously ahead of its time, much like GTA V was. GTA V favored the new 900 series over the 700 series at the time.

 

The 780 was great for 1080p at mixed very high/high settings at the time. But at 1440p, you weren't going to get above 60fps unless you had the 980 or a Titan. It was highly obvious from benchmarks at release in 2015.

 

The game became much better optimized over time, just like any other game. These are day-one launch results, and the game is amazing.

I'd like you to find that post where I said I'll stick to console.

DAC/AMPs:

Klipsch Heritage Headphone Amplifier

Headphones: Klipsch Heritage HP-3 Walnut, Meze 109 Pro, Beyerdynamic Amiron Home, Amiron Wireless Copper, Tygr 300R, DT880 600ohm Manufaktur, T90, Fidelio X2HR

CPU: Intel 4770, GPU: Asus RTX3080 TUF Gaming OC, Mobo: MSI Z87-G45, RAM: DDR3 16GB G.Skill, PC Case: Fractal Design R4 Black non-iglass, Monitor: BenQ GW2280


2 hours ago, Mira Yurizaki said:

I don't think it's fair to solely blame NVIDIA for not optimizing drivers for Pascal, if that is even the case. Outside of trying to do some promotional/branding deal, it's not NVIDIA's responsibility to go out and optimize someone else's application, mostly because they need access to the engine's shader code to begin with. It's up to the application's developer to contact the manufacturer for help in optimizing the shader code. If Rockstar did ask NVIDIA for help but NVIDIA chose not to incorporate the changes (which they could easily verify), Rockstar would, one would think, make a stink about this.

Yeah, we so often see people waiting for drivers to be optimized for a game, but why is that up to the GPU maker? Imo it's the developer's responsibility to optimize for the hardware. And I'm sure that, if anything, Nvidia would come to them offering all sorts of help, and it would just be up to Rockstar to allow it or not. Afaik that's generally how games end up with PhysX, HairWorks, etc.: Nvidia wants those features in, and they're not going to say no to a dev eager to help spread their brand and technology.

Solve your own audio issues  |  First Steps with RPi 3  |  Humidity & Condensation  |  Sleep & Hibernation  |  Overclocking RAM  |  Making Backups  |  Displays  |  4K / 8K / 16K / etc.  |  Do I need 80+ Platinum?

If you can read this you're using the wrong theme.  You can change it at the bottom.


14 minutes ago, Ryan_Vickers said:

Yeah, we so often see people waiting for drivers to be optimized for a game, but why is that up to the GPU maker? Imo it's the developer's responsibility to optimize for the hardware. And I'm sure that, if anything, Nvidia would come to them offering all sorts of help, and it would just be up to Rockstar to allow it or not. Afaik that's generally how games end up with PhysX, HairWorks, etc.: Nvidia wants those features in, and they're not going to say no to a dev eager to help spread their brand and technology.

Because that's how GPU drivers work? Granted, there is more Rockstar can do and less Nvidia can do on a DX12 title, but Pascal is clearly having some issue that no other architecture has, including Nvidia's own Turing. The fact that it runs pretty much fine on every card/arch from both vendors other than Pascal tells me this is a driver problem, imo. It's not like Pascal has some optimization that, say, Turing wouldn't, which Rockstar would choose to include or not.

3 hours ago, Mira Yurizaki said:

I don't think it's fair to solely blame NVIDIA for not optimizing drivers for Pascal, if that is even the case. Outside of trying to do some promotional/branding deal, it's not NVIDIA's responsibility to go out and optimize someone else's application, mostly because they need access to the engine's shader code to begin with. It's up to the application's developer to contact the manufacturer for help in optimizing the shader code. If Rockstar did ask NVIDIA for help but NVIDIA chose not to incorporate the changes (which they could easily verify), Rockstar would, one would think, make a stink about this.

What? Are you guys trying to give Rockstar shit for it not being GameWorks or something? It literally is their responsibility; that's why you get Game Ready drivers, and I'm pretty fuckin' sure Nvidia knows about the RAGE engine by now. Considering that Turing seems unaffected, and this is an engine well versed with Pascal, do you really think Rockstar fucked up here? Maybe they somehow only contacted Nvidia for Turing and then Nvidia just didn't mind that Pascal is shitting the bed?

What kind of logic is this?

MOAR COARS: 5GHz "Confirmed" Black Edition™ The Build
AMD 5950X 4.7/4.6GHz All Core Dynamic OC + 1900MHz FCLK | 5GHz+ PBO | ASUS X570 Dark Hero | 32 GB 3800MHz 14-15-15-30-48-1T GDM 8GBx4 |  PowerColor AMD Radeon 6900 XT Liquid Devil @ 2700MHz Core + 2130MHz Mem | 2x 480mm Rad | 8x Blacknoise Noiseblocker NB-eLoop B12-PS Black Edition 120mm PWM | Thermaltake Core P5 TG Ti + Additional 3D Printed Rad Mount

 


2 hours ago, S w a t s o n said:

What? Are you guys trying to give Rockstar shit for it not being GameWorks or something? It literally is their responsibility; that's why you get Game Ready drivers, and I'm pretty fuckin' sure Nvidia knows about the RAGE engine by now. Considering that Turing seems unaffected, and this is an engine well versed with Pascal, do you really think Rockstar fucked up here? Maybe they somehow only contacted Nvidia for Turing and then Nvidia just didn't mind that Pascal is shitting the bed?

What kind of logic is this?

So NVIDIA has unfettered access to Rockstar's code repositories such that they can update drivers to account for any and all changes to the RAGE engine?

 

Some companies would absolutely kill to have privilege like that.

 

You can't make optimizations unless you know what they're doing first.


I don't know what you guys are complaining about here; Nvidia or even AMD delaying optimization drivers for older-gen cards is old news.

| Intel i7-3770@4.2Ghz | Asus Z77-V | Zotac 980 Ti Amp! Omega | DDR3 1800mhz 4GB x4 | 300GB Intel DC S3500 SSD | 512GB Plextor M5 Pro | 2x 1TB WD Blue HDD |
 | Enermax NAXN82+ 650W 80Plus Bronze | Fiio E07K | Grado SR80i | Cooler Master XB HAF EVO | Logitech G27 | Logitech G600 | CM Storm Quickfire TK | DualShock 4 |


40 minutes ago, Mira Yurizaki said:

So NVIDIA has unfettered access to Rockstar's code repositories such that they can update drivers to account for any and all changes to the RAGE engine?

 

Some companies would absolutely kill to have privilege like that.

 

You can't make optimizations unless you know what they're doing first.

Oh man, I guess AMD and other Nvidia cards just magically work, huh? The largest install base, the 10 series cards, just really needed that extra "optimization".

MOAR COARS: 5GHz "Confirmed" Black Edition™ The Build
AMD 5950X 4.7/4.6GHz All Core Dynamic OC + 1900MHz FCLK | 5GHz+ PBO | ASUS X570 Dark Hero | 32 GB 3800MHz 14-15-15-30-48-1T GDM 8GBx4 |  PowerColor AMD Radeon 6900 XT Liquid Devil @ 2700MHz Core + 2130MHz Mem | 2x 480mm Rad | 8x Blacknoise Noiseblocker NB-eLoop B12-PS Black Edition 120mm PWM | Thermaltake Core P5 TG Ti + Additional 3D Printed Rad Mount

 


1 hour ago, xAcid9 said:

I don't know what you guys are complaining about here; Nvidia or even AMD delaying optimization drivers for older-gen cards is old news.

I normally don't see release notes specifying that a game was optimized for a specific architecture. There are also non-trivial architectural differences between Pascal and Turing outside of the tensor and RT cores that could give Turing a significant performance advantage, but people seem to not know that.


9 hours ago, TigerHawk said:

But in this particular game, the 2080 and 5700 XT are performing as expected, yet suddenly all of the Pascal cards, as a GPU family, don't follow this trend? Why? The most obvious conclusion to draw is that the drivers were not optimized for the Pascal architecture. nVidia is a business after all, and they are just as aware as we are of how strong an architecture Pascal was. It makes sense they would want to purposefully make them underperform in a new, very popular, first-time-on-PC title like this to get people to buy into their new stuff. Maybe "purposefully underperform" is the wrong phrase. More like... not necessarily doing their best to optimize the driver for them? Like they had their A team working on Turing optimizations and the C team, who only work Tuesdays and Thursdays, doing Pascal? lol.

 

I actually don't think this thought is that tinfoilhat-y.

The other possibility is that, since this game was built from the ground up as a modern graphics-API title with a programming paradigm closer to Vulkan/DX12 and no traditional DX11 backend, RDNA, Turing and GCN are just better equipped at an architectural level than Pascal is.


25 minutes ago, Humbug said:

The other possibility is that, since this game was built from the ground up as a modern graphics-API title with a programming paradigm closer to Vulkan/DX12 and no traditional DX11 backend, RDNA, Turing and GCN are just better equipped at an architectural level than Pascal is.

Perhaps Rockstar wanted to upgrade RAGE for next generation consoles and this is a good way to work out the kinks.


1 hour ago, Mira Yurizaki said:

I normally don't see release notes specifying that a game was optimized for a specific architecture.

And let hell break loose? No way.

1 hour ago, Mira Yurizaki said:

There are also non-trivial architectural differences between Pascal and Turing outside of the tensor and RT cores that could give Turing a significant performance advantage, but people seem to not know that.

What makes you say that? Imo Pascal is much closer to Maxwell than to Turing.

| Intel i7-3770@4.2Ghz | Asus Z77-V | Zotac 980 Ti Amp! Omega | DDR3 1800mhz 4GB x4 | 300GB Intel DC S3500 SSD | 512GB Plextor M5 Pro | 2x 1TB WD Blue HDD |
 | Enermax NAXN82+ 650W 80Plus Bronze | Fiio E07K | Grado SR80i | Cooler Master XB HAF EVO | Logitech G27 | Logitech G600 | CM Storm Quickfire TK | DualShock 4 |


38 minutes ago, xAcid9 said:

And let hell break loose? No way.

The release notes only say game optimizations were added to the drivers; they don't specify which microarchitecture they were meant for:

[Screenshot: driver release notes]

Quote

What makes you say that? Imo Pascal is much closer to Maxwell than to Turing.

It's in the Turing whitepaper. In particular these parts that don't require a game to specifically target the feature:

[Spoiler: screenshots of the relevant Turing whitepaper excerpts]

There's another under-the-hood feature, Independent Thread Scheduling, that's explained in the Volta white paper. The rest of the features Turing has have to be explicitly targeted.

 

EDIT: Digging around on the web, I think it's important for people who don't have any experience writing a driver stack to read this: https://www.gamedev.net/forums/topic/666419-what-are-your-opinions-on-dx12vulkanmantle/?tab=comments#comment-5215019

 

This person's experiences on NVIDIA's driver team, along with what Raymond Chen often says in his Windows development blogs, continue to make me believe that 9 times out of 10 an application isn't using the tools correctly, and the driver or OS development team has to find a "hack" to work around it so that said application at minimum runs. But overall, the issue is more complicated than just "lol, manufacturer can't optimize".


5 hours ago, Mira Yurizaki said:

The release notes only say game optimizations were added to the drivers; they don't specify which microarchitecture they were meant for:

5 hours ago, Mira Yurizaki said:

It's in the Turing whitepaper. In particular these parts that don't require a game to specifically target the feature:

Yeah because Nvidia is very transparent about everything.

Which page of the Maxwell whitepaper mentioned tile-based rendering again?

 

I just read TPU's latest review of the 5700 XT THICC 3 and saw that The Surge 2 (Vulkan) also shows a similar performance pattern.

[Spoiler: The Surge 2 benchmark chart, 1920x1080]

I guess it's just another case of a bad developer.

 

 

Also, this is Strange Brigade (DX12) in that same review.

[Spoiler: Strange Brigade benchmark chart, 1920x1080]

 

And this is from the 2060 release review.

[Spoiler: Strange Brigade benchmark chart from the 2060 release review, 1920x1080]

The 2080 gets 26 fps more and the 1080 Ti gets 2.3 fps less compared to the old result. Bad, bad dev.

| Intel i7-3770@4.2Ghz | Asus Z77-V | Zotac 980 Ti Amp! Omega | DDR3 1800mhz 4GB x4 | 300GB Intel DC S3500 SSD | 512GB Plextor M5 Pro | 2x 1TB WD Blue HDD |
 | Enermax NAXN82+ 650W 80Plus Bronze | Fiio E07K | Grado SR80i | Cooler Master XB HAF EVO | Logitech G27 | Logitech G600 | CM Storm Quickfire TK | DualShock 4 |


5 hours ago, xAcid9 said:

Yeah because Nvidia is very transparent about everything.

Which page of the Maxwell whitepaper mentioned tile-based rendering again?

I don't understand the point of this comment other than to suggest to me Turing has some sort of secret sauce feature. I'm not convinced there is any secret sauce feature that makes Turing magically better in most regards over Pascal unless there's something spec-wise that looks odd. If anything, knowing that Maxwell has a tile based rasterizer only served to explain how it can perform better than the last gen with a narrower memory bus.


Update - 10/11/2019 - turn on async compute for Vulkan

 

Some users, including Hardware Unboxed, had noticed some hitching/stuttering under Vulkan. It seems like Rockstar forgot to turn async compute on. It needs to be manually enabled in the config file, and it makes the frametimes smoother. This should help GCN, RDNA and possibly even Turing.

 

Open the file below:

Documents\Rockstar Games\Red Dead Redemption 2\Settings\system.xml

Look for the line below:

<asyncComputeEnabled value="false" />

Change it to true and save.
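
If you'd rather not hand-edit the XML, below is a minimal Python sketch that flips the flag for you. The file path and the exact element name (asyncComputeEnabled with a value attribute) are assumptions taken from the snippet above, so check your own system.xml first and keep a backup before running anything like this.

import shutil
import xml.etree.ElementTree as ET
from pathlib import Path

# Assumed location of RDR2's settings file (adjust if your Documents folder lives elsewhere)
settings = Path.home() / "Documents" / "Rockstar Games" / "Red Dead Redemption 2" / "Settings" / "system.xml"

# Keep a backup copy in case the game dislikes the rewritten file
shutil.copy2(settings, str(settings) + ".bak")

tree = ET.parse(settings)
node = tree.getroot().find(".//asyncComputeEnabled")  # assumed element name
if node is None:
    raise SystemExit("asyncComputeEnabled not found - edit the file by hand instead")

print("current value:", node.get("value"))
node.set("value", "true")  # enable async compute
tree.write(settings, encoding="utf-8", xml_declaration=True)
print("done - async compute enabled")

Note that ElementTree rewrites the whole file, so the formatting may change slightly; if the game rejects it, restore the .bak copy and just change the value by hand as described above.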

 

 


On 11/6/2019 at 12:42 AM, Levent said:

I haven't checked the RDR2 PC release, but I wonder if there is a significant visual upgrade over the XOX or PS4P?

 


5 hours ago, BiG StroOnZ said:

 

Damn that looks like almost enough cooling to run a 9900KS at 5.2 GHz

Solve your own audio issues  |  First Steps with RPi 3  |  Humidity & Condensation  |  Sleep & Hibernation  |  Overclocking RAM  |  Making Backups  |  Displays  |  4K / 8K / 16K / etc.  |  Do I need 80+ Platinum?

If you can read this you're using the wrong theme.  You can change it at the bottom.


It does need a beast of a computer to run maxed out, that's for sure.

My specs are as follows:

AMD 3800X running the 1usmus power profile
32GB CAS 14 3200MHz RAM

Nvidia 1080 Ti FE

OS on one NVMe drive and the game on a separate NVMe

 

Running ultrawide 3440x1440 with most things set to high and textures on Ultra. Lucky if I hit 60 fps; it mostly sits around 50, but it's smooth enough at the moment.

The Pascal cards were never amazing with DX12 or Vulkan, really.


Been playing flat out since Sunday, 1440p Ultra Preset, not had a single hiccup, skip, stutter or anything else.

 

I have had some weird flickering in certain places and some pop-in during cutscenes. Nothing too jarring, and it didn't affect gameplay at all, but it's definitely there.

 

Overall it's a solid experience, much better than some of the other shit that gets shovelled out.

 

Does anyone know how to measure FPS on Vulkan? None of my normal tools work for FPS.

 

Main Rig:-

Ryzen 7 3800X | Asus ROG Strix X570-F Gaming | 16GB Team Group Dark Pro 3600Mhz | Corsair MP600 1TB PCIe Gen 4 | Sapphire 5700 XT Pulse | Corsair H115i Platinum | WD Black 1TB | WD Green 4TB | EVGA SuperNOVA G3 650W | Asus TUF GT501 | Samsung C27HG70 1440p 144hz HDR FreeSync 2 | Ubuntu 20.04.2 LTS |

 

Server:-

Intel NUC running Server 2019 + Synology DSM218+ with 2 x 4TB Toshiba NAS Ready HDDs (RAID0)


7 minutes ago, Master Disaster said:

Does anyone know how to measure FPS on Vulkan? None of my normal tools work for FPS.

Isn't there the OSD feature in the AMD drivers?

Main system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, Corsair Vengeance Pro 3200 3x 16GB 2R, RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible


For the most part I have everything on Ultra and High, nothing lower. DX12 was super choppy on my rig; even at high frame rates it still felt meh, while Vulkan for me is smooth as butter. I was playing at 60fps with very minor drops to about 55fps and getting 75fps in some areas. Then I decided to goof around in the settings and found that turning down not only the grass detail slider but also the water physics detail slider drastically improved my FPS while still looking the same. With all the same settings, I went from 55fps low, 60fps average, 75fps max to 80fps low, 86fps avg, and 90fps max, just by setting the sliders for grass detail and water physics detail to halfway instead of 100%.

Main Desktop: CPU - i9-14900k | Mobo - Gigabyte Z690 Aorus Elite AX DDR4 | GPU - ASUS TUF Gaming OC RTX 4090 RAM - Corsair Vengeance Pro RGB 64GB 3600mhz | AIO - H150i Pro XT | PSU - Corsair RM1000X | Case - Phanteks P500A Digital - White | Storage - Samsung 970 Pro M.2 NVME SSD 512GB / Sabrent Rocket 1TB Nvme / Samsung 860 Evo Pro 500GB / Samsung 970 EVO Plus 2tb Nvme / Samsung 870 QVO 4TB  |

 

TV Streaming PC: Intel Nuc CPU - i7 8th Gen | RAM - 16GB DDR4 2666mhz | Storage - 256GB WD Black M.2 NVME SSD |

 

Phone: Samsung Galaxy Z Fold 4 - Phantom Black 512GB |

 


5 hours ago, porina said:

Isn't there the OSD feature in the AMD drivers?

Yep, doesn't work with Vulkan though. FPS shows as 0.

Main Rig:-

Ryzen 7 3800X | Asus ROG Strix X570-F Gaming | 16GB Team Group Dark Pro 3600Mhz | Corsair MP600 1TB PCIe Gen 4 | Sapphire 5700 XT Pulse | Corsair H115i Platinum | WD Black 1TB | WD Green 4TB | EVGA SuperNOVA G3 650W | Asus TUF GT501 | Samsung C27HG70 1440p 144hz HDR FreeSync 2 | Ubuntu 20.04.2 LTS |

 

Server:-

Intel NUC running Server 2019 + Synology DSM218+ with 2 x 4TB Toshiba NAS Ready HDDs (RAID0)


I'm curious, since this is a DX12 title, whether it supports Crossfire in some way...

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
AX1600i owner. https://docs.google.com/spreadsheets/d/1_GMev0EwK37J3zZL98zIqF-OSBuHlFEHmrc_SPuYsjs/edit?usp=sharing My WIP Power Supply Guide.

