Humbug

Red Dead Redemption 2 PC benchmarks - move over Crysis; UPDATED

Recommended Posts

3 hours ago, CTR640 said:

GTA4 was worse. At that time the highest-end GPU was the ATi HD4870 and it ran like shit. The whole of GTAForums.com was flooded with issues. The GTX 780 was a bit old when GTAV came out and it does very well, actually much better than expected. But if GTAV PC is a port, then it's extraordinarily well optimized. The 980 and 980 Ti are better than the 780, that's true.

I have a 1080 Ti and am not complaining a single bit. This game IS intensive to run. It's running on an engine that has TONS of ways to cater to your experience. You're saying it's buggy and unoptimized. Do we need to remind you how bad Fallout 76 was? This has launcher and loading issues, which can be fixed. This game actually supports ultrawide, HDR, in-game voice chat, etc. It could be WAY WORSE. Just because you spent $1000 TWO years ago doesn't mean your card is going to outlast 2 or 3 years. Graphics technology is moving at a fast pace right now. Get a 2080 Ti or wait for the 3000 series.

Meanwhile, don't complain about something you haven't even bought. I have a 1080 Ti, 8700K, and 32GB of RAM, and I'm enjoying this game at 45 to 60 FPS on ultrawide settings. So unless you have 4K or ultrawide, I'm sure you will be fine at high settings at 1080p or 1440p.


*Insert Name* R̶y̶z̶e̶n̶ Intel Build!  https://linustechtips.com/main/topic/748542-insert-name-r̶y̶z̶e̶n̶-intel-build/

Case: NZXT S340 Elite Matte White Motherboard: Gigabyte AORUS Z270X Gaming 5 CPU: Intel Core i7 7700K GPU: ASUS STRIX OC GTX 1080 RAM: Corsair Ballistix Sport LT 2400mhz Cooler: Enermax ETS-T40F-BK PSU: Corsair CX750M SSD: PNY CS1311 120GB HDD: Seagate Momentum 2.5" 7200RPM 500GB

 

11 minutes ago, Brehohn said:

I have a 1080 Ti and am not complaining a single bit. This game IS intensive to run. It's running on an engine that has TONS of ways to cater to your experience. You're saying it's buggy and unoptimized. Do we need to remind you how bad Fallout 76 was? This has launcher and loading issues, which can be fixed. This game actually supports ultrawide, HDR, in-game voice chat, etc. It could be WAY WORSE. Just because you spent $1000 TWO years ago doesn't mean your card is going to outlast 2 or 3 years. Graphics technology is moving at a fast pace right now. Get a 2080 Ti or wait for the 3000 series.

Meanwhile, don't complain about something you haven't even bought. I have a 1080 Ti, 8700K, and 32GB of RAM, and I'm enjoying this game at 45 to 60 FPS on ultrawide settings. So unless you have 4K or ultrawide, I'm sure you will be fine at high settings at 1080p or 1440p.

Excuse me, but why are you implying I spent $1000 TWO years ago on it? No, I didn't. I waited for the right moment for it to drop under €700, and it did: €680, and that was last year. I'd had my 780 since April 2014, and in September 2018 it got upgraded. I am very well aware the 1080 Ti will be weak in the future. I expected R* to optimise RDR2 as they did with GTAV, but maybe most of us were naive.

As for Fallout 76, I don't play it, but I heard it's bad. I only play open-world games, in this case mostly GTA plus some older games. I'll see when I get RDR2.

56 minutes ago, Brehohn said:

I have a 1080 Ti and am not complaining a single bit. This game IS intensive to run. It's running on an engine that has TONS of ways to cater to your experience. You're saying it's buggy and unoptimized. Do we need to remind you how bad Fallout 76 was? This has launcher and loading issues, which can be fixed. This game actually supports ultrawide, HDR, in-game voice chat, etc. It could be WAY WORSE. Just because you spent $1000 TWO years ago doesn't mean your card is going to outlast 2 or 3 years. Graphics technology is moving at a fast pace right now. Get a 2080 Ti or wait for the 3000 series.

Meanwhile, don't complain about something you haven't even bought. I have a 1080 Ti, 8700K, and 32GB of RAM, and I'm enjoying this game at 45 to 60 FPS on ultrawide settings. So unless you have 4K or ultrawide, I'm sure you will be fine at high settings at 1080p or 1440p.

My beef isn't that my $1000 card is getting old. My beef is that, in literally every other title (well, every other title that review sites benchmark with) it performs roughly on par with the 2080 or 5700 XT. There are a few percent here and there swinging in either direction for all three, but they are mostly within +/- 10%. You can generalize that across most of the AAA games out right now and be pretty safe in expecting a certain level of performance.

But in this particular game, the 2080 and 5700 XT are performing as expected, yet suddenly all of the Pascal cards, as a GPU family, don't follow this trend? Why? The most obvious conclusion to draw is that the drivers were not optimized for the Pascal architecture. NVIDIA is a business after all, and they are just as aware as we are of how strong an architecture Pascal was. It makes sense they would want to purposefully make it underperform in a new, very popular, first-time-on-PC title like this to get people to buy into their new stuff. Maybe "purposefully underperform" is the wrong phrase. More like... not necessarily doing their best to optimize the driver for it? Like they had their A team working on Turing optimizations and the C team, who only works Tuesdays and Thursdays, doing Pascal? lol.

I actually don't think this thought is that tinfoil-hat-y.

15 hours ago, Humbug said:

 

The question is: for those of you who have played it on PC, is it the best-looking game to date, technically?

It's hard to judge from compressed online videos and static screenshots; games always look better in person. If the graphics horsepower has been put to good, efficient use then it's very justified IMO.

It would be super useful if anyone could answer this. It looks pretty amazing from the videos but I can't be sure. The snow especially.


I don't think it's fair to solely blame NVIDIA for not optimizing drivers for Pascal, if that is even the case. Outside of trying to do some promotional/branding deal, it's not NVIDIA's responsibility to go out and optimize someone else's application, mostly because they would need access to the engine's shader code to begin with. It's up to the application's developer to contact the manufacturer for help in optimizing the shader code. If Rockstar did ask NVIDIA for help but NVIDIA chose not to incorporate the changes (which they could easily verify), Rockstar would, one would think, make a stink about this.

2 hours ago, CTR640 said:

Excuse me, but why are you implying I spent $1000 TWO years ago on it? No, I didn't. I waited for the right moment for it to drop under €700, and it did: €680, and that was last year. I'd had my 780 since April 2014, and in September 2018 it got upgraded. I am very well aware the 1080 Ti will be weak in the future. I expected R* to optimise RDR2 as they did with GTAV, but maybe most of us were naive.

As for Fallout 76, I don't play it, but I heard it's bad. I only play open-world games, in this case mostly GTA plus some older games. I'll see when I get RDR2.

Because you're saying that you'll stick to console instead of playing this game on PC with much better hardware. The One X is the only console that even achieves 30 FPS at a flat 4K, and it uses techniques like interlacing frames to reach that "4K" resolution. On the PS4 Pro in 4K, it's highly noticeable. Just put the settings on high and enjoy it in 1080p or 1440p in all its glory. The 1080 Ti is not going to be powerful forever, for every game. RDR2 is obviously ahead of its time, much like GTA V was. GTA V favored the new 900 series over the 700 series at the time.

The 780 was great for 1080p at mixed very high/high settings at the time. But at 1440p, you weren't going to get above 60 FPS unless you had a 980 or a Titan. That's obvious from the benchmarks at its 2015 release.

The game became much better optimized over time, just like any other game. This is a day-1 launch result, and the game is amazing.


*Insert Name* R̶y̶z̶e̶n̶ Intel Build!  https://linustechtips.com/main/topic/748542-insert-name-r̶y̶z̶e̶n̶-intel-build/

Case: NZXT S340 Elite Matte White Motherboard: Gigabyte AORUS Z270X Gaming 5 CPU: Intel Core i7 7700K GPU: ASUS STRIX OC GTX 1080 RAM: Corsair Ballistix Sport LT 2400mhz Cooler: Enermax ETS-T40F-BK PSU: Corsair CX750M SSD: PNY CS1311 120GB HDD: Seagate Momentum 2.5" 7200RPM 500GB

 


https://kotaku.com/grand-theft-auto-v-benchmarked-pushing-pc-graphics-to-1698670906

 

That is from back in 2015. The results look almost exactly like the results we have now. Notice the similarities? The 900 series is better than the 700 series. Today, the 2000 series is simply better than the 1000 series.


*Insert Name* R̶y̶z̶e̶n̶ Intel Build!  https://linustechtips.com/main/topic/748542-insert-name-r̶y̶z̶e̶n̶-intel-build/

Case: NZXT S340 Elite Matte White Motherboard: Gigabyte AORUS Z270X Gaming 5 CPU: Intel Core i7 7700K GPU: ASUS STRIX OC GTX 1080 RAM: Corsair Ballistix Sport LT 2400mhz Cooler: Enermax ETS-T40F-BK PSU: Corsair CX750M SSD: PNY CS1311 120GB HDD: Seagate Momentum 2.5" 7200RPM 500GB

 

On 11/7/2019 at 11:31 PM, Brehohn said:

Because you're saying that you'll stick to console instead of playing this game on PC with much better hardware. The One X is the only console that even achieves 30 FPS at a flat 4K, and it uses techniques like interlacing frames to reach that "4K" resolution. On the PS4 Pro in 4K, it's highly noticeable. Just put the settings on high and enjoy it in 1080p or 1440p in all its glory. The 1080 Ti is not going to be powerful forever, for every game. RDR2 is obviously ahead of its time, much like GTA V was. GTA V favored the new 900 series over the 700 series at the time.

The 780 was great for 1080p at mixed very high/high settings at the time. But at 1440p, you weren't going to get above 60 FPS unless you had a 980 or a Titan. That's obvious from the benchmarks at its 2015 release.

The game became much better optimized over time, just like any other game. This is a day-1 launch result, and the game is amazing.

I'd like you to find that post where I said I'll stick to console.

2 hours ago, Mira Yurizaki said:

I don't think it's fair to solely blame NVIDIA for not optimizing drivers for Pascal, if that is even the case. Outside of trying to do some promotional/branding deal, it's not NVIDIA's responsibility to go out and optimize someone else's application, mostly because they would need access to the engine's shader code to begin with. It's up to the application's developer to contact the manufacturer for help in optimizing the shader code. If Rockstar did ask NVIDIA for help but NVIDIA chose not to incorporate the changes (which they could easily verify), Rockstar would, one would think, make a stink about this.

Yeah, so often we see people waiting for drivers to be optimized for a game, but why is that up to the GPU maker? IMO it's the developer's responsibility to optimize for the hardware. And I'm sure, if anything, NVIDIA would come to them offering all sorts of help, and it would just be up to Rockstar to allow it or not. AFAIK that's generally how games end up with PhysX, HairWorks, etc. NVIDIA wants those features in; they're not going to say no to a dev eager to help spread their brand and technology.

14 minutes ago, Ryan_Vickers said:

Yeah, so often we see people waiting for drivers to be optimized for a game, but why is that up to the GPU maker? IMO it's the developer's responsibility to optimize for the hardware. And I'm sure, if anything, NVIDIA would come to them offering all sorts of help, and it would just be up to Rockstar to allow it or not. AFAIK that's generally how games end up with PhysX, HairWorks, etc. NVIDIA wants those features in; they're not going to say no to a dev eager to help spread their brand and technology.

Because that's how GPU drivers work? Granted, there is more Rockstar can do and less NVIDIA can do on a DX12 title, but Pascal is clearly having some issue that no other architecture has, including NVIDIA's own Turing. The fact that it runs pretty much fine on every card/architecture from both vendors other than Pascal tells me this is a driver problem, IMO. It's not like Pascal has some optimization that, say, Turing wouldn't, which Rockstar would choose to include or not.

3 hours ago, Mira Yurizaki said:

I don't think it's fair to solely blame NVIDIA for not optimizing drivers for Pascal, if that is even the case. Outside of trying to do some promotional/branding deal, it's not NVIDIA's responsibility to go out and optimize someone else's application, mostly because they would need access to the engine's shader code to begin with. It's up to the application's developer to contact the manufacturer for help in optimizing the shader code. If Rockstar did ask NVIDIA for help but NVIDIA chose not to incorporate the changes (which they could easily verify), Rockstar would, one would think, make a stink about this.

What? Are you guys trying to give Rockstar shit for it not being GameWorks or something??? It literally is their responsibility; that's why you get Game Ready drivers, and I'm pretty fuckin sure NVIDIA knows about the RAGE engine by now. Considering that Turing seems unaffected, and this is an engine that's well acquainted with Pascal, do you really think Rockstar fucked up here? Maybe they somehow only contacted NVIDIA about Turing and then NVIDIA just didn't mind that Pascal is shitting the bed????

What kind of logic is this?

2 hours ago, S w a t s o n said:

What? Are you guys trying to give Rockstar shit for it not being GameWorks or something??? It literally is their responsibility; that's why you get Game Ready drivers, and I'm pretty fuckin sure NVIDIA knows about the RAGE engine by now. Considering that Turing seems unaffected, and this is an engine that's well acquainted with Pascal, do you really think Rockstar fucked up here? Maybe they somehow only contacted NVIDIA about Turing and then NVIDIA just didn't mind that Pascal is shitting the bed????

What kind of logic is this?

So NVIDIA has unfettered access to Rockstar's code repositories such that they can update drivers to account for any and all changes to the RAGE engine?

 

Some companies would absolutely kill to have a privilege like that.

 

You can't make optimizations unless you know what they're doing first.


I don't know what you guys are complaining about here; NVIDIA or even AMD delaying optimized drivers for older-gen cards is old news.


| Intel i7-3770@4.2Ghz | Asus Z77-V | Zotac 980 Ti Amp! Omega | DDR3 1800mhz 4GB x4 | 300GB Intel DC S3500 SSD | 512GB Plextor M5 Pro | 2x 1TB WD Blue HDD |
 | Enermax NAXN82+ 650W 80Plus Bronze | Fiio E07K | Grado SR80i | Cooler Master XB HAF EVO | Logitech G27 | Logitech G600 | CM Storm Quickfire TK | DualShock 4 |

40 minutes ago, Mira Yurizaki said:

So NVIDIA has unfettered access to Rockstar's code repositories such that they can update drivers to account for any and all changes to the RAGE engine?

 

Some companies would absolutely kill to have a privilege like that.

 

You can't make optimizations unless you know what they're doing first.

Oh man, I guess AMD cards and NVIDIA's other cards just magically work, huh? The largest install base, the 10-series cards, just really needed that extra "optimization".

1 hour ago, xAcid9 said:

I don't know what you guys are complaining about here; NVIDIA or even AMD delaying optimized drivers for older-gen cards is old news.

I normally don't see release notes specifying that a game was optimized for a specific architecture. There are also non-trivial architectural differences between Pascal and Turing outside of the tensor and RT cores that could give Turing a significant performance advantage, but people seem to not know that.

Posted · Original Poster (OP)
9 hours ago, TigerHawk said:

But in this particular game, the 2080 and 5700 XT are performing as expected, yet suddenly all of the Pascal cards, as a GPU family, don't follow this trend? Why? The most obvious conclusion to draw is that the drivers were not optimized for the Pascal architecture. NVIDIA is a business after all, and they are just as aware as we are of how strong an architecture Pascal was. It makes sense they would want to purposefully make it underperform in a new, very popular, first-time-on-PC title like this to get people to buy into their new stuff. Maybe "purposefully underperform" is the wrong phrase. More like... not necessarily doing their best to optimize the driver for it? Like they had their A team working on Turing optimizations and the C team, who only works Tuesdays and Thursdays, doing Pascal? lol.

I actually don't think this thought is that tinfoil-hat-y.

The other possibility is that, since this game was built from the ground up as a modern graphics-API title with a programming paradigm closer to Vulkan/DX12 and no traditional DX11 backend, RDNA, Turing and GCN are simply better equipped for it at an architectural level than Pascal is.

25 minutes ago, Humbug said:

The other possibility is that, since this game was built from the ground up as a modern graphics-API title with a programming paradigm closer to Vulkan/DX12 and no traditional DX11 backend, RDNA, Turing and GCN are simply better equipped for it at an architectural level than Pascal is.

Perhaps Rockstar wanted to upgrade RAGE for next generation consoles and this is a good way to work out the kinks.

1 hour ago, Mira Yurizaki said:

I normally don't see release notes specifying that a game was optimized for a specific architecture.

And let hell break loose? No way.

1 hour ago, Mira Yurizaki said:

There are also non-trivial architectural differences between Pascal and Turing outside of the tensor and RT cores that could give Turing a significant performance advantage, but people seem to not know that.

What makes you say that? IMO Pascal is much closer to Maxwell than to Turing.


| Intel i7-3770@4.2Ghz | Asus Z77-V | Zotac 980 Ti Amp! Omega | DDR3 1800mhz 4GB x4 | 300GB Intel DC S3500 SSD | 512GB Plextor M5 Pro | 2x 1TB WD Blue HDD |
 | Enermax NAXN82+ 650W 80Plus Bronze | Fiio E07K | Grado SR80i | Cooler Master XB HAF EVO | Logitech G27 | Logitech G600 | CM Storm Quickfire TK | DualShock 4 |

38 minutes ago, xAcid9 said:

And let hell break loose? No way.

The release notes only say game optimizations were added to the drivers; they don't specify which microarchitecture they were meant for:

[screenshot of driver release notes]

Quote

What makes you say that? IMO Pascal is much closer to Maxwell than to Turing.

It's in the Turing whitepaper. In particular these parts that don't require a game to specifically target the feature:

Spoiler

[screenshots of Turing whitepaper excerpts]

 

There's another under-the-hood feature, Independent Thread Scheduling, that's explained in the Volta white paper. The rest of the features Turing has have to be explicitly targeted.

 

EDIT: Digging around on the web, I think it's important for people who don't have any experience writing a driver stack to read this: https://www.gamedev.net/forums/topic/666419-what-are-your-opinions-on-dx12vulkanmantle/?tab=comments#comment-5215019

 

This person's experience on NVIDIA's driver team, along with what Raymond Chen often says in his Windows development blog, continues to make me believe that 9 times out of 10 an application isn't using the tools correctly and the driver or OS development team has to find a "hack" to work around it so that said application at least runs. But overall, the issue is more complicated than just "lol, manufacturer can't optimize".

5 hours ago, Mira Yurizaki said:

The release notes only say game optimizations were added to the drivers; they don't specify which microarchitecture they were meant for:

5 hours ago, Mira Yurizaki said:

It's in the Turing whitepaper. In particular these parts that don't require a game to specifically target the feature:

Yeah, because NVIDIA is very transparent about everything.

On what page of the Maxwell whitepaper was tile-based rendering mentioned, again? 🤔

 

I just read TPU's latest review of the 5700 XT THICC 3 and saw that The Surge 2 (Vulkan) also shows a similar performance pattern.

Spoiler

[TPU benchmark chart: The Surge 2, 1920x1080]

I guess it's just another case of a bad developer. 🤷‍♂️

 

 

Also, this is Strange Brigade (DX12) in that same review.

Spoiler

[TPU benchmark chart: Strange Brigade, 1920x1080]

 

And this is from the 2060 release review.

Spoiler

[TPU benchmark chart: Strange Brigade, 1920x1080, from the RTX 2060 review]

The 2080 gets 26 FPS more and the 1080 Ti gets 2.3 FPS less compared to the old result. Bad, bad dev.


| Intel i7-3770@4.2Ghz | Asus Z77-V | Zotac 980 Ti Amp! Omega | DDR3 1800mhz 4GB x4 | 300GB Intel DC S3500 SSD | 512GB Plextor M5 Pro | 2x 1TB WD Blue HDD |
 | Enermax NAXN82+ 650W 80Plus Bronze | Fiio E07K | Grado SR80i | Cooler Master XB HAF EVO | Logitech G27 | Logitech G600 | CM Storm Quickfire TK | DualShock 4 |

5 hours ago, xAcid9 said:

Yeah, because NVIDIA is very transparent about everything.

On what page of the Maxwell whitepaper was tile-based rendering mentioned, again? 🤔

I don't understand the point of this comment other than to suggest to me that Turing has some sort of secret-sauce feature. I'm not convinced there is any secret-sauce feature that makes Turing magically better than Pascal in most regards unless there's something spec-wise that looks odd. If anything, knowing that Maxwell has a tile-based rasterizer only served to explain how it can perform better than the previous generation with a narrower memory bus.

Posted · Original Poster (OP)

Update - 10/11/2019 - turn on async compute for Vulkan

 

Some users, including Hardware Unboxed, had noticed hitching/stuttering under Vulkan. It seems Rockstar forgot to turn async compute on. It needs to be manually enabled in the config file, and doing so makes the frametimes smoother. This should help GCN, RDNA and possibly even Turing.

 

Open the file below:

Documents\Rockstar games\Red Dead Redemption 2\Settings\system.xml

Look for the line below:

<asynComputeEnabled value="false" />

Change the value to true and save.
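If you'd rather not hand-edit the XML, here's a minimal Python sketch (my own, not from Rockstar) that flips the flag. It assumes the default Documents location and the element name exactly as shown above; check both against your own install, and it writes a backup first.

# Minimal sketch: enable async compute in RDR2's system.xml.
# Assumes the default Documents path and the element name shown above;
# adjust both if your install differs. ElementTree rewrites the file,
# so any comments/formatting inside the XML may change.
import shutil
import xml.etree.ElementTree as ET
from pathlib import Path

settings = (Path.home() / "Documents" / "Rockstar games"
            / "Red Dead Redemption 2" / "Settings" / "system.xml")

# Keep a copy of the original in case the game rejects the rewritten file.
shutil.copy(settings, settings.with_name(settings.name + ".bak"))

tree = ET.parse(settings)
root = tree.getroot()

# The flag may be the root element itself or nested somewhere under it.
node = root if root.tag == "asynComputeEnabled" else root.find(".//asynComputeEnabled")
if node is None:
    raise SystemExit("asynComputeEnabled not found - edit the file by hand instead")

node.set("value", "true")
tree.write(settings, encoding="utf-8", xml_declaration=True)
print("Async compute enabled; launch the game to pick up the change.")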

 

 

Posted · Original Poster (OP)
On 11/6/2019 at 12:42 AM, Levent said:

I haven't checked the RDR2 PC release, but I wonder if there is a significant visual upgrade over the XOX or PS4 Pro?

 


[image attachment]


                                                                                                                     "..:   Y Gwir Yn Erbyn Y Byd      :.."

           "-] Peace Love Unity Respect [-"   

          " O la vittoria, o tutti accoppati! "  

 

                                                   How to free up space on your SSD                                                       Kymatica Revision:                                                                                                                      

Spoiler

CPU: Intel Core i7-2600k @ 4.4GHz Motherboard: ASRock Z68 Extreme4 Gen3 GPU: Gigabyte GeForce GTX 1660 Ti OC 6G 2x Windforce Memory: G.Skill Ripjaws X Series 16GB @ 2133MHz @ 9-10-11-28 SSD: Crucial M500 240GB (OS/Programs/Path of Exile/Grim Dawn) HDD1: WD 1TB Blue (Diablo III/Other Games/Storage/Media) HDD2: Seagate Barracuda 7.2K 500GB (Backup) HDD3: WD Caviar 7.2K 500GB (Backup) HDD4: WD Elements 4TB External WDBWLG0040HBK-NESN (Backup/Additional Storage)  CPU Cooling: Corsair Hydro Series H100 in Pull (w/ 2x Delta FFB1212EH 120mm) Case Fans: Noctua NF F12 industrialPPC-2000 (x3 120mm) PSU: Seasonic X-Series X-1050 1050W Case: Cooler Master HAF 922 Monitor: Samsung C27F396 Curved 27-Inch Freesync Monitor (@ 1440p @ 72Hz) Keyboard: Cooler Master Storm Trigger Z (Cherry MX Brown Switches) Mouse: Roccat Kone XTD Mousepad: Corsair MM350 Premium Audio: Logitech X-530 5.1 Speaker System Headset: Corsair VOID Stereo Gaming Headset (w/ Sennheiser 3D G4ME 7.1 Surround Amplifier) OS: Windows 10 Professional (Version 1903 OS Build 18362.535)

 

Link to post
Share on other sites
5 hours ago, BiG StroOnZ said:

 

Damn that looks like almost enough cooling to run a 9900KS at 5.2 GHz


It does need a beast of a computer to run maxed out, that's for sure.

My specs are as follows:

AMD 3800X running the 1usmus power profile
32GB CAS 14 3200MHz RAM

NVIDIA 1080 Ti FE

OS on NVMe and the game on a separate NVMe drive

Running ultrawide 3440x1440 with most things set to high and textures on ultra. Lucky if I hit 60 FPS; it mostly sits around 50, but it's smooth enough at the moment.

The Pascal cards were never amazing with DX12 or Vulkan, really.
