
A PLAGUE TALE REQUIEM - What's wrong with them?

fAED

I saw some YT videos and they're disgusting to watch... Apparently NVIDIA has a partnership, so the game is optimized for Nvidia, but that's totally fine; AMD also has its "own" games that are better optimized for its GPUs.

 

At the recommended system requirements:

  1. Nvidia GPU: RTX 3070
  2. AMD GPU: RX 6800 XT

- The game actually requires you to activate DLSS in order to play it with 'good fps' (there is no FSR), and it's probably forcing ray tracing on you too.

- A 6600 XT on ULTRA settings at 1920x1080 will get you around 45 fps with a 5900X and 32 GB at 3200 MHz.

 

- I mean, is this the way the gaming industry is going? Shouldn't everyone be able to play the game at max settings with at least 60 fps on a card like the 6600 XT? The game looks awesome, but I hope developers really think about the gaming community, because not everyone can afford the latest-gen GPUs on the market.

The Nvidia 4090 marketing team is using this game to show off its power, and it seems obvious that these types of games (AAA) are chasing the FPS numbers that the newest cards can bring you, rather than actually being playable on lower tiers of GPUs.

 

Comments even say that a 3080 with DLSS turned off gets stutters, with the game dropping from 100 fps to 60 fps, which shows that, after all, the game doesn't even feel comfortable to play on an RTX card like a 3080.
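For a rough sense of scale: a drop from 100 fps to 60 fps means frame time jumps from 10 ms to ~16.7 ms, which is why it reads as a hitch even though 60 fps on its own would be fine. A quick sketch of that conversion (just illustrative math, not tied to any particular tool):

```python
# Illustrative only: why a sudden fps drop reads as a stutter.
# Frame time in milliseconds is simply 1000 / fps.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

for fps in (100, 60, 45):
    print(f"{fps:>3} fps -> {frame_time_ms(fps):.1f} ms per frame")

# 100 fps -> 10.0 ms, 60 fps -> 16.7 ms, 45 fps -> 22.2 ms.
# A drop from 100 to 60 adds ~6.7 ms to every frame; when that happens
# suddenly it shows up as a visible hitch rather than a lower-but-steady average.
```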

 

 


This is a confusing post. 

 

Partner games are normal. Hilariously, some have even performed worse on partner cards than the competition. Nvidia certainly has had more partner titles, though; they're the ones willing to throw money and, AFAIK, engineering support at developers, and AMD hasn't done so as often.

 

4 minutes ago, fAED said:

I mean, is this the way the gaming industry is going?

Has been. Have you not heard the "can it run Crysis" memes? And were you not around for Rise of the Tomb Raider? That was a similarly tough game to run at max settings on the hardware available when it released.

 

Just drop settings if you can't run the game. I use high/medium/low very often since I usually run stuff at 4K on just a 2060 Super, and it looks fine. Currently I'm playing everything at low settings at 1080p on a GTX 780; stuff like Fallout 76 looks rather potato, but other older AAA titles (Destiny 2 in my case) run fine and look fine. The insistence that you need to run high/ultra settings to enjoy a game is pretty goofy.

Intel HEDT and Server platform enthusiasts: Intel HEDT Xeon/i7 Megathread 

 

Main PC 

CPU: i9 7980XE @4.5GHz/1.22v/-2 AVX offset 

Cooler: EKWB Supremacy Block - custom loop w/360mm +280mm rads 

Motherboard: EVGA X299 Dark 

RAM: 4x8GB HyperX Predator DDR4 @3200MHz CL16 

GPU: Nvidia FE 2060 Super/Corsair HydroX 2070 FE block 

Storage:  1TB MP34 + 1TB 970 Evo + 500GB Atom30 + 250GB 960 Evo 

Optical Drives: LG WH14NS40 

PSU: EVGA 1600W T2 

Case & Fans: Corsair 750D Airflow - 3x Noctua iPPC NF-F12 + 4x Noctua iPPC NF-A14 PWM 

OS: Windows 11

 

Display: LG 27UK650-W (4K 60Hz IPS panel)

Mouse: EVGA X17

Keyboard: Corsair K55 RGB

 

Mobile/Work Devices: 2020 M1 MacBook Air (work computer) - iPhone 13 Pro Max - Apple Watch S3

 

Other Misc Devices: iPod Video (Gen 5.5E, 128GB SD card swap, running Rockbox), Nintendo Switch


33 minutes ago, fAED said:

A 6600 XT on ULTRA settings at 1920x1080 will get you around 45 fps with a 5900X and 32 GB at 3200 MHz.

This sounds playable? What's wrong? Besides, don't you want the gaming industry to keep moving forward?


Playing great on my 7900x/3090 rig on High settings with DLSS set to Quality. (3440x1440)

 

I'll install it on my 5800x/RX6600 rig now and play around with it. If it doesn't support FSR (I hadn't checked), I'll just throw on RSR and see how it goes. My experience with RSR has been pretty great. (2560x1600)


3 hours ago, fAED said:

I saw some YT videos and they're disgusting to watch... Apparently NVIDIA has a partnership, so the game is optimized for Nvidia, but that's totally fine; AMD also has its "own" games that are better optimized for its GPUs.

 

At the recommended system requirements:

  1. Nvidia GPU: RTX 3070
  2. AMD GPU: RX 6800 XT

- The game actually requires you to activate DLSS in order to play it with 'good fps' (there is no FSR), and it's probably forcing ray tracing on you too.

- A 6600 XT on ULTRA settings at 1920x1080 will get you around 45 fps with a 5900X and 32 GB at 3200 MHz.

 

- I mean, is this the way the gaming industry is going? Shouldn't everyone be able to play the game at max settings with at least 60 fps on a card like the 6600 XT? The game looks awesome, but I hope developers really think about the gaming community, because not everyone can afford the latest-gen GPUs on the market.

The Nvidia 4090 marketing team is using this game to show off its power, and it seems obvious that these types of games (AAA) are chasing the FPS numbers that the newest cards can bring you, rather than actually being playable on lower tiers of GPUs.

 

Comments even say that a 3080 with DLSS turned off gets stutters, with the game dropping from 100 fps to 60 fps, which shows that, after all, the game doesn't even feel comfortable to play on an RTX card like a 3080.

 

 

Watched my friend play it with a 5900X + 3080 12GB at 1920x1080, High settings.
It was funny.

FPS was a roller coaster.
GPU utilization can also be somewhat weird sometimes: in one particular chapter his fps suddenly tanked to 35-45 with GPU util staying at 30-40%, but if he opened the menu, GPU util went up to 70-80%. The chapter didn't even have any rats around.

CPU util was like 20-30% the whole time.

Thermals were fine IIRC; his GPU stayed at around 70°C max, and the CPU was at 60-70°C.

 

I also see lots of complaints about pretty much the same thing on their Steam discussion page.
So yeah, I think the game's engine is quite wonky.

Whether or not there's some shady shit happening behind the scenes that involves Nvidia, idk.
I just hope this doesn't push the devs into obscurity, since I quite enjoyed their first game. But yep, gonna wait for more patches before I play the 2nd one.

There is approximately 99% chance I edited my post

Refresh before you reply

__________________________________________

ENGLISH IS NOT MY NATIVE LANGUAGE, NOT EVEN 2ND LANGUAGE. PLEASE FORGIVE ME FOR ANY CONFUSION AND/OR MISUNDERSTANDING THAT MAY HAPPEN BECAUSE OF IT.


That's odd, my FPS hovered between 90 and 110 @ 3440x1440 with DLSS on Quality. 

 

Anyway, I just did some testing on my 5800x/RX6600, and with no RSR, performance was kind of crap at 2560x1600 with FPS in the 20s, as expected, since it's not really a 1440p/1600p-class GPU. Enabling RSR to render at 1920x1200 upscaled to 2560x1600 with Medium settings, I was playing smoothly between 45 and 55 fps. Easily playable for a game like this on a relatively cheap GPU, even more so if you play with a controller as I do for games like this.
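If anyone wonders why dropping the internal resolution helps that much: at 1920x1200 the GPU only shades about 56% of the pixels it would at 2560x1600, and RSR scales the result back up to native. Rough back-of-the-envelope math (illustrative only; shading cost scales only roughly with pixel count):

```python
# Back-of-the-envelope pixel math for upscaling (RSR here, same idea as FSR/DLSS):
# render at a lower internal resolution, then scale the image up to the display.
native = (2560, 1600)
internal = (1920, 1200)

native_px = native[0] * native[1]        # 4,096,000 pixels
internal_px = internal[0] * internal[1]  # 2,304,000 pixels

print(f"Rendered pixels: {internal_px / native_px:.0%} of native")  # ~56%
print(f"Per-axis scale factor: {internal[0] / native[0]:.2f}x")     # 0.75x

# Roughly 44% fewer pixels to shade per frame, which lines up with the RX 6600
# going from the 20s at native 2560x1600 to a steady 45-55 fps with RSR on.
```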


10 hours ago, Poinkachu said:

Watched my friend play it with a 5900X + 3080 12GB at 1920x1080, High settings.
It was funny.

FPS was a roller coaster.
GPU utilization can also be somewhat weird sometimes: in one particular chapter his fps suddenly tanked to 35-45 with GPU util staying at 30-40%, but if he opened the menu, GPU util went up to 70-80%. The chapter didn't even have any rats around.

CPU util was like 20-30% the whole time.

Thermals were fine IIRC; his GPU stayed at around 70°C max, and the CPU was at 60-70°C.

 

I also see lots of complaints about pretty much the same thing on their Steam discussion page.
So yeah, I think the game's engine is quite wonky.

Whether or not there's some shady shit happening behind the scenes that involves Nvidia, idk.
I just hope this doesn't push the devs into obscurity, since I quite enjoyed their first game. But yep, gonna wait for more patches before I play the 2nd one.

That's exactly what I meant. I mean, the game looks awesome, but yeah, it was just too strange that the game stutters even on a 3080 with DLSS off.

 

What about GTA VI? How playable will it be on DDR4-era hardware...


The claim that it is optimized for Nvidia is highly debatable. It may be so for new Nvidia and AMD cards.

But the fact is that the most popular gaming GPU is still the Nvidia GTX 1060 6GB.

Let's take a look at the performance: 

 

Then let's compare the performance of the currently most popular Nvidia GPU with the RX 570, which has always been significantly cheaper than the GTX 1060 6GB:

 

And if you say that the newer AMD cards are slow in this game, that also needs some nuance:

Q: From the reviews I've seen the game runs terrible on PCs better than what this is recorded with. Was there a patch that fixed performance or does this just run better with proton than native windows?

A: I did not test Windows, but I am using a mix of High/medium settings and low drawing distance, as it kills performance

Q: The numbers you see on VKD3D with that hardware are awesome! A comparison to Windows would be very interesting.

OS: FreeBSD 13.3  WM: bspwm  Hardware: Intel 12600KF -- Kingston dual-channel CL36 @6200 -- Sapphire RX 7600 -- BIOSTAR B760MZ-E PRO -- Antec P6 -- Xilence XP550 -- ARCTIC i35 -- EVO 850 500GB


 Perhaps it's just a heavy game to run (and it does seem to have its share of performance/optimisation issues)?

On 10/20/2022 at 10:48 PM, fAED said:

- The game actually requires you to activate DLSS in order to play it with 'good fps' (there is no FSR), and it's probably forcing ray tracing on you too.

Does it force you to enable DLSS in the menu? Do they force ray tracing on? Is this actually confirmed, or is it just bashing a game that performs best with DLSS on? DLSS and RTX were a "sequence break" for GPUs: a new technology came out that made the generations before it instantly "obsolete". That happens.

On 10/20/2022 at 10:48 PM, fAED said:

- A 6600 XT on ULTRA settings at 1920x1080 will get you around 45 fps with a 5900X and 32 GB at 3200 MHz.

What's wrong with that? The recommended specs clearly state that for 60 FPS at Ultra the recommendation is a 6800 XT, not a 6600 XT. 45 FPS is also a totally playable frame rate.

On 10/20/2022 at 10:48 PM, fAED said:

Shouldn't everyone be able to play the game at max settings with at least 60 fps on a card like the 6600 XT? The game looks awesome, but I hope developers really think about the gaming community, because not everyone can afford the latest-gen GPUs on the market.

Generally, no. Why should they? That's why graphics settings exist. Max settings should mean max hardware; otherwise, if 3060/6600-level hardware could run max settings at high FPS, what would be the point of max settings or max hardware existing at all? For mid-tier cards you should expect mid-tier settings and mid-tier FPS. It's hard to say that X hardware should reach Y FPS at Z settings: some prefer higher FPS at lower quality, others sacrifice FPS for eye candy, and others want a balance. Different types of games will also have different targets.

On 10/20/2022 at 10:48 PM, fAED said:

The Nvidia 4090 marketing team is using this game to show off its power, and it seems obvious that these types of games (AAA) are chasing the FPS numbers that the newest cards can bring you, rather than actually being playable on lower tiers of GPUs.

Which is honestly how the industry moves forward. If last-gen or low-tier cards could still run games maxed out at high FPS, there would be no need or push to develop better GPUs. If there are no better GPUs, there's no push for games to expand the boundaries. 45 FPS is playable.

Crystal: CPU: i7 7700K | Motherboard: Asus ROG Strix Z270F | RAM: GSkill 16 GB@3200MHz | GPU: Nvidia GTX 1080 Ti FE | Case: Corsair Crystal 570X (black) | PSU: EVGA Supernova G2 1000W | Monitor: Asus VG248QE 24"

Laptop: Dell XPS 13 9370 | CPU: i5 10510U | RAM: 16 GB

Server: CPU: i5 4690k | RAM: 16 GB | Case: Corsair Graphite 760T White | Storage: 19 TB


The game's optimization is trash. At least at launch, it hammered one core while the rest of the CPU basically did nothing.
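If you want to sanity-check that kind of single-core/main-thread bottleneck yourself, watching per-core load while the game runs is usually enough. Here's a minimal sketch using psutil (my own throwaway script; the thresholds are just a rule of thumb, nothing official):

```python
# Minimal sketch: sample per-core CPU load to spot a single-core (main-thread) bottleneck.
# If one core sits near 100% while the average stays low and the GPU is underutilized,
# the game is most likely CPU/main-thread bound rather than GPU bound.
import psutil  # pip install psutil

for _ in range(10):  # sample for ~10 seconds while the game is running
    per_core = psutil.cpu_percent(interval=1.0, percpu=True)
    hottest = max(per_core)
    average = sum(per_core) / len(per_core)
    print(f"hottest core: {hottest:5.1f}%   average: {average:5.1f}%")
    if hottest > 90 and average < 40:
        print("  -> looks like a single-core bottleneck")
```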


To be fair, the game probably has some of the best graphics of any PC game, and it's locked at 30 FPS on PS5. The real FPS drain seems to be all those damn rats! I was watching a guy with an RTX 4090 do a playthrough with no DLSS, and even with a 4090 he was dropping under 60 FPS (at 4K resolution) whenever those mobs of like 1000 rats would spawn. Personally, I'm fine with the performance considering how amazing the graphics are. I get more displeased when games nerf graphics for people on last-gen consoles and stuff like that.

