Another day, another awful console-to-PC port release (The Last of Us: Part 1)

35 minutes ago, Dracarris said:

yeah, if you use motion blur to smear away any stuttering, sure.

Sure, but it's also important to factor in the type of game. I also turn motion blur effects off, so the only blur possible is from low fps, and it's actually not that bad.

 

This is a third-person shooter/exploration action game, and on almost everything, using the recommended settings, it's averaging 50 fps and above. It's fine; people are losing it over the wrong thing.

 

Criticize the game for the horrific "building shaders" step or the constant crashing, and question why it needs so much VRAM. But if you have any semi-recent GPU, render at 1080p, and don't insist on running only on Ultra, performance is more than acceptable. Anyone saying otherwise has NOT looked at actual performance benchmarks, like at all.

 

Played it on PS3 at 30 FPS; anyone saying the game isn't enjoyable like that is soooooo out of touch.

 

Low FPS != stutter. Stutter is stutter, please be correct.


13 hours ago, starsmine said:

So... Zen 2 with RDNA 2 can run the game on PS5 at 1080p with high textures at well over 60 fps, or at 4K with low textures at just under 60.
COOL

So we are not that far off the PC requirements, which have more OS and driver overhead.

That just sounds like it's plenty optimized when you put it like that, and they did just fine.

Why do people think their PC is so much stronger than the PS5, and that Naughty Dog is a bastard for giving you Ultra, which is significantly higher fidelity, for those who do have stronger PCs?

Imagine being that guy, with a PC as strong as or weaker than a console, complaining they cannot play a console game at console settings.

I agree that you cannot expect every computer to run Ultra quality, even at 1080p. When the 3070 launched, it was pointed out by many that 8 GB of VRAM might be problematic in the near future. And there you have it - GPUs being limited by VRAM instead of their compute power. The 3070 is still the current-gen performance-class GPU. So people should direct their disappointment towards Nvidia.

However, the game is/was plagued by other problems, and this does often shift the blame to the publisher, in many cases rightfully so. It's lucky for Nvidia that the Ultra setting isn't the only problem.


8 minutes ago, leadeater said:

Anyone saying otherwise has NOT looked at actual performance benchmarks, like at all.

I agree. Hardware Unboxed sums it up pretty well.

 

15 minutes ago, leadeater said:

Played it on PS3 at 30 FPS; anyone saying the game isn't enjoyable like that is soooooo out of touch.

I disagree. I have enjoyed many games in my life despite being limited to a stuttering mess, but not anymore. Things I did not notice in the past are now unbearable, like a mouse with high input latency. That doesn't take the enjoyment away from you, but you should not expect everybody to feel the same way.


9 minutes ago, HenrySalayne said:

I disagree. I have enjoyed many games in my life despite being limited to a stuttering mess, but not anymore. Things I did not notice in the past are now unbearable, like a mouse with high input latency. That doesn't take the enjoyment away from you, but you should not expect everybody to feel the same way.

Again, low FPS != stutter. Stutter is a very specific thing. On a pure, flat, unwavering frame time chart at 33 ms, what is this showing? Low FPS or stutter? So then what does stutter look like on a frame time graph?

 

Like sure, the gameplay experience is better if the FPS is higher, but at a locked 30 FPS are you really saying that the majority would be unable to enjoy the game? I don't expect everyone to feel the same way, but I'll point out that it's on balance BS to say it's not enjoyable to play this game at 30 FPS without stutter.
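
To make that concrete, here is a minimal sketch (Python, with made-up frame time numbers) of the difference between a low but steady frame rate and a stuttery one:

```python
# Minimal sketch: low FPS vs stutter, using invented frame time traces (ms).

flat_30fps = [33.3] * 10  # perfectly even 33.3 ms frames -> a steady 30 FPS, no stutter
stuttery = [16.7, 16.7, 16.7, 50.0, 16.7, 16.7, 48.0, 16.7, 16.7, 16.7]  # spikes = stutter

def average_fps(frame_times_ms):
    """Average FPS is the number of frames divided by the total time they took."""
    return 1000 * len(frame_times_ms) / sum(frame_times_ms)

print(average_fps(flat_30fps))  # ~30 FPS, every frame arrives exactly on schedule
print(average_fps(stuttery))    # ~43 FPS on average, yet the 50 ms spikes read as stutter
```

The flat trace is the "low FPS" case; only the uneven one is stutter.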


3 hours ago, leadeater said:

Again, low FPS != stutter. Stutter is a very specific thing. On a pure, flat, unwavering frame time chart at 33 ms, what is this showing? Low FPS or stutter? So then what does stutter look like on a frame time graph?

 

Like sure, the gameplay experience is better if the FPS is higher, but at a locked 30 FPS are you really saying that the majority would be unable to enjoy the game? I don't expect everyone to feel the same way, but I'll point out that it's on balance BS to say it's not enjoyable to play this game at 30 FPS without stutter.

I know what you mean, but there is a point when low FPS becomes stutter. 10 FPS even with perfect frametimes is stutter. 24 fps (film) with perfect frametimes stutters when the camera pans. The underlying principle is the same. Pictures are so far spaced apart that we can perceive the two different pictures individually.

There is an argument to be made that a perfectly smooth 30 fps is acceptable, but there is still lag and stutter while panning. Performance (or VRAM) limited systems are generally more all over the place with their minimum fps, so they will feel even worse.

 

However, that wasn't my point. I gamed on systems not capable of outputting a smooth 30 fps and I enjoyed it. But that time is long gone and my perception has changed.

 

If I could play The Last of Us Part 1 limited to only (smooth) 30 fps, I would be disappointed.

I played Cyberpunk 2077 in 4K (with DLSS) on a 2070S with ~50 fps lows and that was a great experience. With tweaked settings and 30 fps lows it was not enjoyable. It felt slow, it wasn't responsive. In my experience with story-driven games, the perceived spaghettification begins with frametimes between 50 ms and 25 ms (roughly 20 to 40 fps), and games start to not feel right for me. That's using a mouse and keyboard.

And game controllers generally feel less responsive for me, so 30 fps on TV a few feet away might be perfectly fine.


1 hour ago, HenrySalayne said:

I know what you mean, but there is a point when low FPS becomes stutter. 10 FPS even with perfect frametimes is stutter. 24 fps (film) with perfect frametimes stutters when the camera pans

It's a different thing, but yeah, I know what you mean. Judder is what I think it's called. Also, film (24 Hz) judders when viewed on a non-24 Hz display; in a theatre, or on another display actually running at 24 Hz, the pan judder is either not there or much less noticeable. You get it a lot in really fast pans, but the bad judder you are talking about is attributable to viewing content on screens with a mismatched refresh rate.

 

1 hour ago, HenrySalayne said:

There is an argument to be made that a perfectly smooth 30 fps is acceptable, but there is still lag and stutter while panning. Performance (or VRAM) limited systems are generally more all over the place with their minimum fps, so they will feel even worse.

Even low-power GPUs don't actually suffer more frame variance; in fact, faster ones do. While a GPU might be struggling to put out 30 FPS, it may only minimally drop to 20-24 FPS, which is what this game did originally on the PS3. You can go watch the Digital Foundry performance analysis of the game on PS3 and you'll see that it's very good at maintaining 30 FPS but does drop slightly below that in the most demanding parts; however, that doesn't really impact the visual or gameplay experience. Without the onscreen frame counter and graph you'd be hard pressed to know. Problem is, the video I was watching was only 720p, so detail-wise it was quite bad, making it a little more challenging to do a good assessment overall.

 

Higher-end GPUs tend to have larger and more frequent variance in frame rates and frame times, which can lead to noticing stutters even if the frame time isn't bad in itself, just 3 times slower than the last one, etc. Frame time spikes generally aren't that much of a problem though, since spikes large enough to matter are rare overall and, relative to the total number of frames rendered, tend to be spaced well apart. There have been games known to be really bad for this, though I just forget what they are; Gamers Nexus did an analysis on one of them last year, maybe?
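
As a rough illustration of the "3 times slower than the last one" kind of spike, a small sketch (again with invented numbers) that flags frames relative to their neighbours rather than against an absolute threshold:

```python
# Sketch: flag frame time spikes relative to the previous frame (invented data).

def find_spikes(frame_times_ms, ratio=3.0):
    """Return indices of frames that took at least `ratio` times longer than the previous one."""
    return [i for i in range(1, len(frame_times_ms))
            if frame_times_ms[i] >= ratio * frame_times_ms[i - 1]]

high_end_trace = [8.3, 8.3, 8.3, 30.0, 8.3, 8.3]   # ~120 FPS with one 30 ms hitch
locked_30_trace = [33.3, 33.3, 33.3, 33.3, 33.3]   # locked 30 FPS, nothing stands out

print(find_spikes(high_end_trace))   # [3] -> the 30 ms frame reads as a stutter
print(find_spikes(locked_30_trace))  # []  -> a 30-ish ms frame time is simply the norm here
```

Which is the point: the same 30 ms frame that is invisible at a locked 30 FPS is a very visible hitch in an otherwise fast trace.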

 

1 hour ago, HenrySalayne said:

If I could play The Last of Us Part 1 limited to only (smooth) 30 fps, I would be disappointed.

Sorry, but I just do not believe that you would be unable to enjoy it. If I sat you down in front of the game and you had zero idea what the frame rate was, I'm certain you'd be able to enjoy it. Would you notice it's a bit sluggish? Probably. But the only thing that would prevent you from enjoying it is the immediate dismissal of it, and therefore being overly analytical and critical of it. That's simply a mindset problem, one you've created for yourself.

 

Even then, I'd still bet you'd be able to enjoy the game. Wishing the experience was better isn't the same thing as being disappointed it wasn't better and not enjoying it at all. That's the difference and what I mean. I'm talking about the sheer, absolute, total inability to enjoy the game at 30 FPS in any possible way, and even for you I still say: nah, you'll be able to enjoy it.

 

I totally believe you'd refuse to play it at 30 FPS, sure, that I do believe.

 

1 hour ago, HenrySalayne said:

And game controllers generally feel less responsive for me, so 30 fps on TV a few feet away might be perfectly fine.

This would really be the actual difference, because TVs have bad input latency as well, and the viewing distance and controller vs keyboard/mouse make for a vastly different experience. I still choose a controller for some of my games on PC because it's a better experience. On a console you can't turn game settings like motion blur off; on a PC you can. There are quite a lot of differences overall. That still doesn't take away from the fact that the game, at the 30 FPS it was doing on the PS3, could still be enjoyed. Literally every single review of the game praised its performance and gameplay at the time; nobody was complaining about frame rates.


52 minutes ago, HenrySalayne said:

24 fps (film) with perfect frametimes stutters when the camera pans. The underlying principle is the same. Pictures are so far spaced apart that we can perceive the two different pictures individually.

24 fps film and a 24 fps game aren't exactly the same. In general for film/video, the shutter time should normally be set to half the frame time, unless a different value is used for artistic effect. This gives a natural motion blur and smooths out the view. I suspect this is a thing many games don't get right. Motion blur is often too strong and looks bad, so people turn it off, which then goes too far the other way. At high fps the target blur per frame tends towards zero.
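
As a quick worked example of that half-frame-time rule of thumb (the classic 180-degree shutter), here's the arithmetic for a few frame rates:

```python
# Worked example of the "shutter/blur window = half the frame time" rule of thumb.
# Plain arithmetic, not engine code.

for fps in (24, 30, 60, 120, 240):
    frame_time_ms = 1000 / fps
    blur_window_ms = frame_time_ms / 2  # 180-degree shutter: blur spans half the frame interval
    print(f"{fps:>3} fps: frame {frame_time_ms:5.1f} ms, blur window {blur_window_ms:5.1f} ms")

# 24 fps -> ~20.8 ms of blur per frame; 240 fps -> ~2.1 ms,
# i.e. the per-frame blur a game should add tends towards zero as fps rises.
```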


12 minutes ago, porina said:

24 fps film and a 24 fps game aren't exactly the same. In general for film/video, the shutter time should normally be set to half the frame time, unless a different value is used for artistic effect. This gives a natural motion blur and smooths out the view. I suspect this is a thing many games don't get right. Motion blur is often too strong and looks bad, so people turn it off, which then goes too far the other way. At high fps the target blur per frame tends towards zero.

It's also because most end up viewing on a 60 Hz display, and 24 does not evenly divide into 60.

[Attached image: illustration of 24 fps content displayed on a 60 Hz screen]

 

My projector has a 24 Hz mode so this isn't a problem.

 

Film and game rendering are quite different though, although VRR has made them a little closer. Games don't generally experience these consistent extra frames being displayed on the screen, because the frame rate usually isn't perfectly stable. Cap a game's FPS to 24 and I think the above will happen.

 

The above is why 30 FPS is chosen and not 35, 40, 45, etc. - 30 divides evenly into 60.
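
A small sketch of why the division matters, assuming the simplest possible model where each source frame is held for a whole number of refreshes (no VRR, no pulldown tricks):

```python
# Sketch: how source frames map onto a 60 Hz display when each frame must be
# held for a whole number of refreshes. Simplified model, ignoring VRR.

def hold_counts(source_fps, refresh_hz=60, frames=8):
    """For each of the first `frames` source frames, count how many refreshes show it."""
    counts, shown = [], 0
    for i in range(1, frames + 1):
        next_switch = int(i * refresh_hz / source_fps)  # refresh at which frame i+1 takes over
        counts.append(next_switch - shown)
        shown = next_switch
    return counts

print(hold_counts(24))  # [2, 3, 2, 3, 2, 3, 2, 3] -> uneven cadence, which is seen as judder
print(hold_counts(30))  # [2, 2, 2, 2, 2, 2, 2, 2] -> every frame held equally, no cadence judder
```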


7 minutes ago, leadeater said:

It's also because most end up viewing on a 60 Hz display, and 24 does not evenly divide into 60.

That's a separate problem to the one I was describing. The effect I was talking about would apply to any frame rate, I just followed the 24fps example from earlier. In the case of video it is part of the capture process. In the case of gaming, it is a rendering decision.

 

What you describe is about the display. Remember Taran? He did a great video with examples on that topic not long ago. https://www.youtube.com/watch?v=p3Jb3UPAw-w

 

 


16 minutes ago, porina said:

That's a separate problem to the one I was describing. The effect I was talking about would apply to any frame rate, I just followed the 24fps example from earlier. In the case of video it is part of the capture process. In the case of gaming, it is a rendering decision.

 

What you describe is about the display. Remember Taran? He did a great video with examples on that topic not long ago. https://www.youtube.com/watch?v=p3Jb3UPAw-w

Sorry, I more meant that people notice the judder a lot because of that. Thanks for the video; I know about the shutter-being-half-the-frame-time thing, but not a lot about the why. I'll def watch that video though.

 

Edit:

oh god it's an hour and a half lol. Ok maybe I'll watch some of it and skip through etc 😅

 

Edit 2:

"The iPhone do not disturb feature was more revolutionary than the discovery of penicillin" LOL!


Haven't watched it yet but this just dropped. It's an hour long.

 

Edit: watched it. The tldw would be:

Their "mid spec" system is a 3600 + 2070 Super, which was also chosen to approximate PS5 hardware.
Settings for the above: 1440p DLSS Quality, High overall with Medium textures.

 

Their high-end system (12900K + 4090) did look and behave better, but there were still differences from the PS5 version, even putting aside probable glitches that need to be fixed.

 

On VRAM: the game leaves a portion of VRAM free for the OS/other apps. The reservation seems to scale with the total VRAM on the card and does not consider what is actually being used outside of the game (see the sketch after this summary).

"high" textures on PC comparable to PS5, no further improvement on "ultra".
Medium textures are blurry.

They considered more than 8 GB required for High textures. HUB benchmarks didn't reveal problems with "High" at 8 GB; it may depend on where they tested.

 

The game is CPU intensive, and the CPU is as much of a limiter as the GPU. Their testing suggests the 3600 is not going to deliver a consistent 60 fps when the GPU is not limiting.

Game loads data for new areas as you advance, resulting in lower perf while it does so

They commented that the game files appear to include a library to handle PS5 textures. Have to wonder if this is less optimal than one made for PC.

A post-launch patch has reduced shader compilation time.
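
On the VRAM point above: purely as an illustration of the behaviour described (a reservation that scales with the card's total VRAM rather than with what is actually free), a hypothetical sketch. The function name and the 20% figure are invented for the example, not the game's actual logic:

```python
# Hypothetical illustration only: a VRAM budget that reserves a fixed share of the
# card's total memory for the OS/other apps, regardless of what is actually in use.
# The name and the 20% figure are invented, not taken from the game.

def usable_budget_gb(total_vram_gb, reserved_fraction=0.2):
    """Budget left for the game scales only with the card's total VRAM."""
    return total_vram_gb * (1 - reserved_fraction)

for card_vram in (8, 12, 16, 24):
    print(f"{card_vram} GB card -> ~{usable_budget_gb(card_vram):.1f} GB budget for the game")
# An 8 GB card always gets the smallest budget, even if nothing else is using VRAM at the
# time, which matches the "does not consider what is actually being used" observation.
```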

 

 

Edit 19 May 2023

 

I didn't want to bump this thread, but I thought the following was worth adding to it.

 

Recent patches have increased texture quality at lower settings and also reduced VRAM usage. You can run High textures with an 8 GB GPU now.

