Do you think current gen. consoles will eventually go back to the dark ages of 30FPS?

Man

It seems like both the PS5 and Xbox Series X are struggling to maintain 60FPS in some games even in 'performance mode'. The most glaring example is Guardians of the Galaxy, which often dips all the way down to the mid-40s on both the XSX and PS5. It's kind of ridiculous considering both consoles are just a little over a year old at this stage, and I suspect that performance will drop all the way down to 30FPS in 'true' next-gen titles.

 

Personally, I think they should offer at least a 40FPS performance mode à la Ratchet & Clank's 120Hz mode, which runs at 40FPS (1/3 vsync). They could also harness the power of VRR and offer games with a 48FPS cap: practically all VRR displays have a minimum range of at least 48Hz, and the cap would take 20% of the load off the GPU while looking more or less similar to 60Hz.
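For reference, here's the back-of-envelope arithmetic behind those two modes (my own sketch, not anything official):

```python
# Frame-time arithmetic behind the proposed performance modes.
# 40FPS paces perfectly on a 120Hz display because 120 / 40 = 3:
# every frame is held for exactly three refresh cycles (1/3 vsync).

def frame_time_ms(fps: float) -> float:
    """How long each frame stays on screen, in milliseconds."""
    return 1000.0 / fps

for fps in (30, 40, 48, 60):
    print(f"{fps:>2} FPS -> {frame_time_ms(fps):5.2f} ms per frame")

# The 48FPS VRR cap: rendering 48 instead of 60 frames per second
# means 20% fewer frames, i.e. roughly 20% less GPU load.
saving = 1 - 48 / 60
print(f"48FPS cap: {saving:.0%} fewer frames than 60FPS")
```

The "20% load off the GPU" figure in the post falls straight out of the 48/60 ratio.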

 

30FPS shouldn't be the only option available to console users, as it looks absolutely horrid.

 

 


The original Halo was 30fps, and I wouldn't consider Halo to be unpopular or the dark ages.

 

That being said, if 30fps is "horrid" then I would highly recommend not watching any movies or TV shows that run at that frame rate.


They probably won't go back to a 30fps standard, but they will surely include the option. I feel like they will mostly try to hit 60fps with upscaling, and it won't be a problem later on as the technology matures. Consoles evolve, and so will frame rates.

 

 


7 hours ago, emosun said:

The original Halo was 30fps, and I wouldn't consider Halo to be unpopular or the dark ages.

 

That being said, if 30fps is "horrid" then I would highly recommend not watching any movies or TV shows that run at that frame rate.

TV shows and movies at 24fps are fine because you're not feeling any latency. It's not the visual smoothness that makes 30fps gaming bad; it's the feel, that almost-"lag" you notice when you switch from a 60fps mode to a 30fps mode.

Also, when making a movie, they typically account for the 24fps presentation and try to avoid anything that would cause obvious stutter. There are examples where no thought was put into this, though.

 


10 hours ago, emosun said:

That being said if 30fps is "horrid" then i would highly recommend not watching any movies or tv shows or series that were at that framerate.

Your argument is moot at best and... well, just plain ridiculous at worst!

Film makers are sticking with ~24FPS solely because of production budget and/or the limited space on Blu-ray discs. And let's not forget the VHS tapes that universally ran at 24FPS: you'd need well over twice as much tape, and a VCR with heavy-duty internals and much tighter tolerances, for 60Hz content. It just wasn't commercially feasible in the 70s and 80s.

And now 24FPS has become sort of a tradition, even though it has no place in the modern world, as far as I'm concerned at least.

11 hours ago, mrekk said:

they probably wont go back to 30fps standard but they will sure include the option. i feel like they will mostly try to get 60fps with upscaling and wont be a problem later on in technology. consoles evolve and so will framerate

That's a possibility, as RDNA2 can apparently run Intel's upcoming XeSS thanks to supporting the DP4a instruction set.

Still, it's a shame that AMD just sat on its hands and did absolutely nothing to counter DLSS. And no, FSR is just a glorified spatial upscaler à la Lanczos, which has got nothing on DLSS. Personally, I think temporal upscaling is the future, and I sure hope XeSS makes its way into current-gen consoles.

Hopefully, the 'enhanced versions' of the current-gen consoles (PS5 Pro?) will either come with Nvidia GPUs, or AMD will bake support for Intel's allegedly open-source XeSS upscaling into a future RDNA architecture.


10 hours ago, Man said:

 

I like that my job is video and film restoration, and that I specialize in magnetic tape restoration, yet I'm still being told what frame rate is.

I'd love to retort but I'm literally not allowed to. I'll sum it up with this: if you cannot play a game at 30fps, then don't.


14 hours ago, Colty said:

It's not the visual smoothness that makes 30fps gaming bad; it's the feel, that almost-"lag" you notice when you switch from a 60fps mode to a 30fps mode.

And here I was, playing Halo for 21 years with lag, without even noticing it.

 

All jokes aside, I don't mind 30fps and neither do a lot of people. I'm sure if I played some sort of competitive online shooter where I had to do the stupid door-gap gimmick test to sell sponsored products for content, I might care. Otherwise, no, I don't.


22 hours ago, Man said:

It's kind of ridiculous considering both consoles are just a little over a year old at this stage, and I suspect that performance will drop all the way down to 30FPS in 'true' next-gen titles.

Even though this generation is over a year old, we've had a global pandemic for the entirety of it, compounded by extremely high demand causing shortages. Game developers are still making games for the previous generation, because most people still play on it, and just upping the render resolution and frame rate on current gen in patches. GotG and Ratchet & Clank are still what I'd consider release-day games on current gen, despite their respective release dates: development of those games started long before PS5 and XSX devkits were in devs' hands. Release-day games are rarely, if ever, representative of an entire generation's potential. It's going to take 2-4 years at least, given how long game development is nowadays, to see that potential.

 

22 hours ago, Man said:

30FPS shouldn't be the only option available to console users, as it looks absolutely horrid.

The overwhelming majority of console players don't care about frame rate and leave games on the quality preset, if they touch the settings at all.



2 hours ago, emosun said:

I like that my job is video and film restoration, and that I specialize in magnetic tape restoration, yet I'm still being told what frame rate is.

I'd love to retort but I'm literally not allowed to. I'll sum it up with this: if you cannot play a game at 30fps, then don't.

Well, I didn't mean to be disrespectful, and I think it was a fair counterargument.

 

It's not like I can't play at 30FPS; I absolutely can. In fact, I consider 30FPS to be the minimum playable frame rate, so I was by no means dismissing it completely or proclaiming it unplayable like a typical PCMR character! It's just that... well, I can only show you:

 

UFO Test: Frame Rate Versus (testufo.com)

 

If you view it at 60Hz, you can easily notice a slight stutter which I, for one, can barely tolerate. Even 40Hz feels remarkably smoother than 30FPS, even though we're only talking about a mere 10FPS bump, albeit a rather hefty 8.33ms reduction in frame time (33.33ms → 25ms). I have a VRR display that bottoms out at 40Hz, and the difference between 40 and 30 is simply night and day.
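That 8.33ms figure isn't a coincidence; frame time falls off non-linearly with frame rate, so the "small" 30 → 40 step saves exactly as much time per frame as the much larger-sounding 40 → 60 step. A quick sketch of the arithmetic (my own illustration, not from the UFO test):

```python
# Frame time shrinks non-linearly with FPS: going from 30 to 40 FPS
# saves the same 8.33ms per frame as going from 40 to 60 FPS does.

def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

delta_30_40 = frame_time_ms(30) - frame_time_ms(40)
delta_40_60 = frame_time_ms(40) - frame_time_ms(60)
print(f"30 -> 40 FPS saves {delta_30_40:.2f} ms per frame")
print(f"40 -> 60 FPS saves {delta_40_60:.2f} ms per frame")
```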

 

Personally, I think 40FPS should be the "new 30FPS" for console users with 120Hz displays.


2 hours ago, emosun said:

I like that my job is video and film restoration, and that I specialize in magnetic tape restoration, yet I'm still being told what frame rate is.

I'd love to retort but I'm literally not allowed to. I'll sum it up with this: if you cannot play a game at 30fps, then don't.

Oh, and since we're on the topic: 24FPS on a 60Hz display feels absolutely horrid thanks to all that judder:

 

UFO Test: Frame Rate Versus (testufo.com)

 

Just saying!
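The judder in question is just 3:2 pulldown: 60 doesn't divide evenly by 24, so frames get held for alternating numbers of refresh cycles. A small sketch of the effect (my own illustration; `pulldown_pattern` is a made-up helper, not a real tool):

```python
import math

# 24FPS content on a 60Hz display can't divide evenly (60 / 24 = 2.5),
# so frames are alternately held for 2 and 3 refresh cycles ("3:2
# pulldown"). The uneven hold times are the judder you perceive.

def pulldown_pattern(content_fps: int, refresh_hz: int, frames: int = 6):
    """Refresh cycles each of the first `frames` content frames is held."""
    step = refresh_hz / content_fps
    return [math.floor((i + 1) * step) - math.floor(i * step)
            for i in range(frames)]

print(pulldown_pattern(24, 60))    # uneven 2,3,2,3,... -> judder
print(pulldown_pattern(30, 60))    # even 2,2,2,... -> no judder
print(pulldown_pattern(24, 120))   # even 5,5,5,... -> why 120Hz helps 24fps
print(pulldown_pattern(40, 120))   # even 3,3,3,... -> the Ratchet & Clank mode
```

Note that 120Hz fixes the problem for both 24FPS video and 40FPS games, since 120 divides evenly by both.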


Guardians of the Galaxy just has kind of shitty optimization in performance mode; otherwise, everything else I've played on PS5 holds 60fps pretty well, unless it's a PS4 game with no frame-rate-unlock patch, like Yakuza 6.


On 1/29/2022 at 12:00 AM, Man said:

Your argument is moot at best and... well, just plain ridiculous at worst!

Film makers are sticking with ~24FPS solely because of production budget and/or the limited space on Blu-ray discs. And let's not forget the VHS tapes that universally ran at 24FPS: you'd need well over twice as much tape, and a VCR with heavy-duty internals and much tighter tolerances, for 60Hz content. It just wasn't commercially feasible in the 70s and 80s.

And now 24FPS has become sort of a tradition, even though it has no place in the modern world, as far as I'm concerned at least.

Blu-rays have no issues with storage: dual-layer discs are 50GB, and you could easily fit a 48/60fps movie on one. Maybe it would need a bit more compression, but remember, original Blu-rays are H.264 while UHDs are H.265, and you can get Blu-ray discs with 128GB of space. Plus, if they just shipped a movie disc and a bonus-features disc, you could pack a lot on there. There are 4K/60fps UHD movie releases. They're not "good movies", but go ahead: buy Gemini Man or Billy Lynn's Long Halftime Walk and tell me what you think of the 60fps presentation. Because I've seen Gemini Man in 60fps. It didn't help.
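The capacity math backs this up. A rough sketch; the runtime, bitrate, and 1.5× frame-rate overhead factor are my own assumptions, not figures from the thread:

```python
# Rough check of whether a 60fps film fits on a 50GB dual-layer Blu-ray.
# Assumptions (mine): a 2-hour film, H.264 at ~30 Mbps for 24fps, and a
# guess that 2.5x the frame rate costs ~1.5x the bitrate (inter-frame
# compression means cost doesn't scale linearly with frame count).

runtime_s = 2 * 3600                              # 2-hour film
bitrate_bps = 30e6                                # ~30 Mbps H.264 at 24fps
size_24_gb = bitrate_bps / 8 * runtime_s / 1e9    # bytes -> GB
size_60_gb = size_24_gb * 1.5                     # assumed 60fps overhead

print(f"24fps encode: ~{size_24_gb:.1f} GB")
print(f"60fps encode: ~{size_60_gb:.1f} GB (fits on a 50GB dual-layer disc)")
```

Under these assumptions a 60fps encode lands around 40GB, comfortably inside a standard dual-layer disc, let alone a 100/128GB UHD one.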

So now that we're past the "Blu-ray is too small" thing...
They still shoot at 24fps, and we're still used to it. I take a moment to adjust when watching 60fps IRL footage on YouTube. And in film, they're still using 35mm to record; most cinema content is not shot digitally. Yes, they could increase the amount of film used; we already had that with The Hobbit and Gemini Man. But half of the people who saw it said, "Yeah, it was kind of cool," and the other half said it was very distracting. There's no way they made their money back on it, between effectively having an additional release and going through so much more material...

The other thing, at least for me, a film nerd who cares far too much about small details: it's a better style. There's something fashionable and cool about it. To me, it highlights that this is "cinema." That's not organic; it comes from the experiences I've had. But it's a tradition I appreciate, and, based on the Hobbit/Gemini Man response, I'm not alone.

It's like film grain. It's a style that says "Hollywood" to me. No, it's not perfect; it's literally film grain, defects. But if you digitally remove it in post, or use your TV's options to remove it, something is lost. And it goes further than altering the image and getting a smeary mess; even when done right, it takes something away. Hell, the "4K77" release of Star Wars looks INCREDIBLE. With all of its film grain, burnt cells, and little quirks, it feels authentic. It feels more real.


  • 2 weeks later...
On 1/29/2022 at 3:19 AM, Colty said:

TV shows and movies at 24fps are fine because you're not feeling any latency

I feel this is all really subjective. I strongly dislike movies and videos below 60fps; they feel like stop motion, really. Conversely, I don't feel any "latency" while playing games as long as the frame rate is in sync with the refresh rate, so it's exactly the same as video, actually.

 

I can feel input lag, though, which has nothing to do with "latency" or frame rate.

 

Quote

Input lag is the amount of time it takes for your TV to display a signal on the screen from when the source sends it. It's especially important for playing reaction-based video games because you want the lowest input lag possible for a responsive gaming experience. Having low input lag tends to come at the cost of less image processing on TVs, which is why there are specific Game Modes for low input lag, and even though TVs aren't as good as monitors in this regard, technology is slowly catching up.

We measure the input lag using a specialized tool, and we test for it at different resolutions using different settings.

 

As you can see, input lag is mostly influenced by the screen being used; frame rates play a negligible role, if any (hence they're not even mentioned).

 



I think people who go on and on about 30fps like it's a crime against humanity are being ridiculous. Go and watch the latest Digital Foundry video on Horizon: Forbidden West.

