
A Developer's Defense Of 30 Frames Per Second

Misanthrope

1) Sorry, it was a bad example. I know you can't use three Titan Z's, but I was just trying to make the point that you need something insanely powerful and far too expensive just to play a game at such a high resolution and frame rate.

 

2) Yes, I have. If you can't tell, I am primarily a console gamer. I'll probably get flak for that, since I'm on a message board for PC users, but I use a PC as well and I build systems for myself and for friends, so I do have some sense of what I'm talking about. (Course, I've yet to have a friend with the budget for a really high-end system; most are mid-range, $1000 builds.) That said, most of the games I play on consoles are third-person action games. I recently finished Infamous: Second Son. It runs at 1080p, locked at 30fps, but I have no complaints; whatever they did with the visuals, it FEELS like it runs at a higher frame rate. It's silky smooth and, quite frankly, the best-looking console game I've ever seen. Period. The gameplay is nowhere near clunky or uncomfortable. Granted, it's not a game where aiming is important: sure, you can shoot projectiles, but most of them hit a large area, so you don't need to be all that accurate. I don't play first-person shooters, but for those games, yes, I do understand the need for a smoother, higher frame rate.

 

3) Maybe it's just me and the games that I play, or that my PCs tend to be mid-range systems, but usually when I play, if I can get a frame rate of around 30-40fps, I'm happy. Hell, I don't even pull up a frame counter most of the time; I just adjust my settings until things look smooth to me. I guess I shouldn't even be in this discussion, since I honestly don't place that much importance on frame rate in the games I play, as long as they're at least 30fps. (And yes, I've played my fair share of games at 60fps, but it really doesn't make that much difference to me.)

 

Basically what I'm saying is: yes, while the majority of PC gamers strive for 60fps in the types of games they play, there are some of us who don't place that much importance on it. Sure, we're a minority for a lot of fast-paced shooters and whatnot. And of course, until I get my hands on The Order, I can't say whether or not their decision to stick with 30fps is ultimately good or bad from a gameplay standpoint. But as of now, for the type of gameplay it appears to have, it really wouldn't benefit that much from a higher frame rate in terms of playability. The weapons look very broad-ranging for the most part (as in, you don't have to be that accurate when you blast a huge radius of shrapnel at your opponents!); I've yet to see whether there are any longer-range rifles or anything that demands more accuracy against opponents at a distance. Time will tell, though.

Any and every game will always be better at 60fps than at 30fps; the only exception is if your animation is in the style of a TV show, such as South Park: The Stick of Truth. For any other game it will make a difference. It might feel OK, but if the game were running at 60fps it would feel better; that's not a matter of preference, it's fact. I'm not very good at examples, but it's like saying £1000 is better than £500: the more the better. If it helps, go and play Spec Ops at both 30fps and 60fps and tell me it was just as enjoyable at 30fps as it was at 60fps, because it simply isn't.

I do understand why they had to make the game run at 30fps, but they shouldn't be making it out like it was a design decision. Simply put, they had to trade off between frame rate, resolution and graphical settings, and they chose resolution and graphical settings while sacrificing the frame rate. If they had a real choice, or the game were made on PC, they could easily leave the frame rate (or any of the other options) unlocked. They could just leave it open for people to decide what they want more, or have everything running on Ultra.

 (\__/)

 (='.'=)

(")_(")  GTX 1070 5820K 500GB Samsung EVO SSD 1TB WD Green 16GB of RAM Corsair 540 Air Black EVGA Supernova 750W Gold  Logitech G502 Fiio E10 Wharfedale Diamond 220 Yamaha A-S501 Lian Li Fan Controller NHD-15 KBTalking Keyboard


snip.

I'm thankful for your honesty. Lower frame rates may be irritating at first, but after a short period of time the brain gets used to the less fluid look of a game. Believe me, I've tested it; after a short while it doesn't bother me that much anymore. I've tried frame rates from 30 up to 120, and although the aesthetics changed and everything looked a lot smoother, it didn't bother me much when playing with a controller (tried it out with Battlefield 4, no motion blur enabled).

In the gaming business, most devs go for eye candy because it sells better than a game that looks worse but has a much higher frame rate. Also, I guess they lock the frame rate in favor of an experience that's rock solid, rather than a fluctuating refresh rate or a cap somewhere between 30 and 60. There's nothing that bothers me more than a fluctuating refresh rate; it's just awful. Instead of a capped 60 that regularly drops into the 40s, I'd rather have a 30 that is always 30.
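For what it's worth, a 30fps lock usually boils down to a pacing loop: render the frame, then wait out whatever is left of a fixed ~33.3 ms budget so frames come out on a steady cadence. Here's a minimal sketch of the idea in Python; update_game and render_frame are placeholder callbacks for illustration, not any real engine's API.

import time

TARGET_FPS = 30
FRAME_BUDGET = 1.0 / TARGET_FPS  # ~33.3 ms per frame

def run_locked_loop(update_game, render_frame, frames=300):
    """Simulate and render, then sleep off whatever is left of the
    33.3 ms budget, so frames arrive at a steady 30 per second
    instead of fluctuating between 40 and 60."""
    for _ in range(frames):
        start = time.perf_counter()
        update_game(FRAME_BUDGET)            # fixed timestep keeps the simulation consistent
        render_frame()
        elapsed = time.perf_counter() - start
        if elapsed < FRAME_BUDGET:
            time.sleep(FRAME_BUDGET - elapsed)   # pad the frame out to the full budget
        # if elapsed > FRAME_BUDGET the frame simply arrives late;
        # a real engine would have to drop or stretch the next one

# toy usage with do-nothing callbacks:
# run_locked_loop(lambda dt: None, lambda: None)

The fixed timestep is part of why a hard lock feels so predictable: every frame represents the same slice of game time.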

 

Even if the developers had more horsepower, I guess they'd go for 1080p/30; it's all about graphical fidelity, after all. The PS4 can't be too weak if you look at games like Driveclub: it just eats everything in terms of graphics and looks quite smooth for its low frame rate.

144Hz goodness


In the gaming business, most devs go for eye candy because it sells better than a game that looks worse but has a much higher frame rate. Also, I guess they lock the frame rate in favor of an experience that's rock solid, rather than a fluctuating refresh rate or a cap somewhere between 30 and 60. There's nothing that bothers me more than a fluctuating refresh rate; it's just awful. Instead of a capped 60 that regularly drops into the 40s, I'd rather have a 30 that is always 30.

 

Even if the developers had more horsepower, I guess they'd go for 1080p/30; it's all about graphical fidelity, after all. The PS4 can't be too weak if you look at games like Driveclub: it just eats everything in terms of graphics and looks quite smooth for its low frame rate.

You have a very good point there. Most developers would go for more eye candy and sacrifice frame rate and/or resolution. (As an FYI: I just got Watch Dogs on PS4, and although it's supposed to be only 900p, my eyes can't tell the difference on my HDTV! Very good anti-aliasing.) And I, too, would prefer a locked-down, optimized frame rate over an unstable one. Nothing takes you out of the game more than frame rate fluctuations.

 

As for what kuddlesworth said, if you look at just the games that came out over the past 5 years, there would be tons that probably wouldn't benefit from more than 30fps. Any 2D game, for one thing (even ones with 3D elements, like Limbo, Fez, and a myriad of other titles); in my opinion games like Diablo don't really need it either, nor do 3D action games (God of War-style). Heck, even Minecraft doesn't need more than 30fps (even though I know it runs at 60fps even on Xbox 360). Granted, that's probably only 15-20% of the games out there, but those games exist.


for what kuddlesworth said, if you look at just the games that came out over the past 5 years, there would be tons that probably wouldn't benefit from more than 30fps. Any 2D game, for one thing (even ones with 3D elements, like Limbo, Fez, and a myriad of other titles); in my opinion games like Diablo don't really need it either, nor do 3D action games (God of War-style). Heck, even Minecraft doesn't need more than 30fps (even though I know it runs at 60fps even on Xbox 360). Granted, that's probably only 15-20% of the games out there, but those games exist.

Of course they don't need 60fps, but they would be better if they were 60fps. Just like you could play God of War without normal maps and it would still be playable, but it would look like ass and hence be an inferior experience.

Guys, remember the Sega vs Nintendo days? Even back then, with something like a Mario or Sonic title, a low frame rate was terrible: when too much got on screen, you felt the lag as the frame rate dropped. It wasn't acceptable THEN, and it sure shouldn't be acceptable NOW.

 

This isn't even arguable; it's fact. The devs are just shy of saying low frame rates are superior, and they just plain aren't, at all.

 

 


May I link this video player project that can interpolate 24Hz (or whatever) video up to 60Hz: http://www.svp-team.com/ One of the few things I miss in Linux (the OS)! You need a decent GPU to be able to play high-bit-rate 1920x1080 movies, but it looks awesome!

It's a good idea, but I doubt it would work as it should. With a video file, the computer knows exactly what is in the next frame before it is displayed; when rendering a video game, frames are pushed to the screen as soon as they're ready, and calculating the next frame in advance would effectively mean rendering at 60fps anyway. Implementing that technology in a 30fps game would require a lot of guessing about which way everything is going to move, which would produce artifacts and weird ghosting, or simply not work. Besides, the process would need processing power that, at that moment, is entirely dedicated to the game.
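To make the video-file side of that concrete: an interpolator already has both neighbouring frames decoded, so in the crudest case it can just blend them to invent an in-between frame. The sketch below, in Python with NumPy arrays standing in for decoded frames, is purely illustrative; real tools like SVP estimate motion vectors rather than doing a plain blend. A game has no "next frame" to blend with until it has rendered it, which is exactly the 60fps cost.

import numpy as np

def blend_midframe(frame_a, frame_b):
    # Naive "interpolated" frame: a 50/50 blend of two decoded frames.
    # A video player can do this because frame_b is already decoded;
    # a game would have to render frame_b first. (SVP and friends use
    # motion compensation, not a plain blend; this is only the idea in miniature.)
    return ((frame_a.astype(np.uint16) + frame_b.astype(np.uint16)) // 2).astype(np.uint8)

# toy usage: two 1080p "frames" of random pixels
a = np.random.randint(0, 256, (1080, 1920, 3), dtype=np.uint8)
b = np.random.randint(0, 256, (1080, 1920, 3), dtype=np.uint8)
mid = blend_midframe(a, b)   # slotted between a and b to double the frame count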

Don't ask to ask, just ask... please 🤨

sudo chmod -R 000 /*


60fps is giving peasants a headache? But I thought they could only see 30 of the frames; the rest are invisible to the human eye. Maybe the peasants are evolving!! /r/pcmasterrace 

AMD Ryzen 5900x, Nvidia RTX 3080 (MSI Gaming X-trio), ASrock X570 Extreme4, 32GB Corsair Vengeance RGB @ 3200mhz CL16, Corsair MP600 1TB, Intel 660P 1TB, Corsair HX1000, Corsair 680x, Corsair H100i Platinum

 

 

 


 

It's a good idea, but I doubt it would work as it should. With a video file, the computer knows exactly what is in the next frame before it is displayed; when rendering a video game, frames are pushed to the screen as soon as they're ready, and calculating the next frame in advance would effectively mean rendering at 60fps anyway. Implementing that technology in a 30fps game would require a lot of guessing about which way everything is going to move, which would produce artifacts and weird ghosting, or simply not work. Besides, the process would need processing power that, at that moment, is entirely dedicated to the game.

 

Well, that wasn't my proposal. I just wanted to link it so people can watch some videos in the original vs 60Hz and decide for themselves what the difference feels like.


Well, that wasn't my proposal. I just wanted to link it so people can watch some videos in the original vs 60Hz and decide for themselves what the difference feels like.

Oh, OK. I thought you were proposing to use it for games :)

Don't ask to ask, just ask... please 🤨

sudo chmod -R 000 /*


I'm going to throw this out there in response to the corresponding WAN Show segment:

 

I played Dark Souls on PC without a 60 FPS mod, so it was running at 30 FPS. When I booted up Dark Souls II and noticed it ran at 60 FPS, I instantly missed the cinematic look of Dark Souls at 30 FPS. I didn't search for a config setting or mod to set the FPS to 30 but, looking back, I would've preferred it. It contributed significantly to the original's aesthetic.

 

I like TotalBiscuit and agree with him on several things (including his newish stance on social media and comment sections, if he still feels the same way), but he's just a gamer who's YouTube famous. He isn't any more of an authority on anything than any other gamer who's observant and has some idea of what they're talking about. I also remember a handful of times when the analysis of something specific on the WAN Show was a little off, so I wouldn't be so quick to lay down the law on things like this. It's typically something so minute that it isn't even worth mentioning, but this time it was worth bringing up.

 

This kind of debate runs along the same lines as people saying phone specs are the end-all-be-all and taking a "to hell with experience" attitude, or at least putting experience second to specs. It doesn't make a lot of sense. Higher numbers aren't the only factor in tailoring an experience, and in this case more FPS doesn't necessarily equate to a better experience. In fact, looking back, there have been plenty of times when I thought games I'd been playing could have used a more cinematic feel (whether by limiting frame rate or working some other magic I don't know about). Granted, limiting FPS while playing with mouse and keyboard might be too much of a trade-off to be worth it (I've never tried), but that doesn't mean I wouldn't still wish the game had a more cinematic feel. I'd venture to say that if I were playing a game on PC with a 360 controller, I would almost always prefer to play at 30 FPS.

 

As a semi-related aside, when I walk into a big box store and look at the high refresh rate TVs and how smooth the motion is with the motion blur reduction and other fancy features they use, I always walk away thinking, "I'd really hate to have to watch movies or anything on that thing. It's too smooth--not cinematic at all." You can turn those features off, of course, but that's not the point.

 

I'll also note that I've been an avid PC gamer my entire life. The last console game I played with any seriousness was for the PS2.

 

Edit: It should be a given but I want to make it clear that there are all sorts of different types of games that I prefer to run at the highest FPS possible. Competitive games (FPS, MOBA), MMORPGs, and others. My preference for 30 FPS is pretty much limited to single-player film-like/cinematic games, usually played with a 360 controller.


Yeah, but what you call "film-like/cinematic" is, in the end, just a word for "accustomed to". I won't get into TVs that interpolate frames that don't exist, but movies filmed at 48fps sure do look sharp and smooth... you get used to it.

 

I kind of like Linus's idea of a toggle for consoles. It would raise awareness of higher frame rates; there just aren't enough 60fps games on consoles for people to know about it and ask for it. I remember playing WipeOut HD at 60fps and it was gorgeous. That's on PS3...


I hear what you're saying, but if it were just a matter of being "accustomed to" it, then I'd want to play every game at 30Hz, which isn't the case.

 

Also, if I were going to be "accustomed to" anything, it would be 60Hz, because I've spent most of my time, over most of my years, playing games and using a computer with the monitor set to 60Hz.

 

It's more than just being "accustomed to" something. A big reason the film industry continues to use 24p is, yes, that we're accustomed to it, but also that it adds an effect to the viewing experience.

 

Just as speculation: it seems like any time I watch video shot at a higher frame rate with (for example) human actors, my very first reaction is, "I'm watching actors being recorded by a camera." Whereas there's something about 24p that lets you naturally fall into that "willing suspension of disbelief" mode and more readily accept that the actors on screen are actually the characters they're playing. I'm sure there's a sort of training we go through that helps us along with this, but that isn't the only factor; 24p adds an effect that matters in specific instances we can point to and identify as clearly separate from others. There's a window of frame rates that's optimal for getting the brain to instinctively interpret what's on screen as fantasy, as opposed to higher-frame-rate content where, even though we know we're watching video on a screen, the brain is still grappling with a smoothness that's drawing closer to what we see in the real world.

Also, the film industry settled on 24p because anything below it has too much flicker, which becomes distracting. While I'm sure keeping the frame rate low was originally about cost, they purposely chose the best option, and now that film is increasingly digital they continue to use 24p because it gives you that unique film experience, which just isn't as satisfying or believable at higher frame rates. It's an amount of unrealness (yes, I said unrealness) that gives a film a certain magic.


If devs chose to lower games' graphical quality so that 60fps could be achieved on consoles, or would be easier to achieve on PC, people would then complain about how bad the games look.


 people would then complain about how bad the games look.

That's why consoles are shit! A PC can do 60 FPS, or even 120 FPS, with very good GRAPHICS! 

 

LOL, the PS4 and Xbox One are already at 792p/900p and they still can't hit 60 FPS! :D

 

Devs had better make games for PC.

 

See here for 30 FPS vs 60 FPS; 30fps looks like shit lol:

http://boallen.com/fps-compare.html

Or here, set the first ball to 60 FPS, 1.0 realistic + 200 pixels/s (then change it to 500 px/s),

and the second ball to 30 FPS, 1.0 realistic + 200 pixels/s (then change it to 500 px/s):

http://frames-per-second.appspot.com/
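A quick back-of-the-envelope way to see why the 30fps ball looks choppier: at the same speed it has to cover twice the distance per frame, so every jump is twice as big. A tiny Python sketch using the speeds from the demo above:

# per-frame displacement of the demo ball at each speed/frame-rate combo
for speed in (200, 500):            # pixels per second, as set in the demo
    for fps in (30, 60):
        step = speed / fps          # pixels the ball jumps between consecutive frames
        print(f"{speed} px/s at {fps} fps -> {step:.1f} px per frame")

# 200 px/s: 6.7 px jumps at 30 fps vs 3.3 px at 60 fps
# 500 px/s: 16.7 px jumps at 30 fps vs 8.3 px at 60 fps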

Computer users fall into two groups:
those that do backups
those that have never had a hard drive fail.

