
30fps better for storytelling than 60fps?

Testbuilds

It doesn't, simply because the console versions tend not to hold that 30fps, like AC2. But yeah, the AC series is all perfectly playable at 30 fps, and you won't be angry about it like Battlefield. But it never in any way felt better than 60 fps on PC to me.

Never said it was better, only playable. 60 is obviously better, but 30 is tolerable. 

Interested in Linux, SteamOS and Open-source applications? Go here

Gaming Rig - CPU: i5 3570k @ Stock | GPU: EVGA Geforce 560Ti 448 Core Classified Ultra | RAM: Mushkin Enhanced Blackline 8GB DDR3 1600 | SSD: Crucial M4 128GB | HDD: 3TB Seagate Barracuda, 1TB WD Caviar Black, 1TB Seagate Barracuda | Case: Antec Lanboy Air | KB: Corsair Vengeance K70 Cherry MX Blue | Mouse: Corsair Vengeance M95 | Headset: Steelseries Siberia V2

 

 


Never said it was better, only playable. 60 is obviously better, but 30 is tolerable. 

Right, but games like BF3 and AC2 dip to around 20 fps frequently, and that's one of my main issues with 30 fps games. Anything under 30 fps is really unplayable.

muh specs 

Gaming and HTPC (reparations)- ASUS 1080, MSI X99A SLI Plus, 5820k- 4.5GHz @ 1.25v, asetek based 360mm AIO, RM 1000x, 16GB memory, 750D with front USB 2.0 replaced with 3.0  ports, 2 250GB 850 EVOs in Raid 0 (why not, only has games on it), some hard drives

Screens- Acer Predator XB241H (1080p, 144Hz G-Sync), LG 1080p ultrawide, (all mounted) directly wired to TV in other room

Stuff- k70 with reds, steel series rival, g13, full desk covering mouse mat

All parts black

Workstation(desk)- 3770k, 970 reference, 16GB of some crucial memory, a motherboard of some kind I don't remember, Micomsoft SC-512N1-L/DVI, CM Storm Trooper (It's got a handle, can you handle that?), 240mm Asetek based AIO, Crucial M550 256GB (upgrade soon), some hard drives, disc drives, and hot swap bays

Screens- 3  ASUS VN248H-P IPS 1080p screens mounted on a stand, some old tv on the wall above it. 

Stuff- Epicgear Defiant (solderless swappable switches), G600, mounted mic and other stuff.

Laptop docking area- 2 1440p Korean monitors mounted, one AHVA matte, one Samsung PLS gloss (very annoying, yes). Trashy Razer BlackWidow Chroma... I mean, the J key doesn't even click anymore. I've got a Model M I use on it too, but it's time for a new keyboard. Some edgy UtechSmart mouse similar to the G600. Hooked to a laptop dock for both of my Dell Precision laptops. (not the only docking area)

Shelf- i7-2600 non-K (has VT-d), 380T, some ASUS Sandy ITX board, Intel quad NIC. Currently hosts shared files; setting up as a pfSense box in a VM. Also acts as a spare gaming PC with a 580 or whatever someone brings. Hooked into the laptop dock area via USB switch


I'll have to agree with Palmer from Oculus on this one: if you are doing 30 FPS then you failed. Pro tip to game devs: games are not god damn movies. Movies "work" at 24fps because you don't interact with them; movies are passive experiences. If these guys are in a rush to make movies, then damn, they should just go and make movies.

If you're making a movie, sure, 24fps is good for that cinematic motion blur effect (except for movies with high motion like The Hobbit, where I can see why they made it 48fps).

 

So I would actually agree with them on this one. Sorta.

 

Movies are only in 24fps because, back when film was still used, they worked out that 24fps is about the point where our brain stops seeing pictures and sees movement. This was figured out so filmmakers could spend less money on film and still give people an enjoyable experience. Because most people don't know any better, 24fps is still used for movies today. It also provides the advantage of using less digital storage, and it is faster for VFX artists to work with 24fps as opposed to higher frame rates.

 

Spoiler

4790k @ 4.5Ghz 1.180v NZXT Kraken X31 | MSI Z97 Krait | Kingston Hyper X Fury 32GB 1866Mhz, 2 DIMMs white and 2 black | GTX 980 Ti - G1 Gaming | GTX 680 - Reference | SilverStone ST75F-P | Phanteks Enthoo Pro


300fps in Minecraft makes me motion sick, like seriously not kidding

haha IKR!!!!

It's all looks these days.


I'm sorry, but I could see very clearly whose banners they were :P

When they ran past I wasn't able to see much more than just the colours, so my opinion is still unchanged, even though the time of day I was watching might have had something to do with my ability to see clearly.


Right, but games like BF3 and AC2 dip to around 20 fps frequently, and that's one of my main issues with 30 fps games. Anything under 30 fps is really unplayable.

Which is why I said, originally, that a locked 30 fps without dips would make it better and more playable than a capped 30 fps. 
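
(For the curious, the difference between "locked" and "capped" in game-loop terms: a minimal Python sketch of my own, with hypothetical update()/render() stubs. Real engines pace against vsync, but the idea is the same.)

```python
import time

TARGET_FPS = 30
FRAME_BUDGET = 1.0 / TARGET_FPS  # ~33.3 ms per frame

def update():   # hypothetical game logic
    pass

def render():   # hypothetical draw call
    pass

def run_locked():
    # "Locked" means every frame occupies the same wall-clock slot, so the
    # motion cadence stays even. A mere cap only stops you going *above*
    # 30; it does nothing about dips below it.
    while True:
        start = time.perf_counter()
        update()
        render()
        left = FRAME_BUDGET - (time.perf_counter() - start)
        if left > 0:
            time.sleep(left)  # burn off the rest of the 33.3 ms budget
        # If left < 0, the frame blew its budget -- that's a dip, and no
        # limiter can sleep you back in time.
```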

Interested in Linux, SteamOS and Open-source applications? Go here

Gaming Rig - CPU: i5 3570k @ Stock | GPU: EVGA Geforce 560Ti 448 Core Classified Ultra | RAM: Mushkin Enhanced Blackline 8GB DDR3 1600 | SSD: Crucial M4 128GB | HDD: 3TB Seagate Barracuda, 1TB WD Caviar Black, 1TB Seagate Barracuda | Case: Antec Lanboy Air | KB: Corsair Vengeance K70 Cherry MX Blue | Mouse: Corsair Vengeance M95 | Headset: Steelseries Siberia V2

 

 


300fps in Minecraft makes me motion sick, like seriously not kidding

 

But... what? If you have a 300Hz monitor, I really want a link. I need one of those for the bragging rights.


 

Spoiler

My ITX:

240 Air ; Z87I-Deluxe ; 4770K ; H100i ; G1 GTX 980TI ; Vengeance Pro 2400MHz (2x8GB) ; 3x 840 EVO (250GB) ; 2x WD Red Pro (4TB) ; RM650 ; 3x Dell U2414H ; G710+ ; G700s ; O2 + ODAC + Q701 ; Yamaha HTR-3066 + 5.1 Pioneer.

 

Things I Need To Get Off My Shelf:

250D ; 380T ; 800D ; C70 ; i7 920 ; i5 4670K ; Maximus Hero VI ; G.Skill 2133MHz (4x4GB) ; Crucial 2133MHz (2x4GB) ; Patriot 1600MHz (4x4GB) ; HX750 ; CX650M ; 2x WD Red (3TB) ; 5x 840 EVO (250GB) ; H60 ; H100i ; H100i ; H100i ; VS247H-P ; K70 Reds ; K70 Blues ; K70 RGB Browns ; HD650.




 

Every PC gamer in the world right now.

Mobo: Z97 MSI Gaming 7 / CPU: i5-4690k@4.5GHz 1.23v / GPU: EVGA GTX 1070 / RAM: 8GB DDR3 1600MHz@CL9 1.5v / PSU: Corsair CX500M / Case: NZXT 410 / Monitor: 1080p IPS Acer R240HY bidx


Movies are only in 24fps because, back when film was still used, they worked out that 24fps is about the point where our brain stops seeing pictures and sees movement. This was figured out so filmmakers could spend less money on film and still give people an enjoyable experience. Because most people don't know any better, 24fps is still used for movies today. It also provides the advantage of using less digital storage, and it is faster for VFX artists to work with 24fps as opposed to higher frame rates.

Actually, 15 FPS is when the human eye starts seeing movement and not frames. Vsauce has a video that explains it. Go to 2:56 for where it's stated that 15 FPS is the minimum frame rate. The whole thing is definitely worth a watch and fairly appropriate to the topic.

 

That said, I agree that a filmmaker would want as low an FPS as possible so they can save on the cost of film; 24 FPS would seem to be a pretty good middle ground. However, the more digital cameras are used, the less relevant 24 FPS becomes. Storage could become a problem, but HDDs are cheap compared to film. There's really no good excuse not to raise the frame rate of movies at this point.
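
(Quick back-of-the-envelope math on that storage point. These are my own numbers, assuming uncompressed 1080p at 3 bytes per pixel, which is an illustrative worst case rather than what studios actually archive:)

```python
# Rough storage cost of doubling a film's frame rate.
# Assumes uncompressed 1080p at 24 bits (3 bytes) per pixel -- an
# illustrative worst case, not a real production codec.
WIDTH, HEIGHT, BYTES_PER_PIXEL = 1920, 1080, 3
frame_bytes = WIDTH * HEIGHT * BYTES_PER_PIXEL  # ~6.2 MB per frame
runtime_s = 120 * 60                            # a 2-hour film

for fps in (24, 48):
    total_tb = frame_bytes * fps * runtime_s / 1e12
    print(f"{fps} fps: ~{total_tb:.1f} TB uncompressed")
# 24 fps: ~1.1 TB, 48 fps: ~2.1 TB -- a rounding error next to film stock costs.
```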

 

All that said, I'm fine with 24 FPS in movies. I'm not sitting there making a fuss over the frame rate. Frame rate has nothing to do with how I enjoy movies.

 

As far as an FPS closer to 24 providing better storytelling: total bullshit. What story is made better by frame rate? How would frame rate even make a story better? Whether your story is shit or great, the frame rate isn't going to make the movie better or worse. Sure, with movies, for a while audiences will notice that something is different. They'll notice a soap opera type of effect, but as movie studios learn how to alleviate some of the headaches of higher frame rates and as audiences get used to it, it won't be a big deal anymore. People will wonder why we ever used 24 FPS in the first place.

 

For games, it's pretty well established that higher frame rates generally provide a better play experience. Games should run at 60 FPS minimum. You could then make cutscenes in 30 FPS so you get the more cinematic feel, but after playing the game at 60 for so long, you're gonna see the difference and it's going to distract you from the story the developer is trying to tell, completely negating any supposed benefit of being "closer to a cinematic feel."

 

When are developers gonna wake up and realize games are not fucking movies?! When is the industry going to realize that it doesn't have to confine itself to "what would the movie industry do"? When are these developers going to get their fucking heads out of their asses AND TREAT THEIR OWN INDUSTRY AS ITS OWN ENTITY?! The point has come and gone when the game industry needs to realize that it is its own person and isn't confined to its "parent's" (aka the movie industry) way of thinking. They also need to stop treating consumers like idiots. No one is falling for this bullshit they keep feeding us. We know you aren't making your games run at 30 FPS because it's "more cinematic." We know that the current gen of consoles is shit and won't last 10 years. We're not stupid, and we deserve better; the industry deserves better.

 

Didn't intend to go on a rant, but there ya go.


Prerendered scenes may be better that way, because they are practically small movie clips and contain normal motion blur. But anything else? No. It's never acceptable to 'choose' 30fps.

i5 4670k - MSI GTX 770 Gaming - Fractal Design Define R4 (windowed) - MSI Z87-G45 Gaming - be quiet! Dark Rock Pro 2 - Corsair Vengeance 8GB (LP) - WD Black 1TB - 256GB SSD - Corsair TX750M - Ducky Shine 3


45 is as low as I'll go for slower games, but fast games require 60+ fps.

 

I don't know why it's always 60 or 30 for console games, though; 45fps could be their sweet spot.
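
(There is actually a pacing reason for the 60-or-30 convention: on a vsynced 60Hz TV, only even divisors of the refresh rate give a steady cadence. A quick sketch of my own to illustrate, assuming a simple swap-at-next-refresh model:)

```python
import math

# With vsync on a 60 Hz display, a finished frame can only be shown at
# the next refresh boundary. Count how many refreshes each frame stays
# on screen for a given render rate.
REFRESH_HZ = 60

def vsync_cadence(fps, frames=9):
    refresh = 1.0 / REFRESH_HZ
    cadence, shown_until = [], 0.0
    for i in range(1, frames + 1):
        ready = i / fps                              # frame i finishes here
        swap = math.ceil(ready / refresh) * refresh  # next refresh boundary
        cadence.append(round((swap - shown_until) / refresh))
        shown_until = swap
    return cadence

print(30, vsync_cadence(30))  # [2, 2, 2, 2, 2, 2, 2, 2, 2] -> even pacing
print(45, vsync_cadence(45))  # [2, 1, 1, 2, 1, 1, 2, 1, 1] -> visible judder
```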


30 fps is not even good to watch porn.

You would know bro! 

 

Honestly, if I wanna watch film, I'll watch freaking film! Grab the projector, kids, I'm watchin' crap on film!

I'm Batman!

Steam: Rukiri89 | uPlay: Rukiri89 | Origin: XxRukiriXx | Xbox LIVE: XxRUKIRIxX89 | PSN: Ericks1989 | Nintendo Network ID: Rukiri

Project Xenos: Motherboard: MSI Z170A M9 ACK | CPU: i7 6700k | RAM: G.Skill TridentZ 16GB 3000MHz | PSU: EVGA SuperNova 850W G2 | Case: CaseLabs SMA8 | Cooling: Custom Loop | Still in progress


Framerate has nothing at all to do with the ability of a game designer to write a good storyline; this makes no sense. 60fps is better than 30fps in every single way. The only game ever made that is better at 30fps than 60fps is South Park: The Stick of Truth, because the designers wanted to create a game that looked like the show, with the same animations, and those animations were running at roughly 30fps on TV, so they had to emulate that. So if game designers want to run their games at 30fps, then go and make a TV show or a film, not a game.

 (\__/)

 (='.'=)

(")_(")  GTX 1070 5820K 500GB Samsung EVO SSD 1TB WD Green 16GB of RAM Corsair 540 Air Black EVGA Supernova 750W Gold  Logitech G502 Fiio E10 Wharfedale Diamond 220 Yamaha A-S501 Lian Li Fan Controller NHD-15 KBTalking Keyboard


We're done with this discussion! 30fps is a failure! Now please accept that once and for all.

Personal Build Project "Rained-On"

helped build up the CPU Overclocking Database and GPU Overclocking Database, check them out ;)

#KilledMyWife #MakeBombs #LinusIsNotFunny || Please, don't use non-default grey font colors. Think about the night-theme users! ;)


Of course 30 FPS is shit! LOL

 

They make next-gen games 30 FPS because consoles are too shitty to run games at 60 FPS!!!! Do you understand, console fanboy noobs?

 

Configure one of these at 30 FPS and a second at 60 FPS, and tell me that 30 FPS is better because of its "magical shitty shit":

http://frames-per-second.appspot.com/
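
(If that link ever dies, the comparison is easy to rebuild yourself. A rough sketch of my own in Python, assuming pygame is installed and a 60Hz display; the top box updates every frame, the bottom one every other frame, which is exactly the 60-vs-30 difference the demo shows:)

```python
# Minimal 60 fps vs 30 fps side-by-side scroll test, in the spirit of
# the linked demo. Assumes `pip install pygame` and a 60 Hz monitor.
import pygame

pygame.init()
screen = pygame.display.set_mode((800, 400))
clock = pygame.time.Clock()
x60 = x30 = 0.0
frame = 0

running = True
while running:
    for event in pygame.event.get():
        if event.type == pygame.QUIT:
            running = False

    x60 = (x60 + 4) % 800      # top box: moves every frame (60 fps)
    if frame % 2 == 0:
        x30 = (x30 + 8) % 800  # bottom box: moves every other frame (30 fps)
    frame += 1

    screen.fill((0, 0, 0))
    pygame.draw.rect(screen, (255, 255, 255), (int(x60), 100, 60, 60))
    pygame.draw.rect(screen, (255, 255, 255), (int(x30), 240, 60, 60))
    pygame.display.flip()
    clock.tick(60)             # pace the loop at 60 fps

pygame.quit()
```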

Computer users fall into two groups:
those that do backups
those that have never had a hard drive fail.


The only way I would find 30fps acceptable for a game would be something like a point-and-click adventure, or something like the Telltale games; basically movies.

 

The problem with 30fps is not JUST that the graphics are a slow, blurry mess, it's that the game FEELS sluggish; it makes input feel slower than it does at 60fps. Yes, some games may not benefit much from running at 60fps, but they will not be harmed by it either, whereas plenty more are harmed by being forced to run at 30.
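
(That sluggish feel falls straight out of the frame time. A rough worked comparison, using a simplified render-ahead model of my own; real pipelines buffer differently, but the ratio holds:)

```python
# Simplified input-to-photon model: an input sampled just after a frame
# starts has to wait out that frame, then ride through the swap chain.
# Real pipelines differ in detail; the 30-vs-60 ratio is the point here.
def worst_case_input_lag_ms(fps, buffered_frames=2):
    frame_ms = 1000.0 / fps
    return frame_ms * (1 + buffered_frames)

for fps in (30, 60):
    print(f"{fps} fps: up to ~{worst_case_input_lag_ms(fps):.0f} ms before you see your input")
# 30 fps: ~100 ms, 60 fps: ~50 ms -- the sluggishness is measurable, not imagined.
```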

 

With all the attention this is getting, I feel like devs will be forced to make 60fps the new benchmark so they can go to the media and be like "we heard our audience and we have made everything run at 60fps"... (at 640x480 no doubt).

Desktop - Corsair 300R i7 4770k H100i MSI 780 Ti 16GB Vengeance Pro 2400MHz Crucial MX100 512GB Samsung Evo 250GB 2TB WD Green, AOC Q2770PQU 1440p 27" monitor Laptop Clevo W110ER - 11.6" 768p, i5 3230m, 650M GT 2GB, OCZ Vertex 4 256GB, 4GB RAM, Server: Fractal Define Mini, MSI Z78-G43, Intel G3220, 8GB Corsair Vengeance, 4x 3TB WD Reds in RAID 10, Phone Oppo Reno 10x 256GB, Camera Sony A7iii


30 fps is playable only if the frame rate is locked to 30 without any dips. The issue with low frame rate isn't really the number itself, but the frame rate dips, which are much more noticeable at lower frame rates.

 

Nnnnnnnnnno.. no ... noonononononon... nooooo! No.

 

No.


Security Analyst & Tech Enthusiast

Ask me anything.


"30 FPS isn't a design choice... It's a failure" Luke's Quote of Palmer :D

PC Specs: - *NZXT Phantom 410 Black/Orange* - *AMD FX-8320 3.5GHz* - *CM Hyper 212 EVO* - *Gigabyte 990FXA-UD3* - *Corsair Vengeance 8GB 1600MHz* - *Gigabyte 780 Ti* - *Seagate Barracuda 500GB 7200rpm HDD* - *ModXStream Pro 600W PSU* -

Monitors: 2x BenQ GL2450 and 1x some 22" 1080p TV


It feels like they are trying everything to make us accept 30 FPS, but seriously, fuck off, we won't play at 30 FPS.

Dark Souls 1 was horrible at 30; I installed a mod to make it run at 60 so I could actually play.


Okay, this post is going to cause some cringes, heart attacks, laughter, crying and more.

 

Currently I am bound to a Sony VAIO laptop (4GB, i3, no dedicated GPU, 60Hz screen), and I'm mostly playing Blacklight: Retribution. Quite a nice shooter and all, but that is not my point.

I am playing that game at 15-30fps and have no trouble unless it stays at 15FPS for longer stretches (5 seconds and above). Bad scores due to lower fps? Nope. I recently played a match with a 19/5 K/D.

Noticeable drops in FPS? Nope, at least not unless my internet is acting up at the same time.

Hurting eyes and troubled sight? Nope, never had that either.

So yeah, in my opinion (internet + opinion = not good) this whole discussion is based on multiple factors: the refresh rate of your screen, the fps of the game, and probably the distance and quality of the picture.

Remember when the refresh rate of PCs used to be mostly 30Hz? Back then it was still possible to play games at 15fps without much trouble; the same could be seen on 60Hz screens with 30FPS.

But recently a lot of gamers started moving to 120Hz screens while games are still aimed at 60Hz and are therefore meant to run at 30-60FPS without trouble.

I am sure we will get all this trouble again once the market moves on and goes even higher on refresh rates while the games stay behind.

But right now we are moving (or not) towards 4K screens and 120Hz screens at the same time, and that is where the problem is: the manufacturers can't keep up with two requests from the audience at once. Okay, 4K is winning, but that is only temporary, since it is all new and shiny (although a Canon 350D can shoot at 3456x2304, which is near 4K; higher-end models go even further, roughly 5.6k x 3-4k on the 600D), and it won't catch on in the market due to the prices and the low amount of support there is at the moment.

May the light have your back and your ISO low.


24 FPS only in movies (the soap opera effect makes movies look fucking stupid imo).

60+ FPS in games.

That should be the standard.

"It seems we living the American dream, but the people highest up got the lowest self esteem. The prettiest people do the ugliest things, for the road to riches and diamond rings."- Kanye West, "All Falls Down"

 


Why do we give a shit about what these idiots think? Posting ridiculous crap for click-baiting and to appeal to the ignorant console user base.

 

These are video games; the higher the better. Why the fuck is there still an argument... oh wait, I know why: because many people are ignorant, and then we get big companies and developers saying ridiculous crap like The Order: 1886 devs recently, further feeding this ignorance and delusion.

 

I'm so fucking sick of hearing about this, and you can replace framerate with various things such as resolution... can we please stop...

“The mind of the bigot is like the pupil of the eye; the more light you pour upon it the more it will contract” -Oliver Wendell Holmes “If it can be destroyed by the truth, it deserves to be destroyed by the truth.” -Carl Sagan


After playing a game at 30fps for an hour my eyes get used to it, and it looks perfectly fine to me.

30fps is fine once your eyes adjust.

Although I feel this article is intended for cinematic-heavy games rather than action-intensive games.

I would much rather have a better-looking game at 30fps than a terrible-looking game at 60fps.

Although 60fps > 30fps, I would rather have better graphics at a lower frame rate than a higher frame rate with lower graphical fidelity for a story-driven game.


Oh! I found a way to prove this wrong! @Slick @LinusTech The only games that would benefit would be AR games, because the camera would add in its own motion blur, creating a true-to-life scene at 30fps. This is the only example I can think of, and even an AR FPS game similar to the AR paintball recently discussed would have to run at more than 30. Next next-gen consoles better have 30 more fps in addition to 30% more gen if they are ever going to be competitive in the gaming market.

