Why do we need more than 24fps in games...

CyberJesus88

The point where it becomes indistinguishable is about 200-400 fps, depending on the person.

No it's not. I've always said it's about 75, though that may be a bit high.

 

http://dwb4.unl.edu/Chem/CHEM869P/CHEM869PLinks/www.ece.wpi.edu/infoeng/textbook/node71.html

Grammar and spelling are not indicative of intelligence/knowledge.  Not having the same opinion does not always mean a lack of understanding.


If I play GO on my 144Hz monitor, even 90 fps is laggy.

Specs of my PC:

CPU: AMD FX 8350 | Motherboard: Gigabyte 990XA UD3 | GPU: Gigabyte GTX 770 Windforce 2GB | HDD: WD Green 2TB | SSD: Corsair Force GT 120GB | RAM: Corsair 8GB (2x4GB) | PSU: CoolerMaster G650M


No it's not. I've always said it's about 75, though that may be a bit high.

 

http://dwb4.unl.edu/Chem/CHEM869P/CHEM869PLinks/www.ece.wpi.edu/infoeng/textbook/node71.html

So you're saying I can't tell the difference between 75 and, say, 144? Because to me it's night and day!

Specs of my PC:

CPU: AMD FX 8350 | Motherboard: Gigabyte 990XA UD3 | GPU: Gigabyte GTX 770 Windforce 2GB | HDD: WD Green 2TB | SSD: Corsair Force GT 120GB | RAM: Corsair 8GB (2x4GB) | PSU: CoolerMaster G650M


I can't even stand less than 120 fps; if it drops, I can tell.

The 24 fps thing is complete bullshit, and the scientists who put this theory out there clearly do not play video games.


I've never used a 120Hz screen, but I have kicked my laptop screen from 60 to 95Hz. There is a pretty definite difference.

NZXT Phantom|FX-8320 @4.4GHz|Gigabyte 970A-UD3P|240GB SSD|2x 500GB HDD|16GB RAM|2x AMD MSI R9 270|2x 1080p IPS|Win 10

Dell Precision M4500 - Dell Latitude E4310 - HTC One M8

$200 Volvo 245

 


I found this link, which describes very aptly why it's more subjective than anything else. Not everyone shares the exact same physiology, and the same can be said for the eyes and the connection between the eyes and the brain itself. I was born blind and remained blind for the first 3-4 months of my life. My cornea is a bit offset, or "bent" as we say here in Norway, meaning I have bad peripheral vision and sometimes can't distinguish details that are far off in the distance.

 

But: I have experienced 24Hz, 30Hz, 60Hz and 120Hz, and I can say without a doubt that it makes a difference. The reason, though, is a bit more complex than people might first perceive.

 

When we increase the frame rate, we not only increase the number of frames but also the amount of light and colour information, and how finely changes in light and colour are captured. Film-makers know this, and it's the reason the latest installment of The Hobbit was shown not only in IMAX but in a glorious 48 fps high-frame-rate version. That means more light, more colour and more detailed transitions.

 

Why would they throw money into incredibly expensive high-frame-rate cameras and projectors if there were no perceivable effect? The producer wouldn't have allowed the move in the first place if that were the case. Consider 3D technology. Originally I said I couldn't see the difference -- until I saw Dredd 3D. Just watching that movie outside of a 3D environment screams gimmick. But once you're in the theatre with a screen that supports the technology (provided you wear the glasses), the 3D scenes become something of an experience rather than a gimmick. So... yeah. I never thought I would write that, but there you go.

 

Add to that the fact that hundreds, even thousands, of reviewers, gamers, benchmarkers and so on get 120Hz screens and rave about the experience, and you'd have to be some kind of paranoid conspiracy theorist to conclude that all of these people are working in cahoots to sell 120Hz screens.

 

Bollocks, I say.

 

If you don't see the difference, there might be something wrong with your eyesight, as there is with mine (I can only perceive newer 3D technology, which is why I originally thought it was nothing more than a gimmick).

 

Also, comparing still images with regard to FPS... is completely missing the point.

Thoroughness rating
#########


I'm not sure about you, but I can definitely see more than 24FPS.

 

A consistent FPS is more important than a higher FPS, in my opinion, as you notice drops in FPS more than anything.
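The point about consistency can be made concrete with a little arithmetic (a hypothetical sketch, not from any post in this thread): two runs can average nearly the same frame rate while one of them hides a handful of long frames that you would feel as stutter.

```python
# Hypothetical frame-time logs (milliseconds per frame) for two runs
# with almost identical average FPS but very different pacing.
even  = [16.7] * 60                # steady, ~60 fps throughout
spiky = [12.0] * 55 + [70.0] * 5   # mostly fast, with 5 long frames

def avg_fps(frame_times_ms):
    """Average frame rate over the whole log."""
    return 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

def worst_frame(frame_times_ms):
    """Longest single frame time -- what a stutter feels like."""
    return max(frame_times_ms)

print(avg_fps(even), worst_frame(even))    # ~59.9 fps, worst frame 16.7 ms
print(avg_fps(spiky), worst_frame(spiky))  # ~59.4 fps, worst frame 70.0 ms
```

Both logs report "about 60 fps", but the second run's 70 ms frames are well above the detection thresholds discussed later in the thread, which is why an average-FPS counter alone can be misleading.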


I'm not sure about you, but I can definitely see more than 24FPS.

 

A consistent FPS is more important than a higher FPS, in my opinion, as you notice drops in FPS more than anything.

 

Definitely... which kind of adds to the point that there is a perceivable difference.

Thoroughness rating
#########


Ever tried 120Hz? The mouse is way smoother; it's awesome.

I got better aim in shooter games,

and in Windows the mouse is way smoother.

You should try it.

Yes, I have, and it is smooth, but I'd much rather have better colors and viewing angles than my last 120Hz panel had.


Definitely... which kind of adds to the point that there is a perceivable difference.

Roll on G-Sync!


The human eye can see beyond 24 FPS. And there are people in the world who can see beyond 30 FPS. So it has become standard that 30 FPS and faster is the "butter" zone.


So you're saying I can't tell the difference between 75 and, say, 144? Because to me it's night and day!

No, the guys over at WPI.edu are, and apparently so are these guys, who have based their research on peer-reviewed work from MIT:

 

http://web.cs.wpi.edu/~claypool/papers/fr/fulltext.pdf

 

What you can perceive, as I have already said, is essentially input lag or the lack thereof.

 

And here's a bit from good old wiki, although I do think they need to update it, because many other scientists, including Dr Tim Smith, have released substantial papers regarding what we perceive and why.

 

The human eye and its brain interface, the human visual system, can process 10 to 12 separate images per second, perceiving them individually.[1] The threshold of human visual perception varies depending on what is being measured. When looking at a lighted display, people begin to notice a brief interruption of darkness if it is about 16 milliseconds or longer.[2] When given a very short, single-millisecond visual stimulus, people report a duration of between 100 ms and 400 ms due to persistence of vision in the visual cortex. This may cause images perceived in this duration to appear as one stimulus, such as a 10 ms green flash of light immediately followed by a 10 ms red flash of light being perceived as a single yellow flash of light.[3] Persistence of vision may also create an illusion of continuity, allowing a sequence of still images to give the impression of motion.

 

 

The visual persistence can be as low as 10 ms, which would give some people (not many) the ability to see 100 fps.
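The numbers quoted above reduce to simple arithmetic: if the shortest event a viewer can detect lasts t milliseconds, then 1000/t is a rough upper bound on the frame rate that viewer could distinguish. A quick sketch (my own illustration, assuming that simple reciprocal relationship, not a claim from the papers):

```python
# Convert a detectable-event duration (in ms) to an equivalent frame rate.
# Assumption: a viewer who can detect an event lasting t ms can, at best,
# distinguish refresh rates up to 1000/t frames per second.

def threshold_to_fps(threshold_ms: float) -> float:
    """Frame rate corresponding to a perception threshold in milliseconds."""
    return 1000.0 / threshold_ms

# ~16 ms interruption threshold from the quoted passage
print(threshold_to_fps(16))  # 62.5 fps
# ~10 ms visual persistence mentioned above
print(threshold_to_fps(10))  # 100.0 fps
```

The same arithmetic shows why the common claims cluster where they do: a 16 ms threshold lands near 60 fps, while a 10 ms one lands at exactly 100 fps.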

Grammar and spelling are not indicative of intelligence/knowledge.  Not having the same opinion does not always mean a lack of understanding.


I found this link, which describes very aptly why it's more subjective than anything else. Not everyone shares the exact same physiology, and the same can be said for the eyes and the connection between the eyes and the brain itself. I was born blind and remained blind for the first 3-4 months of my life. My cornea is a bit offset, or "bent" as we say here in Norway, meaning I have bad peripheral vision and sometimes can't distinguish details that are far off in the distance.

 

But: I have experienced 24Hz, 30Hz, 60Hz and 120Hz, and I can say without a doubt that it makes a difference. The reason, though, is a bit more complex than people might first perceive.

 

When we increase the frame rate, we not only increase the number of frames but also the amount of light and colour information, and how finely changes in light and colour are captured. Film-makers know this, and it's the reason the latest installment of The Hobbit was shown not only in IMAX but in a glorious 48 fps high-frame-rate version. That means more light, more colour and more detailed transitions.

 

Why would they throw money into incredibly expensive high-frame-rate cameras and projectors if there were no perceivable effect? The producer wouldn't have allowed the move in the first place if that were the case. Consider 3D technology. Originally I said I couldn't see the difference -- until I saw Dredd 3D. Just watching that movie outside of a 3D environment screams gimmick. But once you're in the theatre with a screen that supports the technology (provided you wear the glasses), the 3D scenes become something of an experience rather than a gimmick. So... yeah. I never thought I would write that, but there you go.

 

Add to that the fact that hundreds, even thousands, of reviewers, gamers, benchmarkers and so on get 120Hz screens and rave about the experience, and you'd have to be some kind of paranoid conspiracy theorist to conclude that all of these people are working in cahoots to sell 120Hz screens.

 

Bollocks, I say.

 

If you don't see the difference, there might be something wrong with your eyesight, as there is with mine (I can only perceive newer 3D technology, which is why I originally thought it was nothing more than a gimmick).

 

Also, comparing still images with regard to FPS... is completely missing the point.

That link is somewhat misinformed. Whilst a number of people can see 100 fps, they are rather rare. His inclusion of the pilot test as proof of frame rates is wrong; all that test proves is that you can see one frame out of 220, not 220 frames per second. As for the rest, I think I have posted enough peer-reviewed articles to explain how it works and why.

Grammar and spelling are not indicative of intelligence/knowledge.  Not having the same opinion does not always mean a lack of understanding.


So 24 fps is good enough for movies, and according to science we can only see 24 fps.

So why do we need higher frame rates for games?

Opinions?

 

Seeing and feeling are two very different things.

Rig: i7 2600K @ 4.2GHz, Larkooler Watercooling System, MSI Z68a-gd80-G3, 8GB G.Skill Sniper 1600MHz CL9, Gigabyte GTX 670 Windforce 3x 2GB OC, Samsung 840 250GB, 1TB WD Caviar Blue, Auzentech X-FI Forte 7.1, XFX PRO650W, Silverstone RV02 Monitors: Asus PB278Q, LG W2243S-PF (Gaming / overclocked to 74Hz) Peripherals: Logitech G9x Laser, QPad MK-50, AudioTechnica ATH AD700


Movies are different. They don't have input lag like gaming does: when filming, the camera only captures what's in front of it, but a game has to process what you decide to do, where you move, etc. Hope that makes sense.

CPU: i7 8700 GPU: Asus ROG Strix GTX 1080 Mobo: Gigabyte Z370 Gaming 5 Ram: 16GB EVGA SuperSC SSD: Samsung 850 EVO 250GB 
HDD: Seagate Barracuda 2TB PSU: TX650M Case: NZXT S340 Elite OS: Windows 10 Mouse: Logitech G403 Mouse Mat: HyperX Fury S Pro XL 
Keyboard: CM Masterkeys Pro S (reds) Headphones: Sennheiser HD598 Monitor: Asus 24' MG248QR Devices: IPhone 11 Pro Max + 13' Macbook Pro


That link is somewhat misinformed. Whilst a number of people can see 100 fps, they are rather rare. His inclusion of the pilot test as proof of frame rates is wrong; all that test proves is that you can see one frame out of 220, not 220 frames per second. As for the rest, I think I have posted enough peer-reviewed articles to explain how it works and why.

 

No point. I posted a single post here, I believe, and all it stated is that we're digging up an old thread.

At some point this thread just needs to die.

Check out the build: Used to be Obot, now Lilith

Shameless: Me


Try running Minecraft at 30 and at 60 fps; there's a huge difference. 45 is really low for me, which is why I don't want to record MW2 with Fraps :-/

My Rig: AMD Ryzen 5800x3D | Scythe Fuma 2 | RX6600XT Red Devil | B550M Steel Legend | Fury Renegade 32GB 3600MTs | 980 Pro Gen4 - RAID0 - Kingston A400 480GB x2 RAID1 - Seagate Barracuda 1TB x2 | Fractal Design Integra M 650W | InWin 103 | Mic. - SM57 | Headphones - Sony MDR-1A | Keyboard - Roccat Vulcan 100 AIMO | Mouse - Steelseries Rival 310 | Monitor - Dell S3422DWG


That link is somewhat misinformed. Whilst a number of people can see 100 fps, they are rather rare. His inclusion of the pilot test as proof of frame rates is wrong; all that test proves is that you can see one frame out of 220, not 220 frames per second. As for the rest, I think I have posted enough peer-reviewed articles to explain how it works and why.

 

(I'm not trying to start a huge argument, nor am I trying to put down your information, because I do agree with most of it.)

 

 

I don't really care what research says or doesn't say; I'd rather go with my personal experience. There are THOUSANDS of articles saying it's 60, 30, 120, 200, or 1000, and with that many different arguments out there I find it hard to trust just one.

 

When I play games and they dip below 60-80, I can definitely notice a difference.  Moving from a 60Hz 5ms panel to my VG248QEs has been a massive improvement: not only did it make my aim better, I feel as if I'm half a second ahead of everyone else in my game.  Could it be that it's simply a better panel than the crappy HP one I moved from? Maybe, but I know what I see, and I think a lot of others will agree.

 

Arguing over this makes no sense to me, seeing as there are so many different articles/studies on the internet about it, especially since they say "oh well, some people can train themselves to see more, and some are different from others." There are too many variables for there to be a "definitive answer", in my opinion.

 

But I think it does get to a point of diminishing returns. Is there a difference between 100 and 120? Maybe, but I think it would take some serious getting used to the game being played, and all other variables would have to be taken out (other players using different things, different spots of the map, etc.).

Stuff:  i7 7700k | ASRock Z170M OC Formula | G.Skill TridentZ 3600 c16 | EKWB 1080 @ 2100 MHz  |  Acer X34 Predator | R4 | EVGA 1000 P2 | 1080mm Radiator Custom Loop | HD800 + Audio-GD NFB-11 | 850 Evo 1TB | 840 Pro 256GB | 3TB WD Blue | 2TB Barracuda

Hwbot: http://hwbot.org/user/lays/ 

FireStrike 980 ti @ 1800 Mhz http://hwbot.org/submission/3183338 http://www.3dmark.com/3dm/11574089


That link is somewhat misinformed. Whilst a number of people can see 100 fps, they are rather rare. His inclusion of the pilot test as proof of frame rates is wrong; all that test proves is that you can see one frame out of 220, not 220 frames per second. As for the rest, I think I have posted enough peer-reviewed articles to explain how it works and why.

It wouldn't be the first misinforming link in this thread.

But I still stand by the fact that it is individual. Not everyone is born the same, and the differences can be chalked up to strong attributes and deficiencies. We can build an understanding of the similarities between people in general, but I feel the scientific research behind this is not complete. Being scientifically validated means peer review and lots of extra testing, since the process is largely about disproving rather than proving, and the claims we are making (they are nothing more) don't hold weight under scientific scrutiny without exhaustive research into the subject.

So yeah, this thread needs to die.

Thoroughness rating
#########


Try playing at 60, then at 100+. It's all icky and tear-y.

Like E-Sports? Check out the E-Sports forum for competitive click click pew pew

Like Anime? Check out Heaven Society the forums local Anime club

I was only living because it was too much trouble to die.

R9 7950x | RTX4090

 


In a movie, you have the advantage of motion blur between frames, which helps blend the frames together from your point of view. This is not, however, ideal in games, as you may have experienced - I always turn motion blur off due to the lack of clarity you end up with. Also, as games are interactive, you can 'feel' the difference in the input lag. Your brain notices the time between you moving the mouse and the update appearing, and you get a judder as each image appears on the screen, followed by stillness, even if you are moving the mouse smoothly.
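The refresh-rate part of that input lag is easy to quantify (a back-of-the-envelope sketch of my own, ignoring game, driver and display latency): an input that just misses a refresh waits up to one full frame interval before it can appear on screen.

```python
# Worst-case wait between an input and the next frame, per refresh rate.
# This only models the display-refresh component of input lag; the game
# engine, driver and panel each add their own latency on top.

def frame_interval_ms(hz: float) -> float:
    """Duration of one frame, in milliseconds, at a given refresh rate."""
    return 1000.0 / hz

for hz in (24, 60, 120, 144):
    print(f"{hz:>3} Hz: up to {frame_interval_ms(hz):.1f} ms wait per input")
# 24 Hz -> up to 41.7 ms; 60 Hz -> 16.7 ms; 120 Hz -> 8.3 ms; 144 Hz -> 6.9 ms
```

Going from 24 to 144 Hz shrinks that worst-case wait by roughly 35 ms, which is consistent with why posters above describe the mouse as "way smoother" at high refresh rates even when the average frame rate looks fine.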

Check out my video work at youtube.com/SuperUserTech


Acer Aspire 5750G - i5-2410m | 8GB 1333Mhz RAM | 500GB 5400RPM HDD | Nvidia GT540m


2xBenQ EW2440L | Sennheiser HD600 | Corsair Vengeance K65 | Corsair Vengeance M65 | Panasonic Lumix G7 | Panasonic 20mm F1.7

