
How many FPS can a human eye see?

GamerGuy

I think we see around 120 fps, but what we see isn't as smooth as what we see on a 120Hz TV, because our eyes effectively have a 360-degree shutter angle, which means more motion blur than the TV.
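If you want rough numbers for that shutter-angle idea, per-frame exposure time is just (shutter angle / 360) * frame duration. A minimal Python sketch with purely illustrative values (the exposure_ms helper is made up for this example, not taken from anywhere above):

```python
# Rough illustration of the shutter-angle idea: the longer each frame's
# exposure, the more motion blur. Values are illustrative only.

def exposure_ms(fps: float, shutter_angle_deg: float) -> float:
    """Per-frame exposure time in milliseconds for a given shutter angle."""
    frame_duration_ms = 1000.0 / fps
    return frame_duration_ms * (shutter_angle_deg / 360.0)

print(exposure_ms(24, 180))   # ~20.8 ms - film-style 180-degree shutter at 24 fps
print(exposure_ms(24, 360))   # ~41.7 ms - "always open" shutter, twice the blur per frame
print(exposure_ms(120, 360))  # ~8.3 ms  - a high frame rate keeps blur short even wide open
```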


  • 1 month later...

Folks, there is an underlying misunderstanding of the science behind this whole argument.

 

The problem is that people confuse frames per second and hertz. Monitors refresh their "frame" at 60 hertz - and now we have new spiffy monitors that can do 120 hertz.

 

A "frame" is actually a compilation of TWO images called "fields" that the monitor draws. So, when the monitor refreshes at 60 hz it is only really drawing at 30 fps. In other words, one "frame" takes 2 HZ to draw.

 

No matter how many "FPS" you feed it from your video card - it is still only displaying at a MAXIMUM of 30 fps.

 

So, up until the advent of the new 120Hz monitors, you have NEVER seen an image at more than 30 fps. Guess what - do the math - the new monitors can only draw 60 frames per second max. So, you will not be "seeing" the 90 fps your computer tells you it is outputting - because the monitor simply can't draw that many frames per second.

 

Now, as to how many "fps" the eye can see. First off, many have said - correctly - that we do not see in "fps." True. What we can "see" and what we can "discern," however, are two different animals. The human eye/brain system CAN "see" more than 30 fps, but it cannot DISCERN the difference in an image at more than 30 fps.

 

Movies are at 24 FPS by the way. Yes, the multi-trillion dollar movie industry has figured out that they don't have to feed you movies faster than 24 fps.

 

So, what have we learned so far:

 

1) A "frame" is a compilation of two "fields" that the monitor has to draw.

2) A 60Hz monitor takes two fields and turns them into a frame, so a 60Hz monitor can only draw 30 frames per second.

3) Anyone who tells you they can tell the difference between 30 fps and 100 fps on a 60Hz screen does not understand science - since they are only being given 30 fps MAXIMUM anyway.

 

The "perception" that gamers get from being able to "tell" the difference between more fps is quite simple. Any time you are playing a game that requires a lot of computing power - you might be getting 120 fps. Ok, well thats good. But then you start running around and adding eefects and particles and explosions and the FPS "dips." In other words you aren't getting "120 fps" - you are getting a HIGH of 120fps and a low under 30fps (which is where you can see a visible stutter).

 

So, I am looking forward to playing Battlefield 4. I have played the beta and I am getting 90 fps, but it dips into the 20s under a lot of stress. So, what I need to do is tweak and/or upgrade my computer so that the LOW spikes stay above 30 fps at all times. Even though your monitor can't DRAW faster than 30 fps or 60 fps, it does benefit from your PC producing MORE fps so that the low spikes don't dip into the visible range.

 

What does your monitor do with all the extra fps your computer is sending it? Nothing, it just ignores it. It can't draw beyond its technical capacity.

 

Hope this helps you all understand the myth and the reality of this seemingly endless "what can the human eye see" debate.


That might be true for interlaced monitors.

But since all modern monitors use progressive scan, the whole frame is drawn in a single refresh cycle.
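To put numbers on that distinction, here's a minimal Python sketch (the displayed_fps helper is a made-up illustration, not from any spec): an interlaced display delivers one field per refresh, so it needs two refreshes per complete frame, while a progressive display draws a whole frame every refresh - and in either case the panel can never show more frames than the GPU renders or than it can refresh.

```python
# Minimal sketch: how many complete frames per second a display can actually
# present, given its refresh rate, scan type, and the GPU's render rate.

def displayed_fps(render_fps: float, refresh_hz: float, interlaced: bool = False) -> float:
    # An interlaced display needs two refreshes (two fields) per complete frame;
    # a progressive display draws a whole frame on every refresh.
    max_full_frames = refresh_hz / 2 if interlaced else refresh_hz
    # You can never see more frames than the GPU renders,
    # nor more than the panel can physically refresh.
    return min(render_fps, max_full_frames)

print(displayed_fps(120, 60, interlaced=True))   # 30.0 - the interlaced case described above
print(displayed_fps(120, 60, interlaced=False))  # 60.0 - a modern progressive 60Hz panel
print(displayed_fps(45, 144))                    # 45.0 - here the GPU is the bottleneck
```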


It's a case of diminishing returns. If you start at 7.5 fps you'll notice each frame change; double that to 15 and things start to appear smoother; keep doubling and at 30 fps things start to look fluid, like actual motion (hence modern movies and consoles use it, being the smoothest option with the minimum power consumption). At 60 fps things are as fluid as most people will need (you have to more than double what is already some serious GPU power to hit 120Hz, and put up with overclocked TN panels). At 120 fps it looks just as fluid except in very quick movements, like when you snap the mouse to turn around (as seen in Linus's "can Linus see more than 60fps" video); it's hard to tell, but there is a minor giveaway in small spaces and walkways. Still, like I said, the benefits gained probably outweigh the downsides (added GPU cost and TN panels).

If you were to double again to 240 fps or 480 fps (if monitors could display it), the benefits would be minimal.

I try to get 60 fps in fast-paced stuff like BF3/4 where quick responses are needed, and 30 fps if the game isn't as fast paced (I'll turn settings up until I get maybe 40 fps, giving myself some headroom, then turn on vsync to lock it to 30 fps to prevent tearing).

That said, the eye does have something called persistence of vision, which is essentially motion blur. Our eyes do have a brief resetting period, but it's covered up by our brain, which then sees the next image and melds the two together to give the illusion that something moved.

http://www.youtube.com/watch?v=W-yLfm5HsHc

These guys made me want to buy a Pentium 4 - glad I didn't...
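The diminishing returns above are easier to see in frame times than in frame rates: every doubling of fps halves the time per frame, so the absolute gain shrinks at each step. A quick illustrative Python sketch (just the arithmetic, nothing measured):

```python
# Each doubling of frame rate halves the frame time, so the absolute
# improvement shrinks every step - the diminishing returns described above.

rates = [7.5, 15, 30, 60, 120, 240, 480]
frame_times = [1000.0 / r for r in rates]  # milliseconds per frame

for i, (rate, ft) in enumerate(zip(rates, frame_times)):
    if i + 1 < len(frame_times):
        saved = f", next doubling saves {ft - frame_times[i + 1]:.1f} ms per frame"
    else:
        saved = ""
    print(f"{rate:>5} fps -> {ft:6.1f} ms per frame{saved}")

# Going from 7.5 to 15 fps saves ~66.7 ms per frame; going from 240 to 480 fps
# saves only ~2.1 ms, which is why the first doublings are so much more noticeable.
```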


Humans can individually identify about 24 fps, but you don't see in fps.


 


The eyes are analog and in theory see the world at an infinite (yes, infinite) number of FPS.

It's the brain, however, that processes this information, and it doesn't work on a linear scale. When you feed it information at a fixed FPS, the speed of your brain and how trained your eyes are determine how many fps you can tell the difference between.

 

Otherwise the brain picks up visual changes "as needed" rather than at a fixed, measurable FPS count. A camera flash, for example, lasts a very short time - one could compare it to a single frame at maybe 15000 fps. The brain will still pick it up, because it's a sudden change and not a constant one.

 

It's really not measurable; it has more to do with what you are used to (the placebo effect also comes into play here).


 




The human eye isn't measured in fps. It sees unlimited "fps", but there's natural motion blur. That's why a 120Hz monitor looks so different from a 60Hz one.


 


I saved that fields-and-frames post for future use in a debate class. Hope you don't mind me taking it. Love you


 




I saved that fields-and-frames post for future use in a debate class. Hope you don't mind me taking it. Love you

Except what he said is not true. He is talking about interlaced video (what people would call "1080i"), while monitors with progressive scan (what people call 1080p) can draw 60 full frames per second. So if you were playing Battlefield 3 at a constant 40 fps and it spiked up to 60 fps, you would notice the extra smoothness.
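One more wrinkle worth adding: on a 60Hz progressive panel with plain vsync, a steady 40 fps can actually look worse than the number suggests, because the frames can't line up evenly with the refreshes. A small illustrative Python sketch (assuming simple vsync with no frame cap or adaptive sync; the frames_on_screen helper is made up for this example):

```python
# Illustrative sketch: with plain vsync, at each refresh the panel shows the most
# recent frame that has finished rendering. At 40 fps on a 60Hz panel some frames
# are held for two refreshes and some for one, which shows up as judder; a locked
# 60 fps (or 30 fps) gives a perfectly even cadence.

def frames_on_screen(render_fps: float, refresh_hz: float, refreshes: int = 12):
    sequence = []
    for tick in range(1, refreshes + 1):
        t = tick / refresh_hz                # time of this refresh, in seconds
        ready = int(t * render_fps + 1e-9)   # frames fully rendered by that moment
        sequence.append(ready)               # that frame is what the panel shows
    return sequence

print(frames_on_screen(40, 60))  # [0, 1, 2, 2, 3, 4, 4, ...] repeats = uneven hold times (judder)
print(frames_on_screen(60, 60))  # [1, 2, 3, 4, 5, 6, 7, ...] every frame shown exactly once
print(frames_on_screen(30, 60))  # [0, 1, 1, 2, 2, 3, 3, ...] even again, just half the frames
```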


 


We don't see in frames: http://amo.net/NT/02-21-01FPS.html

This is a good article... I was going to post this in here but you beat me to it.



More frames reduce blur, and that isn't always a good thing. It's good for games, but personally I don't like it in video... 24 fps is what they use for movies; in Europe I believe it's 25 fps. Mexican soap operas are 30 fps... The Hobbit was 48 (that's why it looks like a Mexican soap opera)... So in video/movies it basically comes down to preference. Also, when a camera offers something like 1080p at 120 fps, it's basically meant to be used for slow motion...
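For the slow-motion point, the arithmetic is just the ratio of capture rate to playback rate. A tiny Python sketch with assumed, illustrative numbers:

```python
# Slow motion comes from capturing more frames per second than you play back.
# Illustrative numbers only.

def slowdown_factor(capture_fps: float, playback_fps: float) -> float:
    return capture_fps / playback_fps

print(slowdown_factor(120, 24))  # 5.0 - one real second stretches to five seconds on screen
print(slowdown_factor(120, 30))  # 4.0
print(slowdown_factor(48, 24))   # 2.0 - or play the footage back at full speed as HFR video
```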

"Play the course as you find it. Play the Ball as it lies. And if you can't do either, do what's fair."

 

