
What's the framerate and resolution of your eyes?

7 minutes ago, LAwLz said:

Haven't bothered to read all the replies yet but I have seen enough of these threads to know what the answers will be anyway.

 

Humans do not perceive things in frames per second or pixels, and once you try to convert things like cones and rods into "pixels", the discussion quickly turns into "what is a pixel/frame?" and "under what conditions?", and it also varies from person to person.

 

Here are some facts though:

  • Humans can perceive more than 60 frames per second as smoother motion. What is the cap beyond which no person could see a difference, no matter how far we increase the FPS? No idea. What is the cap where humans can no longer distinguish individual frames? No idea either, but my guess is that it's pretty low. 
  • Under ideal conditions, a human can make out fine details from an image we saw for only 1/200 of a second. Whether that means we can "see 200 FPS" is up for debate, but the fact of the matter is that our eyes and brain can create a clear image of something using only 1/200 of a second of exposure time.
  • When it comes to pixels, the "resolution" is quite low but varies across different parts of the eye, and we are more sensitive to certain things, like misaligned lines. ~900 PPI is not really that far-fetched for human eyes assuming perfect conditions and average vision. If you have slightly better than average vision, then seeing above 1000 PPI is entirely possible; you just have to be uncomfortably close to the subject (see the quick sketch right after this list).
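For the curious, here's a rough back-of-the-envelope sketch of that PPI claim (my own numbers, assuming the textbook ~1 arcminute resolving power of average 20/20 vision):

    import math

    def ppi_limit(distance_inches, acuity_arcmin=1.0):
        # Angular size of the smallest resolvable detail, in radians.
        theta = math.radians(acuity_arcmin / 60.0)
        # Physical size of that detail at the given viewing distance.
        pixel_inches = 2 * distance_inches * math.tan(theta / 2)
        return 1.0 / pixel_inches

    for d in (24, 12, 4):  # desk distance, phone distance, uncomfortably close
        print(f"{d:2d} in -> ~{ppi_limit(d):.0f} PPI")

At a 4-inch viewing distance this works out to roughly 860 PPI, which is in the same ballpark as the ~900 PPI figure above.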

I linked a few peer-reviewed articles that put the perceivable frame rate at about 76 FPS (asus killer also linked one after that). Above 76, the ability to pick out individual frames becomes no better than guessing. And visual acuity research explains how the rods and cones give us an effective limit. I also linked a Royal Society lecture on the visual system and how our brains fill in almost all of the information we think we see.

Grammar and spelling are not indicative of intelligence/knowledge.  Not having the same opinion does not always mean a lack of understanding.  


9 minutes ago, mr moose said:

I linked a few peer-reviewed articles that put the perceivable frame rate at about 76 FPS (asus killer also linked one after that). Above 76, the ability to pick out individual frames becomes no better than guessing. And visual acuity research explains how the rods and cones give us an effective limit. I also linked a Royal Society lecture on the visual system and how our brains fill in almost all of the information we think we see.

I'll take a look at those posts, but I think the problem with questions like these is that they are very much open to interpretation.

I quickly looked through this thread, and your definition of "frame rate of the human eye" seems to be "when can we no longer distinguish between two similar images displayed one after the other". That's fine and all, but if we go by the definition "what is the shortest exposure needed for the eye and brain to create an image", then the answer becomes wildly different (above 200 FPS, i.e. a 1/200 s exposure). And if you don't think "shortest exposure" is relevant to frame rate, just imagine a video file containing 100 completely black images, then 1 single photograph, and then 99 more completely black images, played back at 200 FPS.
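To make the arithmetic of that thought experiment concrete, here is a trivial sketch (hypothetical numbers, nothing more than the frame-duration math):

    # 100 black frames, 1 photograph, 99 black frames, played back at 200 FPS.
    FPS = 200
    frames = ["black"] * 100 + ["photograph"] + ["black"] * 99

    frame_duration_ms = 1000 / FPS  # each frame is on screen for 5 ms
    print(f"{len(frames)} frames at {FPS} FPS = {len(frames) / FPS:.0f} s of video")
    print(f"the single photograph is visible for {frame_duration_ms:.0f} ms (1/{FPS} s)")

If you can identify the photograph, then in some meaningful sense you "saw" a 1/200 s frame.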

 

 

At the end of the day, I think there are multiple answers to both the frame-rate limit and the resolution of the eye, and which answer you think is correct depends on how you interpret the question. 


On 11/18/2017 at 9:20 AM, Daring said:

Demonstrably false. The US Air Force found the eyes of pilot trainees were able to identify a simulated enemy aircraft flashed at 220Hz, while a more recent study found that the eye is able to perceive images flashed at 500Hz.

 

You haven't demonstrated that what I said is false.  In the pilot study, the pilots used the afterimage to identify the object shown.  Show them the image for 1/220 s in bright daylight and they will not be able to identify the object, or show them 220 different objects per second and ask them to identify each one.  Don't "flash" the image into their eyes while they're in a dark room, which is what creates that afterimage in the first place; instead, use the dim lighting of a room that hasn't been specifically darkened, or of a forest, and ask them to identify an object that passes through their field of view within 1/220 s.

 

This more recent study seems to address a rather special case and even points out that "Traditional TVs show a sequence of images, each of which looks almost like the one just before it, and each of these images has a spatial distribution of light intensities that resembles the natural world. The existing measurements of a relatively low critical flicker fusion rate are appropriate for these displays."

 

The OP didn't ask about special or unnatural lighting conditions.  I'm not sure what this more recent study is referring to.  My monitor runs at 60Hz, and I don't see it flickering.  How did they actually measure "flickering" in that study?

 

Do you have a study showing at what FPS rate human eyes perceive things when looking at them in nature rather than on a display?

 


5 hours ago, heimdali said:

You haven't demonstrated that what I said is false.  In the pilot study, the pilots used the afterimage to identify the object shown.  Show them the image for 1/220 s in bright daylight and they will not be able to identify the object, or show them 220 different objects per second and ask them to identify each one.  Don't "flash" the image into their eyes while they're in a dark room, which is what creates that afterimage in the first place; instead, use the dim lighting of a room that hasn't been specifically darkened, or of a forest, and ask them to identify an object that passes through their field of view within 1/220 s.

 

This more recent study seems to address a rather special case and even points out that "Traditional TVs show a sequence of images, each of which looks almost like the one just before it, and each of these images has a spatial distribution of light intensities that resembles the natural world. The existing measurements of a relatively low critical flicker fusion rate are appropriate for these displays."

 

The OP didn't ask about special or unnatural lighting conditions.  I'm not sure what this more recent study is referring to.  My monitor runs at 60Hz, and I don't see it flickering.  How did they actually measure "flickering" in that study?

 

Do you have a study showing at what FPS rate human eyes perceive things when looking at them in nature rather than on a display?

 

This guy. I post some top-quality research and he's still like "This doesn't prove what I'm saying is false"

 

If the human eye can't see more than 50FPS, then why are high refresh rate monitors (e.g. 120Hz, 144Hz or 165Hz) so popular? That'd just be a lot of wasted frames, right? The truth is, the human eye does not see in frames. It receives a constant stream of information from the environment, with a delay of about 100 milliseconds. I don't know where you got "the human eye can't see past 50FPS" from, but it's grade A bullshit.


14 hours ago, mr moose said:

So we are not allowed to have a civil discussion about anything if the topic was originally intended as a troll post?

 

How ironic for someone who has "Read the CoC it's like a guide on how not to be moron." in their signature. If you actually read it, you'd see that it's against Da Rulez to make troll posts.

Main rig on profile

VAULT - File Server

Spoiler

Intel Core i5 11400 w/ Shadow Rock LP, 2x16GB SP GAMING 3200MHz CL16, ASUS PRIME Z590-A, 2x LSI 9211-8i, Fractal Define 7, 256GB Team MP33, 3x 6TB WD Red Pro (general storage), 3x 1TB Seagate Barracuda (dumping ground), 3x 8TB WD White-Label (Plex) (all 3 arrays in their respective Windows Parity storage spaces), Corsair RM750x, Windows 11 Education

Sleeper HP Pavilion A6137C

Spoiler

Intel Core i7 6700K @ 4.4GHz, 4x8GB G.SKILL Ares 1800MHz CL10, ASUS Z170M-E D3, 128GB Team MP33, 1TB Seagate Barracuda, 320GB Samsung Spinpoint (for video capture), MSI GTX 970 100ME, EVGA 650G1, Windows 10 Pro

Mac Mini (Late 2020)

Spoiler

Apple M1, 8GB RAM, 256GB, macOS Sonoma

Consoles: Softmodded 1.4 Xbox w/ 500GB HDD, Xbox 360 Elite 120GB Falcon, XB1X w/2TB MX500, Xbox Series X, PS1 1001, PS2 Slim 70000 w/ FreeMcBoot, PS4 Pro 7015B 1TB (retired), PS5 Digital, Nintendo Switch OLED, Nintendo Wii RVL-001 (black)


Most of the "rods" in the human eye is in the center of your vision, and that area is probably smaller than you realize before actually trying to notice.

 

When I look at my 6" smartphone at normal distance the center of vision that have most "rods" is smaller than the size of the screen is.

“Remember to look up at the stars and not down at your feet. Try to make sense of what you see and wonder about what makes the universe exist. Be curious. And however difficult life may seem, there is always something you can do and succeed at. 
It matters that you don't just give up.”

-Stephen Hawking


9 hours ago, LAwLz said:

I'll take a look at those posts, but I think the problem with questions like these is that they are very much open to interpretation.

I quickly looked through this thread, and your definition of "frame rate of the human eye" seems to be "when can we no longer distinguish between two similar images displayed one after the other". That's fine and all, but if we go by the definition "what is the shortest exposure needed for the eye and brain to create an image", then the answer becomes wildly different (above 200 FPS, i.e. a 1/200 s exposure). And if you don't think "shortest exposure" is relevant to frame rate, just imagine a video file containing 100 completely black images, then 1 single photograph, and then 99 more completely black images, played back at 200 FPS.

 

 

At the end of the day, I think there are multiple answers to both the frame-rate limit and the resolution of the eye, and which answer you think is correct depends on how you interpret the question. 

I don't really think a single exposure can be considered a rate (a rate being a recurring event/observation), because it is only one image and thus a single observation.  The study asus killer linked used dissimilar images.  So the answer to frame "rate" is 76 according to MIT.  As for how short a burst of light the eye needs to identify an image, the answer is 1/220 of a second (although I can't seem to find that study anywhere, just references to it).

 

EDIT: Just being my usual semantic self though.  This topic keeps coming up, and people keep trying to dismiss the actual evidence in favor of Battle.net threads and tech-journalist articles.  It's a pet peeve of mine to see people pick and choose when science is relevant. 



2 hours ago, Daring said:

 It receives a constant stream of information from the environment,

Yes, it does receive a constant stream from the environment; however, it can't convert that stream into a constant stream of information. It is broken, piecemeal information that the brain reassembles to give you your vision.

 

 

2 hours ago, tmcclelland455 said:

How ironic for someone who has "Read the CoC it's like a guide on how not to be moron." in their signature. If you actually read it, you'd see that it's against Da Rulez to make troll posts.

Huh?

 

The conversation is not trolling. Just because the OP tried to troll (in your opinion) does not make the thread a lockable offense. 



https://books.google.com/books?id=jzbUUL0xJAEC&pg=PA24#v=onepage&q&f=false

 

10-12 images per second. Everything else looks like motion. It's called visual continuity.

 

I've also read this in a science textbook years ago... it points out that it varies between people, indicating that... wait for it... 'my' eyes may perceive at higher 'framerates' than yours???? Oh, man. Now how do we benchmark that? FurMark for the eyeball... bring me the fuzzy donut.


30 minutes ago, Tiberiusisgame said:

https://books.google.com/books?id=jzbUUL0xJAEC&pg=PA24#v=onepage&q&f=false

 

10-12 images per second. Everything else looks like motion. It's called visual continuity.

 

I've also read this in a science textbook years ago... it points out that it varies between people, indicating that... wait for it... 'my' eyes may perceive at higher 'framerates' than yours???? Oh, man. Now how do we benchmark that? FurMark for the eyeball... bring me the fuzzy donut.

 

I'm afraid 10-12 is the minimum required for most people to see motion as continuous. It is not, however, the maximum we can perceive, nor the ideal. 



1 hour ago, mr moose said:

Huh?

 

The conversation is not trolling. Just because the OP tried to troll (in your opinion) does not make the thread a lockable offense. 

lolok bye Felicia



The human eye is a complex system for sure.

 

As has been stated, the eye has cones for sensing color and rods for sensing light.  On top of that, eyes vary between genders and among people in their sensitivity to motion and color.  Another interesting thing: we all have a blind spot in each eye, which our brain fills in.  Why?  Our eyes lack receptors (rods or cones) where the optic nerve and blood vessels leave the eye.  Makes for some fun optical illusions.  http://www.scholarpedia.org/article/The_Blind_Spot

 

Want something weird?  Only females can genetically wind up with an extra color cone in their eyes, allowing slightly higher color sensitivity.  The offset: if they have sons, the poor kid can wind up color blind.  So our genetics play a big role in how each person's eyes develop.

 

Want an interesting video on the subject?  Look up how Cuphead managed its hand-drawn animation while keeping the game running at 60FPS.

2023 BOINC Pentathlon Event

F@H & BOINC Installation on Linux Guide

My CPU Army: 5800X, E5-2670V3, 1950X, 5960X J Batch, 10750H *lappy

My GPU Army: 3080Ti, 960 FTW @ 1551MHz, RTX 2070 Max-Q *lappy

My Console Brigade: Gamecube, Wii, Wii U, Switch, PS2 Fatty, Xbox One S, Xbox One X

My Tablet Squad: iPad Air 5th Gen, Samsung Tab S, Nexus 7 (1st gen)

3D Printer Unit: Prusa MK3S, Prusa Mini, EPAX E10

VR Headset: Quest 2

 

Hardware lost to Kevdog's Law of Folding

OG Titan, 5960X, ThermalTake BlackWidow 850 Watt PSU


On 11/20/2017 at 6:39 PM, Daring said:

This guy. I post some top-quality research and he's still like "This doesn't prove what I'm saying is false"

 

If the human eye can't see more than 50FPS, then why are high refresh rate monitors (e.g. 120Hz, 144Hz or 165Hz) so popular? That'd just be a lot of wasted frames, right? The truth is, the human eye does not see in frames. It receives a constant stream of information from the environment, with a delay of about 100 milliseconds. I don't know where you got "the human eye can't see past 50FPS" from, but it's grade A bullshit.

You didn't look at what the study is about and what it says, and you didn't understand what I'm saying.

 

Monitors showing more than 50 or 60fps can be an advantage when playing games under some circumstances because they might allow you to see things earlier than you otherwise would; nothing is being said about which of the frames you're presented with you actually perceive.  That's the only explanation that comes to mind --- perhaps you have a better one.

 

Motion looks fluid at about 50fps, and it still does at 165, so what really is the advantage of a monitor that can display 165fps, other than the questionable advantage for playing games and being a marketing vehicle supposed to bring more money into the manufacturers' pockets?  Can you somehow see text in an editor or numbers in a spreadsheet better on it --- or a movie, which likely hasn't been recorded at 165fps?

 

Such monitors being popular, if they are, is due to marketing.  If not, please do tell what their actual advantage is.  Do they have a better picture?  Do they consume less electricity?  Do they generate less heat?  Do they last longer?  Are they more reliable?  Easier to transport or simpler to adjust?

 

Can you show a study with clear evidence that observers can see more on a display at 165fps than on a display at 60Hz, "more" meaning that a significantly larger amount of information is perceived during a given amount of time?  Mind you, you've got more than twice the fps, so observers should be able to get at least twice as much information from a 165fps display as from a 60fps one.

 

Sure, you can display the frames of a movie each for a shorter interval of time, without dropping any, on a display that shows more fps, so you could watch two movies, or one movie twice, within the same amount of time.  You can do the same on the slower display while dropping half of the frames, and yet you'd have a hard time showing in which case an observer perceives more or less information, because a movie is a bad example for this.  I'm sure you can find a better one ...

 

Go ahead, put 165fps displays into the schools, and you'll need fewer teachers while the kids learn twice as fast.  Yeah, sure ...

 


Just now, heimdali said:

Motion looks fluid at about 50fps

The illusion of motion begins at around 24 frames per second, but to avoid perceptible flicker and the eye strain that comes with it, we need a minimum of about 48 frames per second (it's why film projectors flash each 24fps frame two or three times). Notice how I said "minimum"; that's because our eyes don't see at a maximum frame rate.

 

2 minutes ago, heimdali said:

If not, please do tell what their actual advantage is.

Response time. Rounding off a bit, 30FPS means a frame time of 33.3ms, 60FPS is 16.7ms and 120FPS is 8.3ms. Not only do your games and even your operating system look smoother, they also feel more responsive. That is why competitive gamers like to buy high refresh rate monitors: the reduced input lag means the game will accept their inputs faster. Makes sense, yes?
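Those numbers are just 1000 divided by the frame rate; a quick sketch of my own shows how the gains shrink as the rate climbs:

    # Frame time is the reciprocal of the frame rate.
    def frame_time_ms(fps):
        return 1000.0 / fps

    for fps in (30, 60, 120, 144, 165, 240):
        print(f"{fps:3d} FPS -> {frame_time_ms(fps):4.1f} ms per frame")

Going from 30 to 60 FPS saves 16.7ms per frame; going from 144 to 165 saves less than 1ms. Diminishing returns, but not zero.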


I may be completely wrong, but I notice a huge change going from a 60hz to a 144hz monitor, and I have seen 240hz monitors and can detect the differences pretty easily with my eyes. If I had to guess, the point where more hz makes very little difference would probably be around 500-600hz, and anything above that would have such diminishing returns that it wouldn't make sense, even though technically your eyes see all the frames even when exposed to higher hz.

OS: Windows 10 Pro

CPU: AMD Ryzen Threadripper 1950X

Cooling: Enermax LiqTech 360mm TR4
Motherboard: AORUS X399 Gaming 7
PSU: Corsair RM1000i

RAM: 32 GB G.Skill Trident Z RGB 3600MHz
M.2: 250GB Samsung 960 EVO
GPU: AORUS GeForce GTX 1080 8GB Xtreme OC Edition


20 minutes ago, Daring said:

The illusion of motion begins at around 24 frames per second, but to avoid perceptible flicker and the eye strain that comes with it, we need a minimum of about 48 frames per second (it's why film projectors flash each 24fps frame two or three times). Notice how I said "minimum"; that's because our eyes don't see at a maximum frame rate.

 

Response time. Rounding off a bit, 30FPS means a frame time of 33.3ms, 60FPS is 16.7ms and 120FPS is 8.3ms. Not only do your games and even your operating system look smoother, they also feel more responsive. That is why competitive gamers like to buy high refresh rate monitors: the reduced input lag means the game will accept their inputs faster. Makes sense, yes?

That's what I said.

 


Just now, heimdali said:

That's what I said.

You seemed to imply it was marketing when it's much more than just that.


11 minutes ago, awolive said:

I may be completely wrong, but I notice a huge change going from a 60hz to a 144hz monitor, and I have seen 240hz monitors and can detect the differences pretty easily with my eyes. If I had to guess, the point where more hz makes very little difference would probably be around 500-600hz, and anything above that would have such diminishing returns that it wouldn't make sense, even though technically your eyes see all the frames even when exposed to higher hz.

What actually changed, and how did you perceive this difference?  Did you see the numbers in your spreadsheets faster or something?

 


1 minute ago, Daring said:

You seemed to imply it was marketing when it's much more than just that.

OK, maybe I should have been clearer about it.  Yet a few gamers being fond of faster monitors because they give them an actual or imagined advantage doesn't really constitute great popularity for such monitors.  How many of them are being sold, and how do the numbers compare to those for slower monitors?

 


2 minutes ago, heimdali said:

OK, maybe I should have been clearer about it.  Yet a few gamers being fond of faster monitors because they give them an actual or imagined advantage doesn't really constitute great popularity for such monitors.  How many of them are being sold, and how do the numbers compare to those for slower monitors?

Yeah, I can agree there. They're not as common among more casual folk, because they... well, high refresh rate monitors get pretty damn expensive. The one I've been looking at (Dell S2417DG) runs for about $400, wew. Outside of enthusiasts, people don't really seem to like putting so much into a computer or peripherals.


26 minutes ago, heimdali said:

What actually changed, and how did you perceive this difference?  Did you see the numbers in your spreadsheets faster or something?

 

Do yourself a favor: buy a 144hz monitor, play a game that you can run at 144+ FPS, and compare it to a standard 60hz monitor. If you can't detect the difference in smoothness of motion, then you should probably RMA your eyes lol



Here is something people need to remember: you aren't seeing frames with a screen; you don't physically see the frames change every second. What you perceive is the smoothness of the movement and the blur. 


1 hour ago, awolive said:

Do yourself a favor: buy a 144hz monitor, play a game that you can run at 144+ FPS, and compare it to a standard 60hz monitor. If you can't detect the difference in smoothness of motion, then you should probably RMA your eyes lol

So this is the one special case again in which it might have an advantage.

 

But why or how would you say that motion appears smoother when it already appears smooth at 60fps?  It can't appear smoother than smooth, and at some point it may blur because the movement is too fast for your eyes to follow.

 


If you're implying that anything above 60hz or 60FPS has no advantages outside of gaming, then that's probably mostly true. But the only people interested in high refresh rates are... you guessed it: GAMERS!! So if you don't play high-motion video games, then yes, it's a complete waste of money and will have little to no benefit for browsing the web, watching 24fps movies, listening to music, or editing photos.



4 minutes ago, awolive said:

If you're implying that anything above 60hz or 60FPS has no advantages outside of gaming, then that's probably mostly true. But the only people interested in high refresh rates are... you guessed it: GAMERS!! So if you don't play high-motion video games, then yes, it's a complete waste of money and will have little to no benefit for browsing the web, watching 24fps movies, listening to music, or editing photos.

120Hz is actually very good for video watching.

120Hz is the lowest common monitor refresh rate that divides evenly by 24, 30 and 60 FPS, removing the need for the 3:2 pulldown that makes video judder.
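A quick check of the divisibility argument (my own sketch; an even multiple means every source frame is held on screen for a whole number of refreshes):

    # Which common refresh rates are integer multiples of the usual
    # content frame rates (24, 30 and 60 FPS)?
    content_rates = (24, 30, 60)
    for hz in (60, 75, 120, 144, 165, 240):
        even = all(hz % rate == 0 for rate in content_rates)
        print(f"{hz:3d} Hz: {'even for 24/30/60' if even else 'needs pulldown for some rates'}")

240Hz passes the same test, which is why "lowest" rather than "only".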

