Why do we need more than 30 FPS in virtual environments?

IAmAndre

Hi,

 

I know it's a question that has been tackled quite a few times by Linus, but I'm still confused as to why 24/30 FPS is fine for movies, yet some people want up to 144 FPS when gaming. I've been watching both gaming and real-world videos on YouTube at 60 FPS, and I can tell there's a difference between 30 and 60 FPS, but as far as I can see it's not something that would make me upgrade my GPU.

So to summarize, I know it has something to do with "real" light, but I'm looking for a complete explanation: why do we "need" over 30 FPS for CGI environments?

 

Thanks


We don't need it, we just like it because it looks waaaaaaaaaaaaaaaaaaaaay better than 30.

✨PC Specs✨

AMD Ryzen 7 3800X | MSI MPG B550 Gaming Plus | 16GB Team T-Force 3400MHz | Zotac GTX 1080 AMP EXTREME

BeQuiet Dark Rock Pro 4 Samsung 850 EVO 250GB | NZXT 750W | Phanteks Eclipse P400A

Extras: ASUS Zephyrus G14 (2021) | OnePlus 7 Pro | Fully restored Robosapien V2, Omnibot 2000, Omnibot 5402


tbh I've never had a console or anything else that could hold 30 fps without looking absolutely terrible (smh consoles), so I honestly couldn't tell you how 30 fps stacks up against 60 fps. My GPU always reaches 60, so I've had no real reason to look at 30.

Shipping sucks


The thing is, a movie can get away with 24 fps because it has a consistent frame time between frames. That way our brain doesn't have to work as hard to "fill in the blanks" between them.

 

However, the nature of a game means you cannot fully control the frame time, since each frame has to be rendered in real time. For example, the average frame time at 30 fps is about 33 ms, but the actual time to render any given frame can range from 5 to 50 milliseconds, which produces the "stuttering" we observe.

 

At 60 fps, the average frame time drops to about 17 ms, which makes the variance in frame time less noticeable, and on higher-refresh-rate panels it becomes essentially unnoticeable.
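To put rough numbers on that, here's a tiny sketch (pure illustration; the frame times below are made up, not measurements):

# Hypothetical frame times, in milliseconds, for two runs that hit their average differently.
frame_times_30 = [20, 45, 28, 50, 22, 35]   # averages ~33 ms, but very uneven pacing
frame_times_60 = [15, 18, 16, 19, 15, 17]   # averages ~17 ms, much tighter spread

def summarize(times_ms):
    avg = sum(times_ms) / len(times_ms)
    fps = 1000 / avg                          # frames per second from average frame time
    jitter = max(times_ms) - min(times_ms)    # spread between best and worst frame
    return fps, avg, jitter

for label, times in (("~30 fps run", frame_times_30), ("~60 fps run", frame_times_60)):
    fps, avg, jitter = summarize(times)
    print(f"{label}: {fps:.0f} fps average, {avg:.1f} ms per frame, {jitter} ms jitter")

The 30 fps run swings by 30 ms between its best and worst frame, which is exactly the unevenness you feel as stutter; the 60 fps run only swings by a few milliseconds.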

Me: Computer Engineer. Geek. Nerd.

[Educational] Computer Architecture: Computer Memory Hierarchy

[Educational] Computer Architecture:  What is SSE/AVX? (SIMD)


...motion...blur

                     .
                   _/ V\
                  / /  /
                <<    |
                ,/    ]
              ,/      ]
            ,/        |
           /    \  \ /
          /      | | |
    ______|   __/_/| |
   /_______\______}\__}  


[i7-7700k@5Ghz | MSI Z270 M7 | 16GB 3000 GEIL EVOX | STRIX ROG 1060 OC 6G | EVGA G2 650W | ROSEWILL B2 SPIRIT | SANDISK 256GB M2 | 4x 1TB Seagate Barracudas RAID 10 ]

[i3-4360 | mini-itx potato | 4gb DDR3-1600 | 8tb wd red | 250gb seagate| Debian 9 ]

[Dell Inspiron 15 5567] 

 

 


24 FPS for cinema is the result of balancing "smooth enough" motion (helped by motion blur), "don't blow too much money on film", and "sync well enough with audio" during the early days of the "talkies", and it has stuck ever since. The celluloid used for physical film reels was expensive back then, so using as little of it as possible while keeping the frame rate high enough to trick the eye, and staying in sync with the audio track, was the balancing act that landed on 24 fps.

 

25 FPS and 30 FPS are the result of PAL (25) and NTSC (30) being based on the AC power systems of Europe (220-240 V @ 50 Hz) and North America (110-120 V @ 60 Hz) respectively, because of frame interlacing for TV (the odd lines of a frame are drawn on one half of the power cycle, the even lines on the other).

 

60 FPS is considered the baseline for 'smooth' in computer gaming, an evolution of the 30 fps / 60 Hz factor from TV standards. 90+ fps is considered the minimum for realistic animation with minimal discomfort in VR, because of the disconnect between what your eyes see and what your body feels in a VR simulation. A nice side effect is that higher frame rates look smoother and more realistic with minimal motion blur (the trick that makes the eye perceive smooth motion at 24/25/30 fps when there isn't any), which is why 120 Hz and 144 Hz monitors are considered "must have" by competitive gamers, particularly for shooters on PC: a higher monitor refresh rate means the GPU is held back less, keeping frame rates high and screen tearing low.


FPS in things like gaming benchmarks is an average. Most scenes in games can be rendered by the GPU without much difficulty. It's the corner cases, i.e. when a whole scene needs to be redrawn, that very quickly pull the instantaneous (or near-instantaneous) fps down to the point where there is visible lag.

 

This is why gamers will usually aim for a dramatically higher average fps than is strictly necessary: they know that if the average is much higher in benchmarks, the minimum fps is apt to be much higher too, and the hardware will slice through those scene changes like butter.

 

If a hypothetical video card could guarantee that fps would *never*, even on an instantaneous basis, drop below 30 fps, then nobody would care whether their card benchmarks at 45 fps, 60 fps, 120 fps, etc. However, that's not the nature of hardware.
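A minimal sketch of why the average hides those drops (all numbers invented for illustration):

# 100 frames: mostly smooth 16 ms frames, plus a handful of heavy ones during a scene change.
frame_times_ms = [16] * 95 + [60, 70, 80, 90, 100]

avg_frame_time = sum(frame_times_ms) / len(frame_times_ms)
average_fps = 1000 / avg_frame_time

worst_frame = max(frame_times_ms)
minimum_fps = 1000 / worst_frame          # instantaneous fps during the worst frame

print(f"average fps: {average_fps:.0f}")  # ~52 fps, looks perfectly healthy
print(f"minimum fps: {minimum_fps:.0f}")  # ~10 fps, and this is the part you actually feel

A benchmark that only reports the 52 fps average says nothing about the 100 ms hitch buried inside it, which is why minimum and 1% low figures matter.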


10 minutes ago, Wander Away said:

The thing is, a movie can get away with 24 fps because it has a consistent frame time between frames. That way our brain doesn't have to work as hard to "fill in the blanks" between them.

 

However, the nature of a game means you cannot fully control the frame time, since each frame has to be rendered in real time. For example, the average frame time at 30 fps is about 33 ms, but the actual time to render any given frame can range from 5 to 50 milliseconds, which produces the "stuttering" we observe.

 

At 60 fps, the average frame time drops to about 17 ms, which makes the variance in frame time less noticeable, and on higher-refresh-rate panels it becomes essentially unnoticeable.

 

7 minutes ago, Technous285 said:

24 FPS for cinema is the result of balancing "smooth enough" motion (helped by motion blur), "don't blow too much money on film", and "sync well enough with audio" during the early days of the "talkies", and it has stuck ever since.

 

25 FPS and 30 FPS are the result of PAL (25) and NTSC (30) being based on the AC power systems of Europe (220-240 V @ 50 Hz) and North America (110-120 V @ 60 Hz) respectively, because of frame interlacing for TV (the odd lines of a frame are drawn on one half of the power cycle, the even lines on the other).

 

60 FPS is considered the baseline for 'smooth' in computer gaming, an evolution of the 30 fps / 60 Hz factor from TV standards. 90+ fps is considered the minimum for realistic animation with minimal discomfort in VR, because of the disconnect between what your eyes see and what your body feels in a VR simulation. A nice side effect is that higher frame rates look smoother and more realistic with minimal motion blur (the trick that makes the eye perceive smooth motion at 24/25/30 fps when there isn't any), which is why 120 Hz and 144 Hz monitors are considered "must have" by competitive gamers, particularly for shooters on PC: a higher monitor refresh rate means the GPU is held back less, keeping frame rates high and screen tearing low.

 

4 minutes ago, Mark77 said:

FPS in things like gaming benchmarks is an average. Most scenes in games can be rendered by the GPU without much difficulty. It's the corner cases, i.e. when a whole scene needs to be redrawn, that very quickly pull the instantaneous (or near-instantaneous) fps down to the point where there is visible lag.

 

This is why gamers will usually aim for a dramatically higher average fps than is strictly necessary: they know that if the average is much higher in benchmarks, the minimum fps is apt to be much higher too, and the hardware will slice through those scene changes like butter.

 

If a hypothetical video card could guarantee that fps would *never*, even on an instantaneous basis, drop below 30 fps, then nobody would care whether their card benchmarks at 45 fps, 60 fps, 120 fps, etc. However, that's not the nature of hardware.

So if I understand correctly, are you saying that when we get movies for VR headsets, it won't really matter whether they are shot at 60 or 30 FPS?

I know that Hz is a unit of frequency, so can you explain what it has to do with power systems?

What about animated movies? I think the frame rate is consistent there, so does that mean it doesn't matter whether the movie is played at 30 or 60 FPS?

Finally, what's the maximum framerate that the average human eye can perceive?


Movies can get away with 24 fps because they have natural motion blur. Games look better with more fps because they don't have motion blur down pat.

i5 6600k and GTX 1070 but I play 1600-900. 1440p BABY!

Still, don't put too much faith in my buying decisions. xD 


Power cycling at 60 Hz means the wave is moving up and down 60 times per second. If you're rendering complete images at a rate of one image per cycle (e.g. 720p, 1080p), you get 60 fps from 60 Hz.

If, however, you render alternating portions of an image on each half of the cycle (say, odd-numbered lines on an up cycle and even-numbered lines on a down cycle), you get 60 partial images per second that have to be interlaced into 30 full images per second from that same 60 Hz AC power cycle (e.g. 720i, 1080i).

 

If you've ever paused an old show on modern hardware and noticed alternating lines from two images that are close but not quite the same, that's because the old show was filmed and aired interlaced, while modern displays generate frames progressively.
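Here's a minimal sketch of that weaving step, using a toy four-line "image" (everything here is made up purely to illustrate interlacing):

# Two fields captured ~1/60 s apart; each holds only half of the scanlines.
# Weaving them together yields one full frame, so 60 fields/s become 30 frames/s.
def weave(odd_field, even_field):
    frame = []
    for odd_line, even_line in zip(odd_field, even_field):
        frame.append(odd_line)    # lines 1, 3, 5, ...
        frame.append(even_line)   # lines 2, 4, 6, ...
    return frame

odd_field  = ["line 1 @ t = 0 ms", "line 3 @ t = 0 ms"]
even_field = ["line 2 @ t = 16 ms", "line 4 @ t = 16 ms"]

for line in weave(odd_field, even_field):
    print(line)

# If the scene moved between t = 0 and t = 16 ms, the woven frame mixes two moments in
# time, which is exactly the combing you see when pausing old interlaced footage.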

 

~~~~~

 

As for "maximum" framerate the human eye can see - it's basically Infinite  as there's no "framerate" to reality, though various forms of testing have shown the eye and brain can recognise an image shown for 1/200-1/220th of a second (basically 4-5/1000th's of a second), and that we can react to something before "seeing" it and recognising it (eg: reacting to the presence of a snake before consciously noticing it).


9 minutes ago, IAmAndre said:

what's the maximum framerate that the average human eye can perceive?

We do not know the answer to that question.

"It pays to keep an open mind, but not so open your brain falls out." - Carl Sagan.

"I can explain it to you, but I can't understand it for you" - Edward I. Koch


9 minutes ago, YedZed said:

Movies can get away with 24 fps because they have natural motion blur. Games look better with more fps because they don't have motion blur down pat.

Videos can get away with a low frame rate as well because there's no input lag, which is another major aspect: nothing has to respond to your inputs when you're watching a movie or a YouTube video.

"It pays to keep an open mind, but not so open your brain falls out." - Carl Sagan.

"I can explain it to you, but I can't understand it for you" - Edward I. Koch


Just now, Godlygamer23 said:

Videos can get away with a low frame rate as well because there's no input lag.

I don't remember Linus mentioning that.

i5 6600k and GTX 1070 but I play 1600-900. 1440p BABY!

Still, don't put too much faith in my buying decisions. xD 


13 minutes ago, IAmAndre said:

 

 

So if I understand correctly, are you saying that when we get movies for VR headsets, it won't really matter whether they are shot at 60 or 30 FPS?

I know that Hz is a unit of frequency, so can you explain what it has to do with power systems?

What about animated movies? I think the frame rate is consistent there, so does that mean it doesn't matter whether the movie is played at 30 or 60 FPS?

Finally, what's the maximum framerate that the average human eye can perceive?

 
 

 

For VR, the environment has to run at a higher frame rate, because you need butter-smooth images to avoid motion sickness. A movie can sit still on a virtual screen, but that virtual screen moves relative to your head orientation, so it has to be redrawn at a high frame rate. Imagine watching a movie in a virtual theatre and moving your head: the movie itself is still playing at 24 fps, but the screen is moving relative to your eyes, and that motion has to be as smooth as possible, because our brains can't "fill in the blanks" nearly as well when our sense of physical motion (the inner ear) is involved.
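A rough sketch of that split between headset refresh and movie frame rate (the 90 Hz figure and everything else here are assumptions for illustration, not a real VR API):

# The headset redraws the scene every headset frame (here 90 Hz), using the latest head
# pose, even though the movie texture on the virtual screen only changes 24 times a second.
HEADSET_HZ = 90
MOVIE_FPS = 24

for frame in range(HEADSET_HZ):               # one second of headset frames
    t = frame / HEADSET_HZ
    movie_frame = int(t * MOVIE_FPS)          # which movie frame is currently on the screen
    # The head pose would come from the headset's tracking hardware; re-rendering with the
    # freshest pose every ~11 ms is what keeps the virtual screen from juddering.
    print(f"t = {t * 1000:5.1f} ms: render scene with latest head pose, movie frame #{movie_frame}")

So even a 24 fps movie demands 90 fps rendering in VR, because the thing being animated is not just the film, it's the whole theatre around your head.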

 

Animated movies are the same as any other movie: it doesn't matter whether the frames come from graphics artists or from film clips, as long as the frame time is consistent. Since it's all pre-rendered rather than rendered in real time like a game (just ask about the hundreds of hours of render time that go into animation), the motion will look smooth.

 

Personally, I can see the difference between a 60 Hz and a 100 Hz screen (I have a 144 Hz monitor), but above 100 Hz I can't really tell much of a difference. This is sure to vary from person to person, just as some people are more prone to motion sickness than others.

Me: Computer Engineer. Geek. Nerd.

[Educational] Computer Architecture: Computer Memory Hierarchy

[Educational] Computer Architecture:  What is SSE/AVX? (SIMD)


1 minute ago, YedZed said:

I don't remember Linus mentioning that.

It doesn't matter. He doesn't need to mention it in order for it to be true - he doesn't know everything out there.

 

If there's no input, there's no delay to perceive, because you are not in control of what's on the screen. I suggest you watch the video below from TotalBiscuit.

 

"It pays to keep an open mind, but not so open your brain falls out." - Carl Sagan.

"I can explain it to you, but I can't understand it for you" - Edward I. Koch


17 minutes ago, Technous285 said:

Power cycling at 60 Hz means the wave is moving up and down 60 times per second. If you're rendering complete images at a rate of one image per cycle (e.g. 720p, 1080p), you get 60 fps from 60 Hz.

If, however, you render alternating portions of an image on each half of the cycle (say, odd-numbered lines on an up cycle and even-numbered lines on a down cycle), you get 60 partial images per second that have to be interlaced into 30 full images per second from that same 60 Hz AC power cycle (e.g. 720i, 1080i).

 

If you've ever paused an old show on modern hardware and noticed alternating lines from two images that are close but not quite the same, that's because the old show was filmed and aired interlaced, while modern displays generate frames progressively.

Interesting, but tbh I lost you at the partial image thing. Can you elaborate on that please?


18 minutes ago, IAmAndre said:

Interesting, but tbh I lost you at the partial image thing. Can you elaborate on that please?

Draw a wave pattern with 30 peaks and 30 valleys.

Take an image that's, say, 6 inches tall and 8 inches wide.

Carefully cut it into 60 picture slices that are 0.1 inches tall and 8 inches wide, then lay them out on one side of the table so they still make up the original image.

Now follow the path of the wave pattern: for each peak and valley you come across, take the topmost remaining slice from where you laid them out and move it to the other side of the table, until you've recreated the original image over there.

 

That's basically how the earliest forms of TV transmitted images. Later on, as technology improved, groups of those image slices could be sent on each peak and valley of the power cycle (the wave pattern): the odd-numbered slices in one group on a peak, then the even-numbered slices in the next group on a valley. The two groups together form one frame, for a total of 30 interlaced frames each second.


42 minutes ago, IAmAndre said:

So if I understand correctly, are you saying that when we get movies for VR headsets, it won't really matter whether they are shot at 60 or 30 FPS?

I know that Hz is a unit of frequency, so can you explain what it has to do with power systems?

What about animated movies? I think the frame rate is consistent there, so does that mean it doesn't matter whether the movie is played at 30 or 60 FPS?

Finally, what's the maximum framerate that the average human eye can perceive?

The issue with VR headsets and frame rates, as some people describe it, is that if the motion in the video is not smooth enough, they get disoriented and motion sick. However, some of these people also claim that anything below 60 FPS makes them motion sick (which makes me wonder how the heck they watched anything growing up). And considering the "gold standard" for VR apparently comes from the Oculus Rift, whose founder is a huge subscriber to the "PC gaming master race" idea, I kind of don't take what he says seriously.

 

As for the maximum frame rate a human eye can perceive, you have to take into account how the mind processes things. For one, humans process things continuously, not discretely like digital video. The brain also discards a lot of information that isn't "important". For example, ever notice that when you're in a crowd talking with someone, you can still more or less carry on a conversation? That's because your brain is filtering all of the other noise out of your conscious processing.

 

Anyway, studies on air force pilots have shown that the fastest they were able to meaningfully perceive something (they were briefly shown images of airplanes) is about 1/220th of a second. Does this imply the maximum frame rate a person can see is 220 FPS? No, not really. Maybe we can perceive more in subtle ways. Maybe you can see a flash of light that's only 1/500th of a second long, so maybe 500 FPS? Who knows? And everyone's different, too. Going back to the VR issue, I was able to experience Google Cardboard on my phone just fine without much issue, and my phone was maybe pushing 30-40 FPS. Apparently some people will hurl chunks if they see VR that way.


5 hours ago, IAmAndre said:

So if I understand correctly, are you saying that when we get movies for VR headsets, it won't really matter whether they are shot at 60 or 30 FPS?

I know that Hz is a unit of frequency, so can you explain what it has to do with power systems?

What about animated movies? I think the frame rate is consistent there, so does that mean it doesn't matter whether the movie is played at 30 or 60 FPS?

Finally, what's the maximum framerate that the average human eye can perceive?

 

Yeah, for a movie, which is just streamed playback from disk rather than real-time rendering, there isn't likely to be much human-perceptible difference between 30 fps and 60 fps. My comments were about the scenario, i.e. gaming, where there is real-time rendering. In that case, a 60 fps 'average' can easily be dragged down to an unacceptable 10-20 fps during major scene changes.

 

60 Hz is the sinusoidal frequency of electricity in North America; 50 Hz is used elsewhere in the world. Hz is literally just 'cycles per second', or refreshes per second.

 

As for the human eye, some people are more sensitive to higher rates than others. Some of it depends on a person's state of concentration. Some people with epilepsy, for instance, have a higher perceptual capability than others.


http://accidentalscientist.com/2014/12/why-movies-look-weird-at-48fps-and-games-are-better-at-60fps-and-the-uncanny-valley.html

 

I also ran into this handy-dandy article. The gist is that you can get away with lower FPS if you employ temporal antialiasing (e.g. motion blur) and "grainy noise", because for some reason adding a bit of noise actually improves how quickly you can resolve details. It also says the rods and cones in the eyes "wiggle" to get super-resolution imaging, and this wiggling happens on average at about 83 Hz. So roughly 42 FPS is the minimum if you really want to start taking advantage of what our eyes can get out of an image, assuming you don't add any other sort of effect to the frames (like temporal antialiasing).
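A very rough sketch of the frame-blending idea behind that kind of motion blur (tiny lists stand in for pixel rows; real temporal antialiasing is considerably more involved than this):

# Naive motion blur: average each frame with the previous one.
def blend(previous, current, weight=0.5):
    # Weighted average of two frames; a higher weight favours the current frame.
    return [round((1 - weight) * p + weight * c, 1) for p, c in zip(previous, current)]

frames = [
    [0, 0, 255, 0, 0],   # a bright object at position 2
    [0, 0, 0, 255, 0],   # ...moving one pixel to the right...
    [0, 0, 0, 0, 255],   # ...and one more
]

blurred = [frames[0]]
for prev, cur in zip(frames, frames[1:]):
    blurred.append(blend(prev, cur))

for frame in blurred:
    print(frame)   # the object leaves a faint trail, which reads as smoother motion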

 

The guy who did 8088 Corruption demonstrated something else: since humans are pattern-matching organisms, our eyes prefer more samples over more detail. In other words, things become more meaningful to us when we get more samples, even if the detail is garbage.

 

So basically it boils down to this: more FPS is better, but tolerance for any given FPS range varies from person to person.


The short answer: because games are interactive.

Intel 4670K /w TT water 2.0 performer, GTX 1070FE, Gigabyte Z87X-DH3, Corsair HX750, 16GB Mushkin 1333mhz, Fractal R4 Windowed, Varmilo mint TKL, Logitech m310, HP Pavilion 23bw, Logitech 2.1 Speakers


18 hours ago, IAmAndre said:

Hi,

 

I know it's a question that has been tackled quite a few times by Linus, but I'm still confused as to why 24/30 FPS is fine for movies, yet some people want up to 144 FPS when gaming. I've been watching both gaming and real-world videos on YouTube at 60 FPS, and I can tell there's a difference between 30 and 60 FPS, but as far as I can see it's not something that would make me upgrade my GPU.

So to summarize, I know it has something to do with "real" light, but I'm looking for a complete explanation: why do we "need" over 30 FPS for CGI environments?

 

Thanks

I watched The Hobbit in the cinema at 48 fps ("HFR"), and it was so freaking smooth!

It's also because we are used to watching movies at that frame rate. With video games you can kind of feel the lower fps too: if you play at 60, you'll definitely feel it when it drops below 60.

                                                                                                                                                                                                                                                   Sample Text ( ͡° ͜ʖ ͡°)


30 fps is much less responsive than 60 fps. What I mean is that at 30 fps your movement is only sampled 30 times per second, while at 60 fps it's sampled twice as often. This just means every move you make is reflected in the game much quicker at 60 fps than at 30 fps.

A movie is just something you watch and don't interact with, and that's the main reason people want 60 fps (or higher) in games but don't miss it in movies.
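A tiny sketch of that sampling argument, with made-up numbers. Assuming input is only read once per rendered frame, the worst-case wait before your action even starts to show up scales directly with frame time:

# Simplified model: an input that lands just after a poll waits one full frame to be read,
# then at least one more frame to be rendered and displayed.
def worst_case_input_delay_ms(fps):
    frame_time = 1000 / fps
    return frame_time * 2

for fps in (30, 60, 144):
    print(f"{fps:>3} fps -> up to ~{worst_case_input_delay_ms(fps):.1f} ms before you see your input on screen")

Real pipelines add more stages (driver queues, display latency), but the proportionality is the point: doubling the frame rate halves this part of the delay.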

"We're all in this together, might as well be friends" Tom, Toonami.

 

mini eLiXiVy: my open source 65% mechanical PCB, a build log, PCB anatomy and discussing open source licenses: https://linustechtips.com/topic/1366493-elixivy-a-65-mechanical-keyboard-build-log-pcb-anatomy-and-how-i-open-sourced-this-project/

 

mini_cardboard: a 4% keyboard build log and how keyboards work: https://linustechtips.com/topic/1328547-mini_cardboard-a-4-keyboard-build-log-and-how-keyboards-work/

