
Gemini Man to be shown in 120 fps



16 hours ago, mr moose said:

Remember all those arguments over FPS on the forums? Basically, I keep posting evidence that the human eye and visual processing system can't identify more than about 76 FPS. Because computer games are interactive, with extra feedback and processing that changes frame rates, there are other mechanisms by which we can identify a better-performing system (i.e. one with faster rates); a movie, however, has no feedback or lag potential. After 80 FPS there is no advantage to perception.

So you're telling me that if there's one white frame and all the others are black in a 77 FPS video, we won't notice anything?

MSI GX660 + i7 920XM @ 2.8GHz + GTX 970M + Samsung SSD 830 256GB


Am I surprised that the people who say anything under 120 fps is "literally unplayable" are also defending this?

 

I am fine with the current 24 fps in movies. Motion blur is needed because it's real; we experience it in real life.


17 minutes ago, Neftex said:

So you're telling me that if there's only one white frame in a 77 FPS video, we won't be able to tell something was off?

That's not a fair comparison. Our eyes are not like shutters; they're more like buckets trying to process as fast as possible. Inserting a single white frame would be noticeable no matter the frame rate. You wouldn't be able to tell what it was, but something would seem off.

 

Take an 80 fps video, put it side by side with a 120 fps one, and you will not be able to perceive the difference. Throwing a stark white image into the mix is not the same thing.


38 minutes ago, Nowak said:

For people saying motion blur is "unrealistic":

Put your hand in front of your eyes, then wave it around.

"Motion blur" in real life only happens with relative motion compared to your perspective. When you add a random-ass motion-blur effect you are forcing a certain focus point on your viewer, and if they don't focus on that specific relative motion (assuming you even did it properly, which most games don't), it's extremely disorienting and pointlessly nauseating.

 

If you, as a real human being, focus on your hand (up to the limit at which your eyes can scan, in degrees per second), it doesn't blur. Of course, eyes can't actually scan that fast while processing data, so it's extremely easy to move something at eye level with an angular velocity massively in excess of your capacity to keep it in focus.

 

That's the difference between trees blurring as you fly past them and watching one actively scan across your vision.

 

It is blatantly incorrect from a physical standpoint to arbitrarily blur everything that moves regardless of the focus.

 

Again, this is even more relevant in 3D projection, where you shouldn't even infer ANY specific perspective for your viewer (either in what constitutes relative motion or in the focal plane) and should instead show everything in perfect focus. The viewer's sight will already do all of the narrowing and motion-related effects automatically (by choosing to focus on something themselves), so there won't be a disconnect between what their brain expects from the motion and what the video is showing.

LINK-> Kurald Galain:  The Night Eternal 

Top 5820k, 980ti SLI Build in the World*

CPU: i7-5820k // GPU: SLI MSI 980ti Gaming 6G // Cooling: Full Custom WC //  Mobo: ASUS X99 Sabertooth // Ram: 32GB Crucial Ballistic Sport // Boot SSD: Samsung 850 EVO 500GB

Mass SSD: Crucial M500 960GB  // PSU: EVGA Supernova 850G2 // Case: Fractal Design Define S Windowed // OS: Windows 10 // Mouse: Razer Naga Chroma // Keyboard: Corsair k70 Cherry MX Reds

Headset: Senn RS185 // Monitor: ASUS PG348Q // Devices: Note 10+ - Surface Book 2 15"

LINK-> Ainulindale: Music of the Ainur 

Prosumer DIY FreeNAS

CPU: Xeon E3-1231v3  // Cooling: Noctua L9x65 //  Mobo: AsRock E3C224D2I // Ram: 16GB Kingston ECC DDR3-1333

HDDs: 4x HGST Deskstar NAS 3TB  // PSU: EVGA 650GQ // Case: Fractal Design Node 304 // OS: FreeNAS

 

 

 


8 minutes ago, mynameisjuan said:

That's not a fair comparison. Our eyes are not like shutters; they're more like buckets trying to process as fast as possible. Inserting a single white frame would be noticeable no matter the frame rate. You wouldn't be able to tell what it was, but something would seem off.

 

Take an 80 fps video, put it side by side with a 120 fps one, and you will not be able to perceive the difference. Throwing a stark white image into the mix is not the same thing.

It is fair. If you can notice that, it means the human eye and brain can process stuff at a higher fps.

MSI GX660 + i7 920XM @ 2.8GHz + GTX 970M + Samsung SSD 830 256GB


Just now, Neftex said:

It is fair. If you can notice that, it means the human eye and brain can process stuff at a higher fps.

No, it doesn't. Your eyes/brain tend to average out incoming light. The one white frame would make the picture feel slightly brighter; it wouldn't stand out. This is how everyday LEDs work: LED strips, taillights, and clocks all rely on this. LEDs don't look brighter because more power is applied continuously; with PWM dimming they look brighter because the duty cycle (the fraction of time they're on) is increased.

 

It's still up in the air what the brain can perceive at high frame rates, as it's good at piecing together information and even making up its own. Like Moose is saying, it's the extra physical input that helps the brain perceive these higher frame rates. We see motion blur on pretty slow-moving objects in real life.
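As a rough numeric sketch of the averaging idea above (my own toy example, not from the post): if perception simply averaged luminance over a run of frames, a single white frame among 77 black ones would only nudge the mean brightness up by about 1/78, the same way raising a PWM duty cycle nudges an LED's apparent brightness.

```python
# Toy sketch (assumption: perception ~ plain average of frame luminance, 0 = black, 1 = white).
frames = [0.0] * 77 + [1.0]              # 77 black frames plus one white frame
avg = sum(frames) / len(frames)
print(f"average luminance: {avg:.4f}")   # ~0.0128, i.e. roughly 1.3% brighter overall

# The same averaging applied to PWM dimming: perceived brightness tracks duty cycle,
# not switching frequency.
for duty in (0.25, 0.50, 0.75, 1.00):
    print(f"duty cycle {duty:.2f} -> ~{duty:.0%} of full brightness")
```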


17 hours ago, LukeSavenije said:

We finally get high refresh rates in movies... where is this going?

They already tried 48 fps with The Hobbit, and people didn't buy it. Granted, the movies weren't very good.

 

Honestly, 120fps doesn't make any sense to me - it's just wasted storage or bandwidth on a movie. Just give us 48 or 60.

Don't ask to ask, just ask... please 🤨

sudo chmod -R 000 /*


2 hours ago, Neftex said:

So you're telling me that if there's one white frame and all the others are black in a 77 FPS video, we won't notice anything?

No. Each receptor in the eye triggers when a photon hits it and stays triggered for a few hundredths of a second, which means that for a small fraction of time that receptor cannot be triggered again. This is called persistence of vision, and it is why, when you close or cover your eyes, the last thing you saw remains as an outline. The image lasts longer the brighter it was, or the darker you can make it after closing your eyes. If you had 77 white frames and one black frame, you would not perceive it; if you had 77 black frames and one white frame, you would be able to perceive it. The key point is that this says nothing about the ability to perceive a difference in frame rates when every frame is of similar light intensity. The test only demonstrates persistence of vision.
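A toy model of that persistence argument (my own sketch; the 50 ms time constant, the leaky-integrator "receptor", and the use of relative change as a visibility proxy are all assumptions, not from the post):

```python
import math

# Leaky-integrator "receptor": the response decays with time constant TAU between frames,
# and visibility is judged by the relative change against the surrounding level.
FPS = 77.0
DT = 1.0 / FPS               # one frame lasts ~13 ms
TAU = 0.05                   # "a few hundredths of a second" of persistence, assumed 50 ms
DECAY = math.exp(-DT / TAU)  # fraction of the response carried over across one frame
DARK_FLOOR = 1e-3            # tiny baseline so a relative change in darkness stays finite

def responses(frames):
    """Step the integrator through a luminance sequence (0 = black, 1 = white)."""
    out, r = [], frames[0]
    for lum in frames:
        r = r * DECAY + lum * (1.0 - DECAY)
        out.append(r)
    return out

black_in_white = [1.0] * 38 + [0.0] + [1.0] * 38   # one black frame on a bright field
white_in_black = [0.0] * 38 + [1.0] + [0.0] * 38   # one white frame on a dark field

dip  = 1.0 - min(responses(black_in_white))         # ~23% dip relative to the bright field
jump = max(responses(white_in_black)) / DARK_FLOOR  # ~230x the assumed dark baseline
print(f"black frame in a white video: ~{dip:.0%} relative dip")
print(f"white frame in a black video: ~{jump:.0f}x jump over the dark baseline")
```

In this crude model the lone white frame produces an enormous relative jump against darkness, while the lone black frame only dents the bright field slightly, which is the asymmetry described above.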

 

 

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


I'm all for having higher-fps movies...

 

...not because of the movies themselves, but because it may mean we'll get affordable high refresh rate TVs somewhere down the line, and those will be awesome for gaming. 

Ryzen 1600x @4GHz

Asus GTX 1070 8GB @1900MHz

16 GB HyperX DDR4 @3000MHz

Asus Prime X370 Pro

Samsung 860 EVO 500GB

Noctua NH-U14S

Seasonic M12II 620W

+ four different mechanical drives.


30 minutes ago, Giganthrax said:

I'm all for having higher-fps movies...

 

...not because of the movies themselves, but because it may mean we'll get affordable high refresh rate TVs somewhere down the line, and those will be awesome for gaming. 

Samsung's 2018 4K lineup has true 1080p at 120 Hz over HDMI 2.0, and it's only ~$700 for the 55" model.


4 minutes ago, mynameisjuan said:

Samsung's 2018 4K lineup has true 1080p at 120 Hz over HDMI 2.0, and it's only ~$700 for the 55" model.

That would probably be priced at least $1000 in Europe. Way too much, man. I was thinking more along the lines of budget offerings. :D

Ryzen 1600x @4GHz

Asus GTX 1070 8GB @1900MHz

16 GB HyperX DDR4 @3000MHz

Asus Prime X370 Pro

Samsung 860 EVO 500GB

Noctua NH-U14S

Seasonic M12II 620W

+ four different mechanical drives.


Just now, Giganthrax said:

That would probably be priced at least $1000 in Europe. Way too much, man. I was thinking more along the lines of budget offerings. :D

The point is it's already a thing and trickling down. Sony and Vizio are already doing it as well. Movies will not help this much.


2 hours ago, Neftex said:

It is fair. If you can notice that, it means the human eye and brain can process stuff at a higher fps.

I would like to see a blind test where someone has been able to perceive the difference using only their eyes.

 

It has been shown that, in very specific light/dark cases with sharp edges, the human eye can identify flicker at an average of 500 Hz (this is not the same as FPS), but the maximum actual perception of FPS observed through any testing, and through study of the visual processing system, is no more than about 90 Hz.

 

https://www.nature.com/articles/srep07861

 

This is why you can tell the difference between 60 fps and 120 fps, but without some sort of biofeedback (computer input) there have been no established cases of people perceiving more than 90 fps.

 

Quote

These studies have included both stabilized and unstabilized retinal images, and report the maximum observable rate as 50–90 Hz.

and

Quote

The critical flicker fusion rate is defined as the rate at which human perception cannot distinguish modulated light from a stable field. This rate varies with intensity and contrast, with the fastest variation in luminance one can detect at 50–90 Hz

 

 

There is a lot more I wanted to say, but to keep it short: maybe in the future they will find that we can detect higher frame rates more precisely under certain conditions; the article linked above suggests being able to encode three images in one video stream, allowing for simultaneous 3D and 2D. For now, though, the currently perceived limit is about 90 Hz for traditional video.
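For scale, here is a quick unit conversion on the figures quoted above (nothing beyond arithmetic): the per-frame duration at common rates, against the 50–90 Hz fusion band and the ~500 Hz edge-case flicker figure.

```python
# Frame/cycle duration at common rates, compared with the quoted 50-90 Hz band
# (roughly 11-20 ms per cycle) and the ~500 Hz edge-case flicker figure (2 ms).
FUSION_BAND_HZ = (50, 90)

for hz in (24, 48, 60, 80, 90, 120, 500):
    ms = 1000.0 / hz
    where = "above" if hz > FUSION_BAND_HZ[1] else "at or below"
    print(f"{hz:>3} Hz -> {ms:5.1f} ms per frame/cycle ({where} the 50-90 Hz band)")
```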

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


A lot of shit about the human eye and ### FPS, blah blah blah... what about outside, where the light from the sun is continuous and frame perception is near-infinite FPS? This crap needs to stop.


1 hour ago, Ryujin2003 said:

A lot of shit about the human eye and ### FPS, blah blah blah... what about outside, where the light from the sun is continuous and frame perception is near-infinite FPS? This crap needs to stop.

Because it's not infinite.   The crap that needs to stop is this constant BS that there is no limit to the eye.  A constant stream of photons going in does not mean that each receptor can react fast enough to perceive every single photon. 

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


17 hours ago, WereCatf said:

The direction I approve of. Fuck low-FPS movies in the ear. With a cactus. Sideways.

I don't approve of that.

I only see your reply if you @ me.

This reply/comment was generated by AI.


2 hours ago, Origami Cactus said:

I don't approve of that.

Well, I obviously didn't mean you! Some other, less-likeable cactus! Maybe that weird Uncle Joe who's always touching sunflowers inappropriately?

Hand, n. A singular instrument worn at the end of the human arm and commonly thrust into somebody’s pocket.


2 minutes ago, WereCatf said:

Well, I obviously didn't mean you! Some other, less-likeable cactus! Maybe that weird Uncle Joe who's always touching sunflowers inappropriately?

fbi joined the chat


6 hours ago, mr moose said:

Because it's not infinite.   The crap that needs to stop is this constant BS that there is no limit to the eye.  A constant stream of photons going in does not mean that each receptor can react fast enough to perceive every single photon. 

As a corollary to your previous post: one of the reasons traditional gaming setups have occasionally been able to show benefits up to or in excess of 120 Hz monitors is that, as you mentioned, the human brain and ocular system can notice things like flicker or unstable motion far more quickly than they can derive any additional content data. So in the era before adaptive-sync technologies, higher refresh rates meant that frame hitching was a smaller average temporal change every time it occurred, and, with more granular control, was less likely to occur at all, because a high refresh rate is evenly divisible by more of the common frame rates (for example, 120 Hz can show 24 fps without frame-pacing issues, while 60 Hz cannot).
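A small sketch of that divisibility point (my simplification: whole refresh cycles only, no interpolation; real 3:2 pulldown alternates the long and short frames rather than grouping them):

```python
# How many display refreshes each source frame occupies over one second when a
# frame rate is shown on a fixed-refresh screen without interpolation.

def cadence(source_fps: int, refresh_hz: int) -> list:
    base, extra = divmod(refresh_hz, source_fps)
    # 'extra' of the frames need one additional refresh -> uneven pacing (judder) if extra > 0
    return [base + 1] * extra + [base] * (source_fps - extra)

for hz in (60, 120):
    pattern = cadence(24, hz)
    counts = sorted(set(pattern), reverse=True)
    verdict = "even pacing" if len(counts) == 1 else "uneven pacing (judder)"
    print(f"24 fps on {hz:>3} Hz: frames held for {counts} refreshes -> {verdict}")
```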

 

This also helps explain why some people (myself and Linus included) have noted that adaptive-sync technologies lower the minimum acceptable framerate: 45-60 fps is perfectly fine (nice, even) for almost every game with G-Sync enabled.

 

In a movie setting, your frames shouldn't have any pacing issues to begin with, so one of the single biggest advantages of exceptionally high fps is eliminated.

 

(As you mentioned, PWM flicker is a good example of how, even above 500 Hz, the human ocular system is sometimes capable of confusion or discomfort, even if it isn't capable of recognition or distinction.)

LINK-> Kurald Galain:  The Night Eternal 

Top 5820k, 980ti SLI Build in the World*

CPU: i7-5820k // GPU: SLI MSI 980ti Gaming 6G // Cooling: Full Custom WC //  Mobo: ASUS X99 Sabertooth // Ram: 32GB Crucial Ballistic Sport // Boot SSD: Samsung 850 EVO 500GB

Mass SSD: Crucial M500 960GB  // PSU: EVGA Supernova 850G2 // Case: Fractal Design Define S Windowed // OS: Windows 10 // Mouse: Razer Naga Chroma // Keyboard: Corsair k70 Cherry MX Reds

Headset: Senn RS185 // Monitor: ASUS PG348Q // Devices: Note 10+ - Surface Book 2 15"

LINK-> Ainulindale: Music of the Ainur 

Prosumer DIY FreeNAS

CPU: Xeon E3-1231v3  // Cooling: Noctua L9x65 //  Mobo: AsRock E3C224D2I // Ram: 16GB Kingston ECC DDR3-1333

HDDs: 4x HGST Deskstar NAS 3TB  // PSU: EVGA 650GQ // Case: Fractal Design Node 304 // OS: FreeNAS

 

 

 


On 3/15/2019 at 10:08 AM, williamcll said:

A projector like that must be expensive

Just overclock it and watercool it; problem solved.

 

Anyway, I found out using motion interpolation with AviSynth and PotPlayer that movies start looking like shit above 35 fps; that's the setting I use for motion interpolation, 24 fps converted to 35 max. Of course, the movies already have motion blur and other effects, so interpolation looks weird compared to a movie natively recorded at 60 fps, but no one I know likes the soap-opera-fluid movies on those smart TVs that have interpolation enabled by default.

 

It's a blessing for anime, though; I play all anime at 60 fps, and if I had my 4K TV with 120 Hz support I would definitely play 120 fps interpolation.

But outside the cinema this doesn't make sense: the file size of 4K 120 Hz 10-bit+ HDR would be insane. Unstreamable, and no physical disc could hold a full movie with those settings.
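Some back-of-envelope numbers for that file-size point (the raw figure is straight arithmetic; the 200 Mbit/s compressed bitrate is purely an assumption for illustration, not from any real spec):

```python
# Rough data-rate arithmetic for 4K, 120 fps, 10-bit, 3-channel video over a 2-hour runtime.
width, height = 3840, 2160
fps, bit_depth, channels = 120, 10, 3
runtime_s = 2 * 60 * 60

raw_bits_per_s = width * height * fps * bit_depth * channels
raw_movie_tb = raw_bits_per_s * runtime_s / 8 / 1e12
print(f"uncompressed: {raw_bits_per_s / 1e9:.1f} Gbit/s, ~{raw_movie_tb:.0f} TB per movie")

assumed_bitrate_mbps = 200   # hypothetical HEVC-class figure, chosen only for illustration
compressed_gb = assumed_bitrate_mbps * 1e6 * runtime_s / 8 / 1e9
print(f"at an assumed {assumed_bitrate_mbps} Mbit/s: ~{compressed_gb:.0f} GB per movie")
```

Even that hedged compressed figure (~180 GB) would overflow a 100 GB triple-layer Blu-ray disc, which is in line with the concern above.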

