LukeSavenije

Gemini Man to be shown in 120 fps

Recommended Posts

36 minutes ago, WereCatf said:

The direction I approve of. Fuck low-FPS movies in the ear. With a cactus. Sideways.

George Lucas remasters the Kylo and Rey duet fight scene in 1 to 7 fps style


So the next Ubi games will run at a cinematic 120 fps?


One day I will be able to play Monster Hunter Frontier in French/Italian/English on my PC, it's just a matter of time... 4 5 6 7 8 9 years later: It's finally coming!!!

Phones: iPhone 4S/SE | LG V10 | Lumia 920

Laptops: Macbook Pro 15" (mid-2012) | Compaq Presario V6000


Honestly I only like high refresh rates for games... 120 FPS for a movie? Not sure I'd like that. BUT, not gonna say that for sure because I obviously haven't seen one.


If you haven't held a SNES controller before, you don't know gaming.

well okay that's not entirely true but still go play a fucking SNES if you haven't.

 

 

Current Rig

FX 8320 @ 4.5GHz - EVGA GTX 770 SC 2GB - 16GB HyperX Fury - 250GB SSD - CM 212 EVO - Asus M5A97 R2.0 - EVGA 650W 80+ Gold G2 - CM Elite 430 Black - WD Blue 1TB

The $5 Laptop (Asus R510C) (+$60 worth of upgrades)

Core i5-3337U 1.8GHz (2.5GHz) - 120GB SSD - 320GB WD Black - 8GB DDR3 - Intel HD Graphics 4000

Secondary Programming Machine

i7 870 @ 4.0GHz - GTX 670 2GB - 8GB DDR3 HyperX Fury - 64GB SSD - 1TB WD Blue - Some 1156 Asus board

HTPC - Wolf In Sheep's Clothing (Crappy HP OEM SFF case and PSU from 2009)

AMD A8 6800K @ 4.2GHz - GT 1030 2GB GDDR5 - 8GB DDR3 - 250GB SP SSD - Some Asus FM2+ motherboard

Bedroom HTPC - The 939 Monster

AMD Athlon 64 X2 @ 2.8GHz - ATi Radeon X850XT - 4GB DDR400 - 80GB WD Blue IDE - SOCKET 939 BITCHES

Trusty Old Rig: Athlon 64 X2 3000+ @ 2.16GHz - Radeon 9600 Pro - 1GB DDR - GB K8NSC939 - 500W CM PSU

Nokia 3310 - The PC: (Dell Optiplex GX110) PIII Coppermine 1GHz - 512MB SDRAM - ATi 9250 Pro 256MB - 80GB IDE - Sound Blaster Live!

Twin Game Servers (Two): C2Q Q6600 @ 3.0GHz - Asus P5B - 8GB DDR2 800

PFSense Router: Pentium D @ 3.2GHz - 2GB DDR2

OMV NAS: Pentium 4 @ 3.2GHz - 2TB WD Red - 2GB DDR2

(I could list like 10 other PCs off the top of my head; yeah, I'm a big-time PC hoarder.)

10 hours ago, Dabombinable said:

The faster the framerate, the better the illusion of things moving on the screen. And I at least can see the difference between 60Hz and 120Hz when it comes to playing games or simply watching content. It's smoother and more lifelike. There was even a noticeable difference for me when I went down from my very old low-res 85Hz LCD TV/monitor to my current 1080p 60Hz screen.

 

13 hours ago, mr moose said:

Remember all those arguments over FPS on the forums? Basically, I keep posting evidence that the human eye and visual processing system can't identify more than about 76 FPS. Because computer games are interactive, with a lot more feedback and processing going on that changes frame rates, there are other mechanisms by which we can identify a better-performing system (i.e. one with faster rates); a movie, however, has no feedback or lag potential. Beyond 80 FPS there is no advantage to perception.

What you're saying seems about right.


QuicK and DirtY. Read the CoC it's like a guide on how not to be moron.  Also I don't have an issue with the VS series.


One additional issue, though, is that motion blur at high fps is extremely nauseating for some individuals (like myself), and for some ungodly reason a number of game developers and media designers can't seem to understand that motion blur looks like crap and is unrealistic.

 

The closer things get to the speed and distinctiveness of real life, the less they should be doing crap like narrow focal depth and motion blur, instead relying on the viewer actually concentrating, in-plane, on something that seems realistic. (Particularly with 3D projections.)

 

 

Side opinion note: real DSLR bokeh looks bad 99% of the time. Fake bokeh looks worse. Please stop forcing people to remove detail from their images.


LINK-> Kurald Galain:  The Night Eternal 

Top 5820k, 980ti SLI Build in the World*

CPU: i7-5820k // GPU: SLI MSI 980ti Gaming 6G // Cooling: Full Custom WC //  Mobo: ASUS X99 Sabertooth // Ram: 32GB Crucial Ballistic Sport // Boot SSD: Samsung 850 EVO 500GB

Mass SSD: Crucial M500 960GB  // PSU: EVGA Supernova 850G2 // Case: Fractal Design Define S Windowed // OS: Windows 10 // Mouse: Razer Naga Chroma // Keyboard: Corsair k70 Cherry MX Reds

Headset: Senn RS185 // Monitor: ASUS PG348Q // Devices: Galaxy S9+ - XPS 13 (9343 UHD+) - Samsung Note Tab 7.0 - Lenovo Y580

 

LINK-> Ainulindale: Music of the Ainur 

Prosumer DIY FreeNAS

CPU: Xeon E3-1231v3  // Cooling: Noctua L9x65 //  Mobo: AsRock E3C224D2I // Ram: 16GB Kingston ECC DDR3-1333

HDDs: 4x HGST Deskstar NAS 3TB  // PSU: EVGA 650GQ // Case: Fractal Design Node 304 // OS: FreeNAS

 

 

 

2 hours ago, mr moose said:

 

What you're saying seems about right.

I was also referring to content, aka video, not just games. Most of the issues people have with high frame rate video are simply due to the unrealistic bullshit edited into the film, such as the aforementioned motion blur.


"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL


For people saying motion blur is "unrealistic":

Put your hand in front of your eyes then wave it around.


Officially LTT's 'coolest' member (yes, that's a pun)

The w is pronounced like v, if you're wondering!

Please quote me so I can see that you replied.


Current rig (Ninetales):

Intel Core i7-8086K @ 4.7GHz, ASUS Strix GTX 1070 (8GB), 16GB Corsair Vengeance LPX DDR4-3000, Windows 10 Pro x64

Laptop (Vulpix): 

Intel Core i7-7700HQ, GeForce GTX 1060 (6GB), 16GB G.Skill Ripjaws DDR4-2400, Windows 10 Pro x64
More detailed specs on my profile.

 

On 4/17/2017 at 5:36 PM, Ryan_Vickers said:

Rawr9 Furry Sex

16 hours ago, mr moose said:

Remember all those arguments over FPS on the forums? Basically, I keep posting evidence that the human eye and visual processing system can't identify more than about 76 FPS. Because computer games are interactive, with a lot more feedback and processing going on that changes frame rates, there are other mechanisms by which we can identify a better-performing system (i.e. one with faster rates); a movie, however, has no feedback or lag potential. Beyond 80 FPS there is no advantage to perception.

So you're telling me that if there's 1 white frame and all the others are black in a 77 FPS video, we won't notice anything?


MSI GX660 + i7 920XM @ 2.8GHz + GTX 970M + Samsung SSD 830 256GB


Am I surprised that the people who say anything under 120 fps is "literally unplayable" are also defending this?

 

I am fine with the current 24 fps in movies. Motion blur is needed because it's real; we experience it in real life.

17 minutes ago, Neftex said:

So you're telling me that if there's only 1 white frame in a 77 FPS video, we won't be able to tell there was something off?

That's not a fair comparison. Our eyes are not like shutters; they're more like buckets trying to process as fast as possible. Inserting a single white frame would be noticeable no matter the frame rate. You wouldn't be able to tell what it is, but it would seem off.

 

Take an 80 fps video and put it side by side with a 120 fps one and you will not be able to perceive the difference. Taking a stark white image and throwing it into the mix is not the same thing.

38 minutes ago, Nowak said:

For people saying motion blur is "unrealistic":

Put your hand in front of your eyes then wave it around.

"Motion blur" in real life only happens with relative motion compared to perspective. When you add in a random ass motion blurring effect you are forcing a certain focus point for your viewer and if they don't focus on that specific relative motion (assuming you did it properly, which most games don't), its extremely disorienting and pointlessly nauseating. 

 

If you as a real human being focus on your hand (within the limit at which your eyes can scan in degrees per second), it doesn't blur. Of course, eyes can't actually scan that fast while processing data, so it's extremely easy to move something at your eye level with an angular velocity in massive excess of your capacity to keep focused on it.

 

It's the difference between trees blurring as you fly past them and watching one actively scan across your vision.

 

It is blatantly incorrect from a physical standpoint to arbitrarily blur everything that moves regardless of the focus.

 

Again, this is even more relevant in 3D projection viewing, where you shouldn't even infer ANY specific perspective for your viewer (either in what constitutes relative motion or the focal plane) and should instead show everything in perfect focus, because the viewer's sight will already do all of the narrowing and motion-related effects automatically (by choosing to focus on something themselves), and then there won't be a disconnect between what their brains expect in the motion and what the video is showing.
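To put a rough number on the physics here, a toy Python sketch (all values are illustrative assumptions, not measurements): blur is just relative angular velocity integrated over the exposure time, so a tracked object stays sharp while the background smears.

# Toy sketch of the point above: blur extent = relative angular
# velocity x exposure time. Numbers are illustrative assumptions.

def blur_degrees(angular_velocity_deg_s: float, exposure_s: float) -> float:
    """Angular smear accumulated over one exposure."""
    return angular_velocity_deg_s * exposure_s

exposure = 1 / 48            # 180-degree shutter at 24 fps
tracked_object = 0.0         # eye follows it -> ~zero relative motion
background = 200.0           # deg/s sweeping past the viewer (assumed)

print(f"tracked object: {blur_degrees(tracked_object, exposure):.2f} deg of smear")
print(f"background:     {blur_degrees(background, exposure):.2f} deg of smear")

Baking one fixed blur into the frame hard-codes an assumption about where the viewer is looking; if they track something else, the blur reads as wrong, which is exactly the disconnect described above.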


8 minutes ago, mynameisjuan said:

That's not a fair comparison. Our eyes are not like shutters; they're more like buckets trying to process as fast as possible. Inserting a single white frame would be noticeable no matter the frame rate. You wouldn't be able to tell what it is, but it would seem off.

 

Take an 80 fps video and put it side by side with a 120 fps one and you will not be able to perceive the difference. Taking a stark white image and throwing it into the mix is not the same thing.

It is fair. If you can notice that, it means the human eye and brain can process stuff at a higher fps.



Just now, Neftex said:

It is fair. If you can notice that, it means the human eye and brain can process stuff at a higher fps.

No, it doesn't. Your eyes/brain tend to average out incoming light. The 1 white frame would make the picture feel slightly brighter; it wouldn't stand out. This is how everyday LEDs work: LED strips, taillights, and clocks all rely on this averaging. LEDs don't appear brighter because more power is applied; with PWM dimming they appear brighter because the duty cycle (the fraction of time they're on) increases.
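A minimal sketch of that averaging, assuming simple PWM dimming (the numbers are arbitrary units, not specs):

# Minimal sketch: the eye averages light over time, so perceived
# brightness under PWM tracks the duty cycle (fraction of each cycle
# the LED is on), not the PWM frequency itself. Arbitrary units.

def average_luminance(peak: float, duty_cycle: float) -> float:
    """Time-averaged output of a PWM-driven LED."""
    return peak * duty_cycle

peak = 100.0  # output at full drive, arbitrary units
for duty in (0.25, 0.50, 0.75, 1.00):
    print(f"duty cycle {duty:4.0%} -> perceived ~{average_luminance(peak, duty):5.1f}")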

 

It's still up in the air what the brain can perceive at high frame rates, as it's good at piecing together information and even making up its own. Like Moose is saying, it's the extra physical input that helps the brain perceive the higher frame rate. We see motion blur on pretty slow-moving objects in real life.

17 hours ago, LukeSavenije said:

We finally get high refresh rate in movies... where is this going?

They already tried 48 fps with The Hobbit; people didn't buy it. Granted, the movies weren't very good.

 

Honestly, 120 fps doesn't make any sense to me - it's just wasted storage or bandwidth on a movie. Just give us 48 or 60.
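For a sense of scale, a back-of-envelope Python sketch, assuming bitrate scaled linearly with frame rate (a pessimistic upper bound, since real encoders exploit the extra temporal redundancy between frames; the 20 Mbit/s base figure is an assumption):

# Rough sketch of the storage argument: how a 2-hour movie scales
# with frame rate IF bitrate scaled linearly with fps (upper bound).

base_fps, base_mbps = 24, 20   # assumed ~20 Mbit/s for a 4K 24 fps stream
hours = 2

for fps in (24, 48, 60, 120):
    mbps = base_mbps * fps / base_fps
    size_gb = mbps * 1e6 * hours * 3600 / 8 / 1e9
    print(f"{fps:3d} fps: ~{mbps:5.0f} Mbit/s, ~{size_gb:5.0f} GB per movie")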


<Make me a sandwich.> <No! Make it yourself!> <Sudo make me a sandwich.> <FINE.> What is scaling and how does it work? Asus PB287Q unboxing! Console alternatives :D Watch Netflix with Kodi on Arch Linux CoC F.A.Q Beginner's Guide To LTT (by iamdarkyoshi)

Sauron's™ Product Scores:

Just a list of my personal scores for some products, in no particular order, with brief comments. I just got the idea to do them, so there aren't many for now :)

Don't take these as complete reviews or final truths - they are just my personal impressions on products I may or may not have used, summed up in a couple of sentences and a rough score. All scores take into account the unit's price and time of release, heavily so, therefore don't expect absolute performance to be reflected here.

 

-Lenovo Thinkpad X220 - [8/10]


A durable and reliable machine that is relatively lightweight, has all the hardware it needs to never feel sluggish, and has a great IPS matte screen. Downsides are mostly due to its age, most notably the screen resolution of 1366x768 and USB 2.0 ports.

 

-Apple Macbook (2015) - [Garbage -/10]


From my perspective, this product has no redeeming factors given its price and the competition. It is underpowered, overpriced, impractical due to its single port, and is made redundant even by Apple's own iPad Pro line.

 

-OnePlus X - [7/10]


A good phone for the price. It does everything I (and most people) need without being sluggish and has no particularly bad flaws. The lack of recent software updates and relatively barebones feature kit (most notably the lack of 5GHz wifi, biometric sensors and backlight for the capacitive buttons) prevent it from being exceptional.

 

-Microsoft Surface Book 2 - [Garbage - -/10]


Overpriced and rushed, offers nothing notable compared to the competition, doesn't come with an adequate charger despite the premium price. Worse than the Macbook for not even offering the small plus sides of having macOS. Buy a Razer Blade if you want high performance in a (relatively) light package.

 

-Intel Core i7 2600/k - [9/10]


Quite possibly Intel's best product launch ever. It had all the bleeding edge features of the time, it came with a very significant performance improvement over its predecessor and it had a soldered heatspreader, allowing for efficient cooling and great overclocking. Even the "locked" version could be overclocked through the multiplier within (quite reasonable) limits.

 

-Apple iPad Pro - [5/10]


A pretty good product, sunk by its price (plus the extra cost of the physical keyboard and the pencil). Buy it if you don't mind the Apple tax and are looking for a very light office machine with an excellent digitizer. Particularly good for rich students. Bad for cheap tinkerers like myself.

 

 

2 hours ago, Neftex said:

So you're telling me that if there's 1 white frame and all the others are black in a 77 FPS video, we won't notice anything?

No. Each receptor in the eye triggers when a photon hits it, and it stays triggered for a few hundredths of a second. This means that for a small fraction of time that receptor cannot be triggered again. This is called visual persistence, and it is why, when you close your eyes or cover them, the last thing you saw remains as an outline. The image lasts longer the brighter it was, or the darker you can make it after closing your eyes. If you were to have 77 white frames and one black frame you would not perceive it; if you had 77 black frames and 1 white frame you would be able to perceive it. The key is that this situation says nothing about the ability to perceive a difference in frame rates when each frame is of a similar light intensity. This test only proves the existence of visual persistence.
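You can see the asymmetry with a toy leaky-integrator model in Python (the decay constant is made up purely for illustration, not a physiological measurement):

# Toy model: treat perception as a leaky integrator of frame luminance
# (0 = black, 1 = white). The decay constant is an assumption chosen
# only to illustrate the asymmetry described above.

def integrate(frames, decay=0.9):
    level, trace = 0.0, []
    for lum in frames:
        level = decay * level + (1 - decay) * lum
        trace.append(level)
    return trace

flash = integrate([0.0] * 38 + [1.0] + [0.0] * 38)  # 1 white frame among black
gap   = integrate([1.0] * 38 + [0.0] + [1.0] * 38)  # 1 black frame among white

# Detection tracks *relative* change (Weber's law): any light against
# darkness is a huge relative jump; a brief dip from full white is small.
print(f"white flash on black: 0.00 -> {max(flash):.2f}  (large relative change)")
print(f"black gap in white:   {max(gap):.2f} -> {min(gap[30:]):.2f}  (small relative change)")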

 

 




I'm all for having higher-fps movies...

 

...not because of the movies themselves, but because it may mean we'll get affordable high refresh rate TVs somewhere down the line, and those will be awesome for gaming. 


Ryzen 1600x @4GHz

Asus GTX 1070 8GB @1900MHz

16 GB HyperX DDR4 @3000MHz

Asus Prime X370 Pro

Samsung 860 EVO 500GB

Noctua NH-U14S

Seasonic M12II 620W

+ four different mechanical drives.

30 minutes ago, Giganthrax said:

I'm all for having higher-fps movies...

 

...not because of the movies themselves, but because it may mean we'll get affordable high refresh rate TVs somewhere down the line, and those will be awesome for gaming. 

Samsung's 2018 4K lineup has true 1080p@120Hz over HDMI 2.0, and the 55" model is only ~$700.
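Quick sanity check on the bandwidth in Python (a sketch: the ~25% blanking overhead is an assumption, while HDMI 2.0's 18 Gbit/s TMDS rate and its 8b/10b encoding are per the spec):

# Back-of-envelope check: does 1080p @ 120 Hz fit in HDMI 2.0?

active_bits = 1920 * 1080 * 120 * 24   # 8-bit RGB, active pixels only
blanking_overhead = 1.25               # assumed ~25% for blanking intervals
tmds_efficiency = 0.8                  # 8b/10b encoding on TMDS links

needed = active_bits * blanking_overhead   # bits/s on the wire
budget = 18e9 * tmds_efficiency            # usable payload of HDMI 2.0

print(f"needed ~{needed / 1e9:.1f} Gbit/s vs ~{budget / 1e9:.1f} Gbit/s available")
# ~7.5 vs ~14.4 -> fits comfortably, which is why 1080p120 works over HDMI 2.0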

4 minutes ago, mynameisjuan said:

Samsung's 2018 4K lineup has true 1080p@120Hz over HDMI 2.0, and the 55" model is only ~$700.

That would probably be priced at least $1000 in Europe. Way too much, man. I was thinking more along the lines of budget offerings. :D



Just now, Giganthrax said:

That would probably be priced at least $1000 in Europe. Way too much, man. I was thinking more along the lines of budget offerings. :D

Point is, it's already a thing and trickling down. Sony and Vizio are already doing it as well. Movies won't help this much.

2 hours ago, Neftex said:

It is fair. If you can notice that, it means the human eye and brain can process stuff at a higher fps.

I would like to see a blind test where someone has been able to perceive the difference using only their eyes.

 

It has been shown that under very specific light/dark cases with sharp edges the human eye can identify flicker averaging 500 Hz (this is not the same as FPS), but the maximum actual perception of FPS, observed through any testing and through study of the visual processing system, is not more than 90 Hz.

 

https://www.nature.com/articles/srep07861

 

This is why you can tell the difference between 60 fps and 120 fps, but without some sort of biofeedback (computer input) there have been no established cases of people perceiving more than 90 fps.

 

Quote

These studies have included both stabilized and unstabilized retinal images, and report the maximum observable rate as 50–90 Hz.

and

Quote

The critical flicker fusion rate is defined as the rate at which human perception cannot distinguish modulated light from a stable field. This rate varies with intensity and contrast, with the fastest variation in luminance one can detect at 50–90 Hz

 

 

There is a lot more I wanted to say, but to keep it short: maybe in the future they will find out we can detect higher frame rates more precisely in certain conditions; the above linked article suggests being able to encode 3 images in a video stream, allowing for simultaneous 3D and 2D. However, for now the perceived limit is about 90 for traditional video.
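The intensity dependence the paper mentions is roughly the Ferry-Porter law: critical flicker fusion rises linearly with the log of luminance. A Python sketch with illustrative constants (assumed values, not fitted from the linked article):

import math

# Ferry-Porter law: CFF ~ a + b * log10(luminance). The constants a and b
# below are illustrative assumptions to show the trend, not measured fits.

def cff_hz(luminance_cd_m2: float, a: float = 37.0, b: float = 9.6) -> float:
    """Approximate critical flicker fusion frequency in Hz."""
    return a + b * math.log10(luminance_cd_m2)

for lum in (1, 10, 100, 1000):   # dim room -> bright display
    print(f"{lum:5d} cd/m^2 -> CFF ~{cff_hz(lum):4.1f} Hz")

This is consistent with the quoted 50–90 Hz range "varying with intensity": brighter stimuli push the fusion threshold higher.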




A lot of shit about the human eye and ### FPS, blah blah blah... What about outside, where the light from the sun is continuous and frame perception is near-infinite FPS? This crap needs to stop.

