Matias_Chambers

Why are movies and music videos always in 24-30 fps?

Recommended Posts

A few reasons. The first is that film has natural motion blur, which games can't convincingly replicate, and that's why 24-30 fps looks really bad in games but fine in movies. Movies are also shot at insane resolutions, so extra frames per second really add up in storage and processing.


<CPU> Intel Pentium G3258 @ 4.4 GHz 1.275v <Cooler> Cooler Master Hyper 212 EVO <Motherboard> MSI Z97 PC MATE <Memory> Crucial Balistix Sport 8GB 1866MHz <Storage> WD Caviar Blue 1TB, Kingston SSDNow V300 120GB <GPU> EVGA Geforce GTX 750ti <PSU> EVGA 500B <Case> Corsair Spec-01 By the way, if you need my attention, quote my response


Probably because there's no need for it. You might find the odd small artist doing a unique music video at a higher frame rate, but most are just left at 24 or 29.97 fps because they're only streamed on MTV or a music channel, and the extra storage most likely wouldn't be worth it.


PC Editing / Gaming Rig Specs : i7-3930k @ 4ghz w/ Hyper 212 Evo | 840 Evo 250gb and Seagate 1TB | 2x8GB Corsair Vengeance RAM | Zotac GTX 780 | Corsair Vengeance C70 Green | OS: Windows 8.1 Pro #KilledMyWife #MakeBombs

7 minutes ago, Aytex said:

"Cinematic effect"

 

Anything under 60 fps to me is cancer to my eyes so :/ 

MPC-HC + SVP = No cancer.
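For context, SVP does motion-compensated frame interpolation; the crudest version of the idea is just inserting a 50/50 blend between neighbouring frames to double the rate. A toy NumPy sketch of that naive approach (nothing like SVP's actual optical-flow method):

```python
import numpy as np

def double_rate_blend(frames: np.ndarray) -> np.ndarray:
    """Insert an averaged frame between each consecutive pair,
    roughly doubling the frame rate. frames has shape (n, h, w)."""
    mids = (frames[:-1] + frames[1:]) / 2
    out = np.empty((2 * len(frames) - 1, *frames.shape[1:]))
    out[0::2] = frames     # originals keep the even slots
    out[1::2] = mids       # blends fill the gaps
    return out

clip = np.array([[[0.0]], [[1.0]]])      # two 1x1 "frames"
print(double_rate_blend(clip).ravel())   # a 0.5 blend appears between 0 and 1
```

Real interpolators estimate motion vectors and warp pixels along them instead of blending, which is why they produce fewer ghosting artifacts than this sketch would.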


X5650 @ 4.66GHz, X58 EVGA SLI3, 20GB of Random DDR3 @ 2133, 1TB WD Blue, 1TB Seagate Barracuda, 160GB WD Scorpio, GTX 970, Corsair Carbide 400c, Corsair RMx 750 / Setup / Asus 150Hz MG248Q, Some 1080p Dell IPS, Corsair Strafe MX Brown, Logitech G900


Our eyes can only perceive a certain number of frames per second, and they also blend one frame into the next. Yes, higher frame rates and ultra sharpness are a good thing in gaming, but 24-30 fps suits film directors better, as it gives them more control over the mood.


Actually, this has quite a simple reason: makeup, prosthetics and VFX.

Ever read the reviews of the 48fps version of The Hobbit? They tear the movie apart, and I agree with them; the 48fps version suffers for it (saw it myself). At 24fps the amount of detail captured and displayed each second is limited, and somehow our brains can't see through makeup and VFX. At 48fps, though, it's plain to see: close-up shots let you spot the difference between properly applied makeup (Gandalf) and professionally made prosthetics. You can just tell they aren't real.

At 24fps this isn't a problem. Same movie, different frame rate, no problem. It's very strange, but it's true, and it's also a reason why little other professional work goes to 48fps or higher.


May the light have your back and your ISO low.

48 minutes ago, Bsmith said:

Actually, this has quite a simple reason: makeup, prosthetics and VFX.

-snip-

I'm going to somewhat disagree with you here.

 

I saw the entire Hobbit trilogy at 48fps HFR when it was in theatres. And the first one was VERY jarring. The first 20-30 minutes of the movie just looked and felt off.

 

But after that? I got used to it and totally forgot it was HFR. For the second and third movies, that time period narrowed to more like 15-20 minutes before I adjusted.

 

The "soap opera" effect, as many call it, is mainly because we've all spent our entire lives watching movies at 24p, and then we see one in 48p, and our brains don't associate that with a movie yet.

 

If movies were always made in 48p, it would look totally normal and great to us.

 

Once I got used to it, the advantages far outweighed any disadvantages. Directors and film crews will need to adjust their filming style, set design, and so on, because more detail and information is captured, but the smoothness of it all just makes it better.

 

@Matias_Chambers The reason movies are filmed in 24p is that WAY BACK in the 20's and 30's, when movies were becoming popular, the technology was shit. They needed to find the absolute minimum frame rate that still produced the illusion of motion, without costing too much or requiring overly complex equipment.

 

24p was found to be just fast enough for people to "see" it as motion, but slow enough that the projector technology stayed cheap and reliable (faster frame rates meant a bigger risk of jamming, fires, breakdowns, etc).

 

Once this was adopted, it became a global standard in theatres everywhere. So we just kept it. It's not because 24p looks more "cinematic", it's because no one bothered to ever change the system, because getting global adoption of a new system would have been an incredibly large task - and still is today.

 

HFR has had a hard time getting adopted because the cost of everything involved is very high, and the results are still in their infancy. The Hobbit certainly wasn't perfect. But neither were the first 3D movies that came out in the late 90's and early 2000's - a lot of those looked like utter shit.

 

Honestly, it wasn't really until Avatar came out, in what, 2009, that 3D movies found their footing. Avatar was the first 3D movie that blew my mind in 3D (and I personally still don't care about 3D either way, but I do recognize that Avatar was a game changer for that tech).

 

Avatar 2 is supposed to be shot in HFR 60 fps, so we'll see how that looks.

 

Now on to TV shows. This one is a bit more technical, and I'm not going to go into details, but the short answer is that because of the way TVs work in North America (60 Hz, the equivalent of 60 fps), 30 fps (or more accurately 29.97) was chosen as the easiest way to get a good-looking picture when broadcasting analog TV.
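Worth noting that the 29.97 figure isn't arbitrary: when NTSC added color, the original 30 fps rate was slowed by a factor of 1000/1001 to keep the color signal from interfering with the audio carrier. A quick sanity check of the arithmetic:

```python
from fractions import Fraction

# NTSC color video runs at exactly 30000/1001 frames per second:
# the old 30 fps black-and-white rate slowed by about 0.1%.
ntsc = Fraction(30000, 1001)
print(float(ntsc))  # ~29.97

# Over an hour, that 0.1% slowdown accumulates into a visible drift
# against wall-clock "30 fps" time, which is why drop-frame timecode exists.
drift = 30 * 3600 - float(ntsc * 3600)
print(round(drift, 1))  # ~107.9 frames per hour short of true 30 fps
```

Drop-frame timecode compensates by skipping frame *numbers* (not frames) at regular intervals so the displayed timecode stays in step with real time.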

 

When Digital finally rolled around in the 2000's, no one felt much need to change the system, since it still worked well enough. Though now sports broadcasts are often in 720p60, since the higher frame rate there makes a big difference.

 

It's only a matter of time now before 48 (or more likely 60) fps HFR becomes widespread. We simply need to wait for the film technology and techniques to catch up. We need the HFR equivalent of "Avatar" to show the world how good HFR can look. Peter Jackson thought that was going to be The Hobbit, but film techniques hadn't caught up yet by then, and he was still learning (and essentially making it up as he went) how to shoot in HFR while making the Hobbit movies.


* Intel i7-4770K * ASRock Z97 Anniversary * 16GB RAM * 750w Seasonic Modular PSU *

* Crucial M4 128GB SSD (Primary) * Hitachi 500GB HDD (Secondary) *

* Gigabyte HD 7950 WF3 * SATA Blu-Ray Writer * Logitech g710+ * Windows 10 Pro x64 *

 

1 hour ago, Bsmith said:

Actually, this has quite a simple reason: makeup, prosthetics and VFX.

-snip-

Same reason I turn off the motion smoothing and frame interpolation (every TV maker names it something different) on my TV, and why it's nicknamed the "soap opera effect": right now, high frame rates make a lot of content look fake.

 

As @dalekphalm notes, you do get used to it after a while, but personally there are many times it looks terrible when flipping channels, the same way back in 2006-ish you had to keep switching between stretched, HD, and 4:3 because all the content was a mixed bag. It's superior, but it's not there yet for me to want it. Same with 4K: most US cable companies provide a SHIT picture, 720p/1080i compressed to the fullest, that a 1080p Blu-ray blows out of the water. They're nowhere near ready for 4K, yet everyone's pushing it.


Ever seen the first Hobbit film, shot at 48fps? The amount of bitching about the "unrealistic" look of it? Granted, it was mostly CG, but the cinematic shots looked really smooth, even the ones filmed in real locations. I don't care about high frame rate films; 24fps is fine for movies.


Case: Rosewill THORV2 CPU: i7 5930K Motherboard: EVGA X99 Classified RAM: 16GB DDR4 G.Skill GPU: MSi GTX 980 Ti Lightning Edition (2x SLI) PSU: Corsair 1000w 80+ Platinum Drives: 500GB SSD (2x), 120GB SSD, 2TB (2x) HDD, 4TB HDD, & 500GB HDD Displays: Dell P2715Q 27" (4K) (3x nVidia Surround)

4 hours ago, dalekphalm said:

I'm going to somewhat disagree with you here.

-snips-

 

Like you said, the changes that need to be made aren't easy: film style and set design, as you said yourself, but also makeup and, most importantly, VFX. You roughly double the work for the crew that has to sync and render the animations, do the colour grading and chroma keying, and check the grade for consistency; they have double the frames and double the data to plow through. Add it all up and it comes to quite a big cost, plus extra time, valuable time at that, since big-budget movies work to tight deadlines. Although that should ease off a little once the Hollywood mafia dies off.


9 hours ago, Matias_Chambers said:

Movies, music videos and such are always in 24-25 fps sometimes 30 fps. Why aren't they in 60 fps? Any specific reason? 

See @dalekphalm's answer below. I was going to say exactly what he stated - it is mostly due to broadcast standards defined in the 90's that just haven't changed much since then, except for the odd sports broadcast and soap opera drama shows.

7 hours ago, dalekphalm said:

Now on to TV shows. This one is a bit more technical, but the short answer is that 30fps (or more accurately 29.97) was the easiest way to get a good picture when broadcasting analog TV.

-snip-

 

9 hours ago, Aytex said:

"Cinematic effect"

 

Anything under 60 fps to me is cancer to my eyes so :/ 

9 hours ago, RALalong said:

Our eyes can only perceive a certain number of frames per second, and they also blend one frame into the next. Yes, higher frame rates and ultra sharpness are a good thing in gaming, but 24-30 fps suits film directors better, as it gives them more control over the mood.

Our eyes are capable of seeing up to the equivalent of ~150-200 FPS on average, which is why anything under that is not optimal for gaming.

Technically, our retinal nerves can talk to the brain at about 1000 FPS, but with overhead throughput, realistically this works out to ~25% of that.

Obviously not everyone can tell the difference, but for those who can, the difference between 30, 60, 120, and 240 Hz/FPS is astounding.
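Whatever the eye's true ceiling turns out to be, the per-frame time budget shrinks quickly as the rate climbs, which is where the perceived difference comes from. The arithmetic is just interval = 1000 / fps:

```python
# Time each frame stays on screen, in milliseconds, at common rates.
# Halving the interval is what makes 60 fps feel so much smoother than 30.
for fps in (24, 30, 48, 60, 120, 240):
    interval_ms = 1000 / fps
    print(f"{fps:>3} fps -> {interval_ms:6.2f} ms per frame")
```

So 24p holds each frame for ~41.7 ms versus ~16.7 ms at 60 fps, and the gains get progressively smaller: 120 to 240 Hz only shaves off ~4 ms.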

 

As you point out @RALalong, for TV and Film, standardizing a "cinematic" framerate makes sense, since we're accustomed to the same motion in shows.

Plus, watching shows at 240Hz looks fake as hell, although that's because those TVs are artificially interpolating frames that don't exist.

For some shows, such as sporting events or live action, higher FPS / Hz makes sense, as our brains can process the additional frames for more detail.

 

You can roughly test what your eyes are capable of seeing by comparing the live-view LCDs of two digital cameras side by side.

For instance, the 30 FPS live view on my Panasonic DMC-TS20D looks super laggy compared to the 60 FPS live view on my Canon SX280HS.

If you only have one camera, you can also change the recording quality from 1080p60 to 1080p30 to see the same effect on live view.

To me, 60 FPS appears a lot more fluid, although again, I wouldn't record everything at 60 FPS as it looks really weird when played back.

 

This article posted to Blizzard's forums covers framerates for gaming and film/TV shows really well, explaining why things are the way they are.


Desktop: i7-4790k Build - ALMOST COMPLETE (GTX950 PURCHASED!!!) Tablet: HP Touchpad | ASUS ME302C
Laptop: Dell XPS 15 9560 (a real 15" MacBook Pro that Apple failed to make) Mobile: Note 3 | Bell 250 Mins + 6GB Data ($57/month)
Camera: Canon SX280 + T1i (500D) | Sony HDR-AS50R | Panasonic DMC-TS20D Music: Spotify (CIRCA '08) | iPod Classic 80GB


I did like that The Hobbit was 48 fps.

I wish all movies were 48 or 60 fps.


“Remember to look up at the stars and not down at your feet. Try to make sense of what you see and wonder about what makes the universe exist. Be curious. And however difficult life may seem, there is always something you can do and succeed at. 
It matters that you don't just give up.”

-Stephen Hawking


Honestly, one of the main reasons is aesthetic.

24fps feels "filmic", like the movie was shot on film; it looks "movie-like", with motion blur between the frames and everything.

60fps feels "lifelike", which for a movie feels very weird. It has that strange sitcom quality, where everything appears just smooth and strange.

In the digital age, cinematographers are still aiming for the "filmic look" made iconic by proper film cameras.

Heck, one of the most common things to do with digital footage is to apply "real" film grain using FilmConvert, where grain sampled from actual film stock is overlaid on your clean digital footage to make it look more filmic.


Desktop - Corsair 300r i7 4770k H100i MSI 780ti 16GB Vengeance Pro 2400mhz Crucial MX100 512gb Samsung Evo 250gb 2 TB WD Green, AOC Q2770PQU 1440p 27" monitor Laptop 1 Alienware M18x Red - 18.4" 1080p i7 3610qm CF 7970m samsung evo 250gb, raid 0 2x 500gb harddrive, killer nic n1103, 8gb ram, Nebula Red Laptop 2 Clevo W110er - 11.6" 768p, i5 3230m, 650m GT 2gb, OCZ vertex 4 256gb,  4gb ram, Laptop 3 M11x Celeron SU7300 + 512mb 335m, 8gb Ram Server: Fractal Define Mini, MSI Z78-G43, Intel G3220, 8GB Corsair Vengeance, 4x 3tb WD Reds in Raid 10, iPad 4 32gb, iPhone 6 128gb Space Grey, Sony A7

17 hours ago, Bsmith said:

Like you said, the changes that need to be made aren't easy.

-snip-

Just because solving those problems might be difficult or expensive does not mean they aren't worth solving.

 

Blockbuster movies are starting to have $300m+ budgets. They can easily, if they decide to, allocate $10m of that budget to better VFX, prop design, etc.

4 hours ago, ShadowCaptain said:

Honestly, one of the main reasons is aesthetic.

-snip-

The thing is, the "filmic" motion blur (and the other visual cues that come from shooting in 24p) can be recreated in post-production. One of the reasons motion blur doesn't work very well in video games is that you cannot predict with 100% accuracy where the next frame will be. But in movies, that's not an issue at all.
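As a rough illustration of the post-production idea (a toy NumPy sketch, not any studio's actual pipeline): shoot or render at a high rate, then average each group of frames down to 24p, which approximates the blur a long shutter would have captured.

```python
import numpy as np

def conform_with_blur(frames: np.ndarray, group: int) -> np.ndarray:
    """Average consecutive groups of high-rate frames into one output
    frame each, approximating shutter-style motion blur.
    frames has shape (n, h, w); output rate is input rate / group."""
    n = (frames.shape[0] // group) * group          # drop any ragged tail
    grouped = frames[:n].reshape(-1, group, *frames.shape[1:])
    return grouped.mean(axis=1)

# Toy example: a bright dot sliding across a tiny 1x8 "image" at 96 fps,
# conformed to 24 fps (groups of 4). The dot smears into a streak,
# each of the four positions contributing 0.25 to the output frame.
hi_rate = np.zeros((8, 1, 8))
for i in range(8):
    hi_rate[i, 0, i] = 1.0                          # one pixel per frame
low_rate = conform_with_blur(hi_rate, group=4)
print(low_rate[0, 0])
```

In games the renderer only has the current and past frames, so this kind of averaging lags behind the motion, which is one way to see why game motion blur feels worse than film's.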

 

I firmly believe that the drawbacks of 48 or 60fps HFR are entirely surmountable.


2 minutes ago, dalekphalm said:

I firmly believe that the drawbacks of 48 or 60fps HFR are entirely surmountable.

I still don't think it's necessary. Plus, let's not forget it makes the files twice as large for a production, especially if they shoot RAW.

I feel that a lower framerate will still have the more cinematic FEEL, but I'll admit it's possible we've been conditioned that way by 100 years of cinema.

 

I'm yet to be convinced. Even when I film airsoft action stuff, I almost always conform it down to 30fps to make it feel more cinematic, and to get half-speed slow motion from 60 to 30.

 

 


11 minutes ago, ShadowCaptain said:

I still don't think it's necessary. Plus, let's not forget it makes the files twice as large for a production, especially if they shoot RAW.

-snip-

The biggest issue for me with 24p is simply the drawbacks of fast paced action scenes or quick panning effects.

 

Take Star Trek: Into Darkness, as an example. In the scene where Scotty is in the shuttlecraft and he discovers the drydock shipyard where the prototype starship is being built, they scan across the top of the structure, and it's a fucking slide-show. This is common with any pan scenes that are done a bit too quick.

 

That cinematic feel, I'm confident, can be reproduced. Literally the reason why they chose 24p had nothing to do with film effects or feel - all of that came later. It was technological limitations. Film was goddamn expensive back in the early 20th century, and cameras had issues with running too much film through the camera, or running the film too fast (One of the reasons why 70mm IMAX never caught on in mainstream filming - the cameras were notoriously temperamental, and you could really only film for 5-7 minutes at a time).

 

Shooting RAW, and the storage issues? Surmountable. Easily surmountable, in fact, given the advances in storage technology over the last half decade or so. If high-speed SSDs can record, e.g., 8K RAW, I'm pretty sure shooting 4K or 6K at 60 fps is doable. It might be more expensive, yes, but not so long ago shooting in 4K RAW at all was considered too expensive.
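The data rates being argued about here are easy to ballpark. A back-of-envelope calculator for uncompressed capture (the resolution and bit depth below are illustrative assumptions, not any particular camera's spec):

```python
def raw_rate_mb_per_s(width: int, height: int, bits_per_pixel: int, fps: int) -> float:
    """Uncompressed sensor data rate in MB/s (1 MB = 10^6 bytes)."""
    return width * height * bits_per_pixel / 8 * fps / 1e6

# Hypothetical 4K (4096x2160) capture at 12 bits per pixel:
for fps in (24, 48, 60):
    print(f"{fps} fps: ~{raw_rate_mb_per_s(4096, 2160, 12, fps):,.0f} MB/s")
```

Going from 24 to 48 fps doubles the rate exactly, and for a given shot length, the storage; real RAW codecs compress, which changes the absolute numbers but not the ratio.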


4 minutes ago, dalekphalm said:

SNIP.

I agree various issues are surmountable 

 

But whether everything would actually be better at a higher framerate, I'm yet to be convinced of. Not that I'm saying it's gospel truth.


1 minute ago, ShadowCaptain said:

I agree various issues are surmountable.

-snip-

True, we won't really know until people try to do it and see if the issues can be fixed.

 

I, for one, just want people to try. We've been so complacent with 24p filming for years; I don't want to hold back advances and technology simply because "we've always done it this way". I look forward to Avatar 2, to see how James Cameron deals with filming in 60p. If anyone can make it work, it's him.


1 minute ago, dalekphalm said:

True, we won't really know until people try it and see if the issues can be fixed.

-snip-

Well, I never like how my videos look at 60p. It feels too fast and fluid, "too real", which you'd think would be good, but part of my brain thinks movies should look like movies.

That's often why movies have heavy colour grades: to look the opposite of real, "hyper-real" or "dreamlike", to remind us this is a cinematic universe and not real life.

High frame rates often feel like an amateur camcorder instead of a heavily orchestrated piece of cinema.

I'm willing to try, but as I say, from my own experimentation I've never liked the videos I've tried to make at 60fps.


3 hours ago, dalekphalm said:

Just because solving those problems might be difficult or expensive does not mean they aren't worth solving.

-snip-

Possible in cost, yes, but really worth it in the end? I highly doubt it, since it also raises the bar for newcomers, and the old guard would need a kick up the backside to do it. Why change when it's good enough? The movie industry isn't like Apple.


23 minutes ago, Bsmith said:

 

Possible in cost, yes, but really worth it in the end?

-snip-

"Good enough" to you, maybe. But when I see panning shots in an otherwise gorgeously filmed movie, that look like someone is using flip cards, then no, that's not good enough for 2016.

 

People are used to 24p - that doesn't make it better.


