Assassin's Creed dev thinks industry is dropping 60 fps standard

There is only one word to describe this guy: fuckwit

 

While that sounds to us like a profoundly stupid statement to make, the sad fact is that most of the games press does not criticize games with poor framerates. Console gamers are more than happy to continue buying games at 30 FPS and are in no hurry to support developers or platforms that can reach 60+ FPS; hell, some will even loudly defend developers of games stuck at 30 FPS, accusing the rest of us of being "PC elitists."

 

So I can't really blame the guy for saying this. The gaming industry really doesn't incentivize pushing the framerate, even if it's objectively true that the games would be better if they did. It doesn't seem to help their review scores, and the player base doesn't seem to care.

 

But on the bright side, I dare say PC ports are generally better than they have been in a while. On the rare occasion we see a new game with an artificial 30 FPS cap on PC (Need For Speed: Rivals), it's kind of a big deal. That used to happen all the time.


Love this quote in the article :-D

 

 

Cinematic my ***. It's a game, not a movie!

 

The whole "it looks more cinematic" is absolute utter silly-bollocks.  There is simply no factual evidence to support it.  None whatsoever.

 

The reason 24fps is acceptable in movies (although many would argue that in fast action and at high resolutions it is not enough - see The Hobbit at 48fps, etc.) is the heavy motion blur baked into each captured frame.  Run these quickly in succession and your brain tricks you into thinking it is all one smooth motion.

 

Frames output from a GPU to a monitor do not have this motion blur applied to them, so you are seeing a static (frequently torn) image flicker onto your screen with each available output.  If vSync is enabled, this causes a delay while the monitor outputs multiples of the same frame as the GPU draws a new one.
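The vsync delay described above is easy to see with a bit of arithmetic. This is a rough sketch under an idealised assumption (a 60 Hz display where, with vsync on, a frame can only be swapped on a refresh boundary, so its on-screen time rounds up to whole refresh intervals):

```python
import math

REFRESH_MS = 1000 / 60  # one refresh interval on an assumed 60 Hz monitor


def displayed_ms(render_ms):
    """How long a frame stays on screen when the GPU takes
    render_ms to finish the next one, with vsync enabled:
    the swap waits for the next refresh boundary."""
    intervals = math.ceil(render_ms / REFRESH_MS)
    return intervals * REFRESH_MS


# A GPU averaging 50 fps (20 ms per frame) does not show frames for
# 20 ms each: every frame misses one refresh and is held for two,
# i.e. an effective 30 fps.
print(round(displayed_ms(20), 2))  # 33.33
# Render in under one interval (16 ms) and you hold a solid 60 fps.
print(round(displayed_ms(16), 2))  # 16.67
```

This is why a game that "almost" reaches 60 fps can feel like it suddenly dropped to 30 with vsync on: the display quantises every late frame up to the next refresh.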

 

Motion blur in games is not the answer, as in order for it to work well it must blur the correct way 100% of the time.  That works in movies, where the motion is fixed in advance, but in a game the camera can be looking in different directions while moving independently in 3D space.  This prevents motion blur from accurately predicting the information, adds considerable delay to any commands, and carries massive horsepower requirements to boot.

 

That is my layman's understanding of it, anyway. TotalBiscuit has done some fantastic coverage and commentary on the issue.

 

Utter silly-bollocks.  


First of all, Ubisoft, as someone who played on consoles for the majority of his life before finally building a decent computer, there is a big difference between 30fps and 60fps. In my opinion, the only way that someone could not understand this is if they had never played a truly beautiful PC game at 60fps.

 

Second, Monolith Productions achieved, in one try, what you couldn't do with more than five games. The Nemesis System makes me feel like my actions matter in the game world; something you've always promised, but never delivered. Add to that, they are smart enough to understand that the Arkham combat system is BRILLIANT. AND, they did all of this on your shitty consoles as well as PC. Is Shadow of Mordor perfect? Absolutely not, but I'd rather play their game than yours.

 

Put that in your consoles and play it.


The whole "it looks more cinematic" is absolute utter silly-bollocks. There is simply no factual evidence to support it. None whatsoever.

The reason 24fps is acceptable in movies (although many would argue that in fast action and at high resolutions it is not enough - see The Hobbit at 48fps, etc.) is the heavy motion blur baked into each captured frame. Run these quickly in succession and your brain tricks you into thinking it is all one smooth motion.

Frames output from a GPU to a monitor do not have this motion blur applied to them, so you are seeing a static (frequently torn) image flicker onto your screen with each available output. If vSync is enabled, this causes a delay while the monitor outputs multiples of the same frame as the GPU draws a new one.

Motion blur in games is not the answer, as in order for it to work well it must blur the correct way 100% of the time. That works in movies, where the motion is fixed in advance, but in a game the camera can be looking in different directions while moving independently in 3D space. This prevents motion blur from accurately predicting the information, adds considerable delay to any commands, and carries massive horsepower requirements to boot.

That is my layman's understanding of it, anyway. TotalBiscuit has done some fantastic coverage and commentary on the issue.

Utter silly-bollocks.

Yup, he made a video about it, where he gave credit to Linus for one of his videos. And yes, you're correct.

Still don't get how someone can defend these companies. 24/30 fps is not enough, especially not with this so-called "next gen".

Hardware: Intel I7 4790K 4Ghz | Asus Maximus VII Hero Z97 | Gigabyte 780 Windforce OC | Noctua NH-U12P SE2 | Sandisk Extreme Pro 480GB | Seagate 500Gb 7200Rpm | Phanteks Enthoo Luxe | EVGA Supernova G2 850W | Noctua NF12 | SupremeFX 2014 | Patriot Viper 3 16GB.

Gaming Gear: Cooler Master TK Stealth | Sennheiser PC350SE | Steelseries Rival | LG IPS23L-BN ' 5ms | Philips Brillians 144hz 


First of all, Ubisoft, as someone who played on consoles for the majority of his life before finally building a decent computer, there is a big difference between 30fps and 60fps. In my opinion, the only way that someone could not understand this is if they had never played a truly beautiful PC game at 60fps.

Yeah it is night and day with 30 vs 60 fps. I just fired up Quake 3 and capped the FPS at 30 with com_maxfps (requires a graphics restart afterwards) and it was unplayable.

And I don't mean "unplayable" as in "it looked bad"; I mean I had to lower the difficulty because it became so hard to play. Massively more input lag, and everything felt like it was jumping around. Anyone who says there isn't a big difference is either lying or has never actually tried it. Sure, it makes a bigger difference in an FPS, but the difference in terms of fluidity is the same in all games; it's just that the input lag matters less elsewhere.
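For anyone who wants to repeat the experiment, the in-game console commands would look something like this (cvar names from my memory of the Quake 3 console; the vid_restart step is what's meant by a graphics restart above):

```
/com_maxfps 30     // cap the framerate at 30
/vid_restart       // restart the renderer so the cap takes effect
/com_maxfps 125    // back to the classic competitive cap when done
```

Toggling between the two caps back to back makes the difference in input lag and fluidity very hard to deny.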

 

Ubisoft are idiotic for making this statement. Just say it like it is instead: the new consoles are pretty awful in terms of hardware, so they can't handle it. I wouldn't be surprised if they did a complete 180 if Microsoft came out with some magic thing that doubled the FPS in games. Then all of a sudden 60 FPS would be fantastic in their PR campaigns.

Movies and games are not the same. Sure, crippling the FPS in games is one way to make them more like movies. Want to know another way of making games more like movies? Remove the ability for the player to input commands!

Maybe they should aim for a "VHS look and feel" next by running the game at 480x320. Or how about reducing it to 24 FPS if they want it to be cinematic? Why run it at 30 FPS?


Ubi should just stop now, before it's too late...

"Strength does not come from winning. Your struggles develop your strengths. When you go through hardships and decide not to surrender, that is strength" Arnold


I hate playing at sub-45. I'd rather play at 720p if it meant that I would be able to play at 60FPS.

Main rig on profile

VAULT - File Server

Spoiler

Intel Core i5 11400 w/ Shadow Rock LP, 2x16GB SP GAMING 3200MHz CL16, ASUS PRIME Z590-A, 2x LSI 9211-8i, Fractal Define 7, 256GB Team MP33, 3x 6TB WD Red Pro (general storage), 3x 1TB Seagate Barracuda (dumping ground), 3x 8TB WD White-Label (Plex) (all 3 arrays in their respective Windows Parity storage spaces), Corsair RM750x, Windows 11 Education

Sleeper HP Pavilion A6137C

Spoiler

Intel Core i7 6700K @ 4.4GHz, 4x8GB G.SKILL Ares 1800MHz CL10, ASUS Z170M-E D3, 128GB Team MP33, 1TB Seagate Barracuda, 320GB Samsung Spinpoint (for video capture), MSI GTX 970 100ME, EVGA 650G1, Windows 10 Pro

Mac Mini (Late 2020)

Spoiler

Apple M1, 8GB RAM, 256GB, macOS Sonoma

Consoles: Softmodded 1.4 Xbox w/ 500GB HDD, Xbox 360 Elite 120GB Falcon, XB1X w/2TB MX500, Xbox Series X, PS1 1001, PS2 Slim 70000 w/ FreeMcBoot, PS4 Pro 7015B 1TB (retired), PS5 Digital, Nintendo Switch OLED, Nintendo Wii RVL-001 (black)


Ugh, more of this "30 fps looks more cinematic" idiocy. The fact that this crap comes from the mouths of actual game developers is pretty ridiculous. I don't know what's worse: that this guy actually believes this bullshit, or that he's lying through his teeth to excuse their decision to go 30 fps.

I find this funny as hell, maybe they should just stop making video games and simply make CGI movies if they're so interested in fucking cinematography.

It's a video game, not a movie. It can look cinematic if they want it to, but it's supposed to feel smooth and responsive.

“The mind of the bigot is like the pupil of the eye; the more light you pour upon it the more it will contract” -Oliver Wendell Holmes “If it can be destroyed by the truth, it deserves to be destroyed by the truth.” -Carl Sagan


Well, Ubisoft is going to have to tell their investors that profit is also just a number, because I'm not going to buy any more of their games.

If you judge a fish by its ability to climb a tree, it will live its whole life thinking it's stupid.  - Albert Einstein


I always root for Ubisoft, being a Canadian dev with games I have really enjoyed over the years, but these past few weeks it's been hard to stay in their corner. They need to stop with this bullshit.

Intel i5 4670K | Asus z97 Deluxe | MSI GTX 1060 (6GB) | 16GB Corsair Vengeance |

 

120GB Samsung EVO - 480GB OCZ Solid III - 1.5TB Seagate Barracuda  | Corsair AX760

 

 

Razer Blackwidow Ultimate | Razer Ouroboros

 


"Too hard to get 60 FPS" my ass, sure for consoles, but those piles of shit are shit.

My build, which has no dedicated GPU and uses the integrated graphics of my i5 4690K, can at least come close to these "next gen" consoles, if not beat them.

Sure I may have to turn some settings down a bit, but most games I play easily hit 60 FPS.

Unless you only play League or something, that's a bit of a stretch. Linus did a showdown between consoles and a PC in the same price range; a 7770, I believe, offered the same performance as the consoles.


Why can't they admit that these systems are just terrible? These incredibly underpowered consoles are holding back everything else in the video game industry. I really hope these ones don't last nearly as long as the last generation, because they are already outdated by superior and cheaper hardware.

Something, something, something, famous quote, computer specs, and stuff...


Only reinforces the fact that Ubisoft needs to vanish.

Spoiler

Corsair 400C- Intel i7 6700- Gigabyte Gaming 6- GTX 1080 Founders Ed. - Intel 530 120GB + 2xWD 1TB + Adata 610 256GB- 16GB 2400MHz G.Skill- Evga G2 650 PSU- Corsair H110- ASUS PB278Q- Dell u2412m- Logitech G710+ - Logitech g700 - Sennheiser PC350 SE/598se


Is it just me or is Grammar slowly becoming extinct on LTT? 

 


I find this funny as hell, maybe they should just stop making video games and simply make CGI movies if they're so interested in fucking cinematography.

It's a video game, not a movie. It can look cinematic if they want it to, but it's supposed to feel smooth and responsive.

 

Pretty much this. Especially since their cinematic trailers end up being wayyy more entertaining than their actual games.


Ubisoft:

 


4690K // 212 EVO // Z97-PRO // Vengeance 16GB // GTX 770 GTX 970 // MX100 128GB // Toshiba 1TB // Air 540 // HX650

Logitech G502 RGB // Corsair K65 RGB (MX Red)


To be honest, I'm not really sure who to blame here. It seems that, because the PS4 and Xbox One are underpowered in terms of their hardware, the game devs are having to make all kinds of excuses as to why they can't deliver a proper gaming experience, to avoid being bashed by Sony and Microsoft. On the other hand, the excuses the devs are coming up with are just ridiculous, especially when they try to claim that 30fps is better than 60.

 

But I do agree with part of what is being said here (bring on the hate...). Hear me out: the industry (specifically the console game industry) is dropping 60fps because the consoles are underpowered, so in that sense this guy is correct; it's just that the rest of his excuse is BS.

 

I expect that towards the end of these consoles' life span, when games are better optimised, we will have games like this running at at least 1080p 30fps, with the option for 720p or 900p at 60fps. Hopefully that will sort out all of this excuse-making and lying.

My PC:


4670k      GTX 760 ACX      CoolerMaster Hyper 412s      Fractal Design Node 804      G1 Sniper M5      Corsair RM 650      WD Red 1TB     Samsung 840 Evo 120GB


I said it before and I'll say it again: if I didn't hold myself back, I'd have a broken nose from the sheer strength of the facepalm that would be necessary.


I love it. Just as 4K at 60fps is becoming a more and more viable option for a growing number of people in the PC arena, the PS4 is offering that truly NEXT GENERATION sub-HD, 30fps experience. -_-

 

And the console fanboys are letting them get away with this; they are letting themselves get screwed royally by putting up with this bullshit.


I can't even begin to imagine who exactly they think believes this crap anymore. Sure, they're defending their product...but blatantly lying? It's not even a good lie. 


How about... 60Hz is the minimum and if they can't manage that, they need to go back to the drawing board.

[TRUENO] i7 4770k (~4.4Ghz, 1.28v) || Thermalright Macho 120 || Asus Z87 Gryphon || 2x8Gb Mushkin Blackline|| Reference NVIDIA GTX770 || Corsair Neutron GTX 480GB || 2x3TB WD HDD || Corsair 350D || Corsair RM750


It is just amazing how low Ubisoft will stoop to avoid admitting failure like this. Why not just come clean and say it is an easy port for them, instead of telling horrible lies that even the most hardcore trolls wouldn't buy? Pathetic company.


I wouldn't blame the developers, to be honest; I'd like to give them the benefit of the doubt that they really do the best they can. They obviously can't do much with the hardware, and I doubt the management is making it any easier for them.

Don't forget, it's never the developers coming out and saying that 30fps and sub-1080p resolutions are fine; it's always someone like a "World Level Design Director" who wouldn't know the first thing about optimising a game.

 

That said, I don't give a shit what the consoles are running at, so long as it doesn't affect the PC version or spread misinformation about the effects of FPS/resolution on gameplay. Sadly, this isn't the case...


Might as well go backwards now. Next conference: "You know, we should go back to 8-bit."


