
Assassin's Creed dev thinks industry is dropping 60 fps standard


 

Luckily, I don't like any of Ubisoft's games.

Intel Core i5 3570K @ 4.5GHz | ASUS P8Z77-V LX2 | 2x4GB Kingston HyperX Genesis @ 1600MHz | Gigabyte Windforce GTX 780 3x OC rev. 2 | 240GB Kingston V300 & 500GB Seagate 7200rpm |

Corsair GS600 | 1440p Dell U2515H & 1080p 60Hz TV/monitor | Asetek-based AIO 120mm liquid CPU cooler


Yet another way the industry is holding back the PC.

Come and join the awesome Official LTT Star Citizen Org at LTT Conglomerate, and the GTA 5 LTT Crew at LTT Conglomerate


PC Specs - 4770k - OC 4.5GHz  - GTX 780 SC - 16GB HyperX - NZXT H440 White - Corsair H100i - Corsair AX750 - Samsung Evo 250GB - 2 x PA238Q - ATH-M50 - 


Haven't read the article, but sadly I think games will go that route. One reason is that games over the past couple of years (both 7th and 8th gen) have been favoring 30 FPS over 60 FPS, most likely to keep up with the graphical standard.

 

I'm going to give credit where credit is due before I go on: 30 FPS in a game can most certainly be appropriate. IIRC, L.A. Noire was capped at 30 FPS. It's not that the game is poorly optimized; it simply doesn't go above 30 FPS without third-party tools, and the reason isn't the usual excuse of laziness or graphical fidelity (at least not entirely, and again, IIRC). The cap came from the technology they used for facial animations. That's an advancement I can get behind, mainly because at best it opens doors for future game installments (as well as improvements to the technology itself, such as working properly at 60 FPS), and at worst it goes unnoticed, which really isn't a big deal. With an advancement like that, I can more than welcome games at 30 FPS... as long as the reason is an advancement in technology and not "Oooh, the game is prettier than if it ran at 60 FPS".

 

As a whole though, games seem to be going more in the direction of advancing graphical fidelity, most likely because of the type of people who complained about Watch Dogs ("Oh! It's not as good looking as the trailers, so it must suck!"). Although we should welcome graphical fidelity enhancements, I don't think we should treat them as the #1 priority. We tend to do that automatically, though: complaining when games look slightly below expectations, or noticing almost only the graphical changes to a game (like Watch Dogs) while mostly ignoring everything else. 30 FPS is certainly not unplayable, but the lack of 60+ FPS as the standard seems to almost inadvertently promote a lack of true polish in games.

 

What I think people should do (I seriously doubt it'll turn out this way, but hey) is instead promote game engine polish first, and higher frame rates (not just 60 FPS, but beyond, to satisfy all gamers, competitive players included) in the development process. I doubt this will ever happen, though, considering everyone always has to think about the graphics and how good they can be. As an end goal, I wish we'd get to the point where something like 80-90 FPS becomes the standard (like I said, to satisfy nearly all gamers out there). Imagine games that are amazingly polished and run at a very high frame rate. After that, and only then, should graphical fidelity become a true focus.

Previously Trogdor8freebird

5800x | Asus x570 Pro Wifi (barely enough for 64GB apparently given it's 2133 and still crashes sometimes) | 64GB DDR4 | 3070 Ti 8GB | Love that whole weeb shit


That's like saying "You guys shouldn't expect 60fps because it's useless anyway." It's hard to hit 60fps on those underpowered consoles, but that doesn't mean it's impossible. It's a lousy excuse to feed the console crowd that says fps is just a number.


The amount of resolution and framerate e-peen stroking in this thread is ridiculous. 

 

Since when did games become about resolution and framerates? Whatever happened to gameplay? To story? To all of the other reasons people play video games?

 

Frankly, if the gameplay and storytelling are above average, I'd be more than willing to sacrifice resolution and framerate. And no, I don't agree that smoother framerates make gameplay better (first-person shooters being the only exception, where it's necessary).

Interested in Linux, SteamOS and Open-source applications? Go here

Gaming Rig - CPU: i5 3570k @ Stock | GPU: EVGA Geforce 560Ti 448 Core Classified Ultra | RAM: Mushkin Enhanced Blackline 8GB DDR3 1600 | SSD: Crucial M4 128GB | HDD: 3TB Seagate Barracuda, 1TB WD Caviar Black, 1TB Seagate Barracuda | Case: Antec Lanboy Air | KB: Corsair Vengeance K70 Cherry MX Blue | Mouse: Corsair Vengeance M95 | Headset: Steelseries Siberia V2

 

 


Does this mean we're going to 120/144 already? Because that better damn well be what he means.

They mean making 30fps the standard

I run my browser through NSA ports to make their illegal jobs easier. :P
If it's not broken, take it apart and fix it.
http://pcpartpicker.com/b/fGM8TW


I'm not surprised. Let's see how many people (developers) board the 30 FPS Bullshit Train. I read that The Evil Within is also locked at 30 FPS.


If it's too hard for you guys to do your job and develop a good game, then it's too hard for me to delve into my pocket and give you my money.


The guys behind The Witcher: we'll give the consoles what they can handle, but we won't gimp the PC to fit the console. Both platforms will get the best we can manage; of course the PC version will be better, since we can manage more there.

Ubisoft: Herp Derp we are inept developers and can't exploit anything regardless of the hardware it runs on. 

 

I'm not even going to blame the console hardware. Console hardware HAS ALWAYS been crap. That was never the issue. 

 

Developers are just getting very lazy. They don't want to put in extra work. They want to repackage their work as DLC (looking at you, Bungie and Destiny) or they want to make bullshit excuses about parity just to hide the fact that they can't code their way out of a paper bag. 

 

Fucking hacks are what is holding the industry back. Laziness.

 

Not hardware. Some devs have the right attitude towards the consoles. They do what they can, where they can, but they don't gimp the PC version just because they feel lazy. As someone who games on three platforms, I appreciate when a PC release isn't completely fucked up (see: Shadow of Mordor). Sure, I bought SoM on my Xbox instead of PC, but that's just my personal preference on the matter.

 

I appreciate the choice. I appreciate having the effort put in no matter what platform. At least make me believe you tried. Don't give BS excuses as to why. Do you think most rational and sane PC/Console gamers give a shit about one platform exceeding another? 

 

I sure as hell, as a consumer who spends hundreds a year on gaming without even counting the hardware, don't care. I don't care that my PC stomps my PS4, which stomps my Xbox, which stomps the Wii U (which I don't have, but let's keep the example going). I just don't care. I care that the games are fun, first and foremost. I care that the effort was put in to make a proper port that isn't held back for arbitrary reasons.

 

I just like playing games! 

 

And then these assholes come in with bullshit justifications to cover up their pathetic skills as game makers. 

Dude... Well said... Just that *starts applauding*


I think even 30 fps is a stretch, let's drop it further to 24 fps, or even 15 fps so we can get the true cinematic experience. The lower the better, am I right?

 

/s


Either this guy was paid to say that, or there's only one word to describe him: idiot.


Honestly, if they want to drop the framerate, at least make it optional with a slider: low details at 60 FPS, or high details at 30 FPS. Sure, the next-gen consoles are crap. There are lots of bad and outdated PCs out there too, but nobody complains about that, because on PC you get to choose between framerate and detail settings (or just have both if you have a good machine).

i5 4570/ASUS Z87-A/8GB Corsair Veangeance DDR3-1600/GTX 770/A-DATA SP600/EVGA SuperNova NEX 650G/NZXT Source 530


And no, I don't agree that smoother framerates make gameplay better (first-person shooters being the only exception, where it's necessary).

How does it not make the game better? Sure it doesn't make the *gameplay* better but it does make the game better.

30 FPS is laggy as hell. Seriously, go play Quake 3 at 60 FPS and then cap it at 30 FPS. It is night and day in terms of smoothness. At 30 FPS it looks like ass and is extremely unresponsive, to the point of being unplayable.

The exact same thing happens in all games, not just FPS. It's just that it's more important in FPS games like Quake because it's far more sensitive to input lag. The animations being choppy is the same no matter the game though.
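To put rough numbers on that smoothness gap: frame time (how long each frame sits on screen, and a floor on how quickly your input can show up) is just the reciprocal of the frame rate. A quick sketch in Python, illustrative arithmetic only, not tied to any particular engine:

```python
# Frame time is the reciprocal of the frame rate: it's how long each
# frame stays on screen, and a lower bound on input-to-display delay.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

for fps in (30, 60, 144):
    print(f"{fps:>3} fps -> {frame_time_ms(fps):5.1f} ms per frame")
```

Dropping from 60 to 30 fps doubles the frame time from ~16.7 ms to ~33.3 ms, which is exactly why a 30 fps cap feels so much less responsive.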


How does it not make the game better? Sure it doesn't make the *gameplay* better but it does make the game better.

30 FPS is laggy as hell. Seriously, go play Quake 3 at 60 FPS and then cap it at 30 FPS. It is night and day in terms of smoothness. At 30 FPS it looks like ass and is extremely unresponsive, to the point of being unplayable.

The exact same thing happens in all games, not just FPS. It's just that it's more important in FPS games like Quake because it's far more sensitive to input lag. The animations being choppy is the same no matter the game though.

 

 

And no, I don't agree that smoother framerates make gameplay better (first-person shooters being the only exception, where it's necessary).

 

It makes the game better only for a particular type of game. Other types of games aren't competitive enough for it to be a problem. 

Interested in Linux, SteamOS and Open-source applications? Go here

Gaming Rig - CPU: i5 3570k @ Stock | GPU: EVGA Geforce 560Ti 448 Core Classified Ultra | RAM: Mushkin Enhanced Blackline 8GB DDR3 1600 | SSD: Crucial M4 128GB | HDD: 3TB Seagate Barracuda, 1TB WD Caviar Black, 1TB Seagate Barracuda | Case: Antec Lanboy Air | KB: Corsair Vengeance K70 Cherry MX Blue | Mouse: Corsair Vengeance M95 | Headset: Steelseries Siberia V2

 

 


Honestly, in my opinion the framerate should be prioritized over graphical quality


Honestly, in my opinion the framerate should be prioritized over graphical quality

 

Yes.

 

To quote the developer, "you don't gain that much" from stuffing a scene with hundreds of characters, and with texture and mesh quality a console can't handle.


What's going to happen when the consoles start supporting VR? We're already seeing Project Morpheus being developed by Sony, and no doubt Microsoft will eventually follow suit. But games are downright unplayable at 30fps in VR, because (at least from what I've heard) the increased lag induces motion sickness. Even 60fps is supposedly not that great; we should be looking at something like 75-90fps for VR. On PC I'd happily turn down some details to achieve that if I had a Rift (I'm only on a 760, after all), but console players will be relying on the devs to take notice. Otherwise there are going to be a lot of complaints...


Honestly, I've played at framerates from 30 through to 60, and there is a difference between the two. However, the difference, in my opinion, is not even worth it. As long as the game plays well, looks decent, and you enjoy it, why does the FPS have to matter?

Gaming PC: Case: NZXT Phantom 820 Black | PSU: XFX 750w PRO Black Edition 80Plus Gold (Platinum) | CPU: Intel Core i5 4690K | CPU Cooler: BE QUIET! Dark Rock Pro 2 | MB: ASUS Sabertooth Z97 Mark S | RAM: 24GB Kingston HyperX and Corsair Vengeance 1866MHz | GPU: MSI R9 280X 3G | SSD: Samsung 840 Evo 250GB | HDD: 9TB Total | Keyboard: K70 RGB Brown | Mouse: R.A.T MMO7

Laptop: HP Envy 15-j151sa | 1920x1080 60HZ LED | APU: AMD A10-5750M 2.5GHZ - 3.5GHZ | 8GB DDR3 1600mhz | GPU: AMD  HD 8650G + 8750M Dual Graphics | 1TB SSHD

 


Something Linus pointed out in his latest G-Sync monitor review (that Acer 4K monitor) is that gameplay feels relatively smooth at about 45fps. Maybe this is the answer for consoles? Obviously G-Sync isn't going to work for them, and FreeSync will still require a different monitor plus console hardware that supports it over a DisplayPort connection, but maybe the PS5 (or whatever the next Xbox will be called) will make use of this...? Wishful thinking, I suspect...


Something Linus pointed out in his latest G-Sync monitor review (that Acer 4K monitor) is that gameplay feels relatively smooth at about 45fps. Maybe this is the answer for consoles? Obviously G-Sync isn't going to work for them, and FreeSync will still require a different monitor plus console hardware that supports it over a DisplayPort connection, but maybe the PS5 (or whatever the next Xbox will be called) will make use of this...? Wishful thinking, I suspect...

Or MS/Sony could grow a set and spend some money on real hardware...

Current System - Intel Core i7-3770k @ 4.5GHz - 16GB Corsair Vengeance DDR3-1600 - Corsair H110i GT - 2x EVGA GTX 970 FTW+ in SLI - XFX Pro Series Black Edition 1250W - Samsung 840 EVO 128GB Boot SSD - WD Green 2TB Mass Storage HDD - Fractal Design Define S Windowed Edition with Green LED Lighting provided by 2 Bitfenix Spectre PRO 140mm fans, and 2 Corsair SP140 Green LED fans - Samsung U28D590D 4K Main Monitor with BenQ GW2265 1080p Side Monitor


Pff... that's just stupid. Any PC nowadays can easily break the 60 fps barrier. That's what happens when the hardware used in a console is years behind PC specs (even 3-year-old PCs can handle a lot of modern titles at 60+ fps with better graphics than a console).

 

If fps is just a number... let's go back to the Doom days and AGP cards, right?

 

If a console can't break the 60fps barrier in 2014, then they need to work harder... or just give up.

 

In my case, I can't play a game if it's not above 50fps ._. (don't ask me why or how... I just can't... it looks slow to me, with graphical stuttering), so consoles are not my thing.

 

Ohh Ubisoft... you are digging your own grave at the moment... lol

Spacebar bestbar

-blocko


They mean making 30fps the standard

Your reading skills are excellent.



It seems very few of you know that, with the large majority of TVs having high input lag (>45 ms), it makes little difference to have more than 30 FPS on those televisions. TVs aren't like monitors. I don't think there's an accurate survey of TVs' input lag when connected to a console, but from what I noticed when shopping for a new TV in 2013, most TVs' input lag was around 100-133 ms across most price ranges. So yes, 60 FPS is a must-have for PCs connected to monitors, but for consoles (or even PCs) connected to TVs, it's essential to buy a TV with less than 45 ms of input lag if you want to even notice 60 FPS.
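To illustrate that point with rough numbers (the figures are hypothetical, and the additive model is a simplification; a real pipeline has more stages like the engine, OS, and controller), total perceived delay is approximately the display's input lag plus one frame time, so a ~100 ms TV dwarfs the ~16.7 ms you gain going from 30 to 60 FPS:

```python
# Simplified model: perceived delay ≈ display input lag + one frame time.
# Illustrative numbers only; real-world latency has more contributors.
def total_latency_ms(display_lag_ms: float, fps: float) -> float:
    return display_lag_ms + 1000.0 / fps

for display, lag in (("low-lag monitor", 10), ("typical 2013 TV", 115)):
    for fps in (30, 60):
        print(f"{display} @ {fps} fps: ~{total_latency_ms(lag, fps):.0f} ms")
```

On the monitor, 60 fps cuts total delay almost in half (~27 ms vs ~43 ms); on the laggy TV it's only about a 12% improvement (~132 ms vs ~148 ms), which is why the frame-rate bump is so much harder to notice there.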

There are 10 types of people in the world. Those who know binary and those who don't.


I can deal with 30 fps if I have to, but the game better damn well be at least 1080p. At the very least.


No one really knows what happens in game development, or in console development. No one really knows where the bottlenecks lie.

Game development is not magic. Hardware manufacturing is not magic. Taking advantage of the knowledge offered by each industry is also not magic.

 

We know how game development works (it is public knowledge; check your local library)

We know how marketing works (it is public knowledge; check your local library)

We know how the hardware manufacturing process is done (it is public knowledge; check your local library)

 

We have seen knowledge applied from both sides to game making. As a consequence, we witnessed one of the greatest milestones in game-making history: Crysis.

 

We know what happens in PC development. We know what happens in Console development. We know where the bottlenecks are.

It's public knowledge; check your local library.

SPAAAAAACE!!!

