Ubisoft Responds to Low Frame Rates in Assassin's Creed Unity [Updates]

ObscureMammal

Did anyone actually watch this video?

 

I'm sitting here wanting to punch my monitor because these guys don't get it. The way the game looks does not justify how badly the performance tanks. One of the co-hosts implied that players should just turn their resolution down from 1080p to 900p and it will run fine.

 

whatthefuckican'tevenrightnowplsstopthis

 

Well if you can't play at 1080p then maybe you should play at 900p. It is probably better than playing at 10-15fps. I think you are underestimating how much processing power goes into global illumination. Lighting is not an easy thing. Especially in an open world game.
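
To put rough numbers on that (a back-of-the-envelope sketch of my own, not anything from the video): dropping from 1080p to 900p cuts the pixels the GPU has to shade by almost a third, which is exactly where a shader-heavy effect like global illumination bites.

```python
# Rough arithmetic only: how much per-frame shading work a 1080p -> 900p drop
# saves, assuming the game is GPU/shader bound (an assumption, not a benchmark).

def pixel_count(width: int, height: int) -> int:
    """Pixels the GPU must shade each frame at this resolution."""
    return width * height

p1080 = pixel_count(1920, 1080)  # 2,073,600 pixels
p900 = pixel_count(1600, 900)    # 1,440,000 pixels

savings = 1 - p900 / p1080
print(f"1080p: {p1080:,} px | 900p: {p900:,} px | ~{savings:.0%} less shading work")
# ~31% fewer pixels per frame; on a fill-rate/shader-limited game that can be
# the difference between 10-15 fps and something playable.
```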

 

The men in the video have lost all credibility and have proven they are not a trusted source of information.

 

Care to explain why?

 

This pretty much illustrates the idiocy of Ubisoft. Visual quality is important, but only when the game is actually playable.

 

You could drop the settings. If a console can play it, a half-decent PC can play it. Drop the res if you have to.

Rig: i7 2600K @ 4.2GHz, Larkooler Watercooling System, MSI Z68a-gd80-G3, 8GB G.Skill Sniper 1600MHz CL9, Gigabyte GTX 670 Windforce 3x 2GB OC, Samsung 840 250GB, 1TB WD Caviar Blue, Auzentech X-FI Forte 7.1, XFX PRO650W, Silverstone RV02 Monitors: Asus PB278Q, LG W2243S-PF (Gaming / overclocked to 74Hz) Peripherals: Logitech G9x Laser, QPad MK-50, AudioTechnica ATH AD700


Well if you can't play at 1080p then maybe you should play at 900p. It is probably better than playing at 10-15fps. I think you are underestimating how much processing power goes into global illumination. Lighting is not an easy thing. Especially in an open world game.

 

 

Care to explain why?

 

 

You could drop the settings. If a console can play it, a half-decent PC can play it. Drop the res if you have to.

 

It's an unoptimized piece of shit. I don't have to be a game developer or know how to program to know this.

 

I shouldn't have to lower my resolution for a game that doesn't even justify the performance hit with how it looks.


There's a difference between good-looking, well-optimized games and good-looking, badly optimized games. Unity is the latter.

Muh rig: i7 4770k, Cooler Master Hyper 212 Evo, MSI Z87 G45, Kingston Hyper X Blu 8GB, Samsung 840 EVO 120 + WD Blue 1 TB, Asus GTX 770 2GB, Corsair 200r + 2x Corsair AF 120 Blue + 1x Stock corsair fan, Corsair TX650, LG 27EA33V IPS, Steelseries Sensei Raw + QCK mini, CM Quickfire Ultimate Blue.


http://www.pcgameshardware.de/Assassins-Creed-Unity-PC-258436/Specials/Test-Technik-Benchmarks-1142550/

 

FXAA is the only anti-aliasing option playable at 1080p on the top-of-the-line Nvidia card (which is frankly laughable). It's the same Nvidia GameWorks sabotage of AMD GPUs we saw in the last Batman game and Watch Dogs. The R9 280, which dwarfs a console GPU (more than 3-to-1 in the case of the Xbox One, and shares the same GCN architecture), has almost the same FPS lows as a 750 Ti, a card nowhere near it in raw power.

 

[Benchmark chart: Assassin's Creed Unity GPU benchmarks at 1080p]

 

Ubisoft saying AMD has optimization to do and that they are in touch with them is a sham. Nvidia controls the GameWorks library and AMD has no access to it unless Nvidia allows it. AMD literally has to play guessing games, and optimization is impossible.

 

http://www.extremetech.com/extreme/173511-nvidias-gameworks-program-usurps-power-from-developers-end-users-and-amd

 

At this point, with all the Ubisoft lies and Nvidia FUD? I hope AMD cripples Nvidia performance on all upcoming EA titles. Maybe if AMD starts being as ruthless and pulling the same BS Nvidia does, Nvidia will stop. We sure as hell know that our corrupt press, with Nvidia ads running in the background of their webpages, isn't going to tell us what is going on while they let Ubisoft spout absolute fairytales about AMD being allowed access to the game to optimize it.

 

Cancelled my preorder of The Witcher 3. Not dealing with this crap. Nvidia/Ubisoft are literally stealing money out of my pocket as far as GPU performance goes. Ubisoft wants to talk about piracy all the time? Sure thing. I hope people pirate the hell out of GameWorks games. They deserve it.

 

If Nvidia wants a closed system on what is an open platform? Here is a thought: go make a console, make exclusive games, and get the hell out of PC gaming.

CPU:24/7-4770k @ 4.5ghz/4.0 cache @ 1.22V override, 1.776 VCCIN. MB: Z87-G41 PC Mate. Cooling: Hyper 212 evo push/pull. Ram: Gskill Ares 1600 CL9 @ 2133 1.56v 10-12-10-31-T1 150 TRFC. Case: HAF 912 stock fans (no LED crap). HD: Seagate Barracuda 1 TB. Display: Dell S2340M IPS. GPU: Sapphire Tri-x R9 290. PSU:CX600M OS: Win 7 64 bit/Mac OS X Mavericks, dual boot Hackintosh.


This game looks worse than Crysis 3... it's poorly coded. ADMIT IT UBISOFT

 

 

stupid idiots...

"If a Lobster is a fish because it moves by jumping, then a kangaroo is a bird" - Admiral Paulo de Castro Moreira da Silva

"There is nothing more difficult than fixing something that isn't all the way broken yet." - Author Unknown


Intel Core i7-3960X @ 4.6 GHz - Asus P9X79WS/IPMI - 12GB DDR3-1600 quad-channel - EVGA GTX 1080ti SC - Fractal Design Define R5 - 500GB Crucial MX200 - NH-D15 - Logitech G710+ - Mionix Naos 7000 - Sennheiser PC350 w/Topping VX-1


It's an unoptimized piece of shit. I don't have to be a game developer or know how to program to know this.

 

I shouldn't have to lower my resolution for a game that doesn't even justify the performance hit with how it looks.

 

 

There's a difference between good-looking, well-optimized games and good-looking, badly optimized games. Unity is the latter.

 

Again, I think you are underestimating the power required for global illumination. They don't really go into too much detail on it, but it is only used in a couple of games, and basically it is an fps hog. Between that and the number of people in this game, I can understand why it is difficult to run.
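
To make the "fps hog" point concrete, here is a toy cost model (entirely my own illustration; the sample count is an assumption, and Unity's real GI is far more sophisticated than this):

```python
# Toy model: global illumination adds indirect-lighting samples on top of the
# direct lighting every pixel already needs. All numbers are illustrative.

PIXELS_1080P = 1920 * 1080

def shading_cost(pixels: int, gi_samples_per_pixel: int) -> int:
    """Relative lighting cost: 1 unit of direct light per pixel + GI samples."""
    return pixels * (1 + gi_samples_per_pixel)

direct_only = shading_cost(PIXELS_1080P, gi_samples_per_pixel=0)
with_gi = shading_cost(PIXELS_1080P, gi_samples_per_pixel=4)  # assumed count

print(f"GI at 4 samples/pixel ~= {with_gi / direct_only:.0f}x direct-only cost")
# Even a handful of indirect samples multiplies per-pixel lighting work, which
# is why almost no open-world game attempted real-time GI in 2014.
```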

 

 

Take the world of Unity for example: it lacks the wide open oceans of Black Flag, which required little in the way of horsepower, and instead opts for densely-packed cities and detail-filled set pieces (you may have seen a certain video online...). Within Paris there are massive buildings, accurately-recreated monuments, thousands of on-screen civilians, seamlessly-accessible interiors, and a whole lot more. In comparison, Black Flag's biggest land locations feature a few dozen civilians at any one time, fewer buildings, no interiors, and a lot less of everything else. Unity's developers could have dialed up the level-of-detail settings and turned a cathedral into a featureless box, and they could have removed all civilians, effects, and Global Illumination lighting, but they chose not to, because that would ruin the look and feel they designed for their game. Therefore, a powerful GPU, faster than the one required for Black Flag's minimum settings, is required.

 

http://www.geforce.com/whats-new/guides/assassins-creed-unity-graphics-and-performance-guide

 

I haven't played the game yet so I can't really say for sure. I am not going to slate the game purely because it is difficult to run.

Rig: i7 2600K @ 4.2GHz, Larkooler Watercooling System, MSI Z68a-gd80-G3, 8GB G.Skill Sniper 1600MHz CL9, Gigabyte GTX 670 Windforce 3x 2GB OC, Samsung 840 250GB, 1TB WD Caviar Blue, Auzentech X-FI Forte 7.1, XFX PRO650W, Silverstone RV02 Monitors: Asus PB278Q, LG W2243S-PF (Gaming / overclocked to 74Hz) Peripherals: Logitech G9x Laser, QPad MK-50, AudioTechnica ATH AD700


 

Again, I think you are underestimating the power required for global illumination. They don't really go into too much detail on it, but it is only used in a couple of games, and basically it is an fps hog. Between that and the number of people in this game, I can understand why it is difficult to run.

 

 

I haven't played the game yet so I can't really say for sure. I am not going to slate the game purely because it is difficult to run.

 

 

So are you suggesting that Nvidia GameWorks isn't at all responsible for the poor framerate here?


So are you suggesting that Nvidia GameWorks isn't at all responsible for the poor framerate here?

 

In what regard? You mean the tools they built for game devs to use are poorly coded? Because those tools are used in other games too. Or do you mean with AMD cards?

 

I thought you were saying Ubi is at fault. So is it Ubi or Nvidia?

 

I am not suggesting anything. I am just not going to write them off based on practically no information. Have you played the game?

Rig: i7 2600K @ 4.2GHz, Larkooler Watercooling System, MSI Z68a-gd80-G3, 8GB G.Skill Sniper 1600MHz CL9, Gigabyte GTX 670 Windforce 3x 2GB OC, Samsung 840 250GB, 1TB WD Caviar Blue, Auzentech X-FI Forte 7.1, XFX PRO650W, Silverstone RV02 Monitors: Asus PB278Q, LG W2243S-PF (Gaming / overclocked to 74Hz) Peripherals: Logitech G9x Laser, QPad MK-50, AudioTechnica ATH AD700


In what regard? You mean the tools they built for game devs to use are poorly coded? Because those tools are used in other games too. Or do you mean with AMD cards?

 

I thought you were saying Ubi is at fault. So is it Ubi or Nvidia?

 

A popular assumption is that Nvidia's GameWorks library exists to cripple AMD's performance in the games that implement the tools from said library. Are you saying that this library is not responsible for the framerate, but rather the engine's own lighting technique?

 

I'm not saying one way or another in this post, I'm just trying to get an answer out of you :P


A popular assumption is that Nvidia's GameWorks library exists to cripple AMD's performance in the games that implement the tools from said library.

 

 

If that is the case then they are doing a pretty bad job. Isn't it just as bad on Nvidia cards as it is on AMD? You'd think they'd get it right the first time around if they were trying to cripple AMD. :P

 

 

Are you saying that this library is not responsible for the framerate, but rather the engine's own lighting technique?

 

I am assuming that is the case but I can't give a definitive answer. All I do know is the game has packed a whole load of stuff into it that we don't usually see. That could account for the lack of high detail in some textures. (purposely vague :D)

 

 

I'm not saying one way or another in this post, I'm just trying to get an answer out of you :P

 

I'll take that as a 'no' then? :P

Rig: i7 2600K @ 4.2GHz, Larkooler Watercooling System, MSI Z68a-gd80-G3, 8GB G.Skill Sniper 1600MHz CL9, Gigabyte GTX 670 Windforce 3x 2GB OC, Samsung 840 250GB, 1TB WD Caviar Blue, Auzentech X-FI Forte 7.1, XFX PRO650W, Silverstone RV02 Monitors: Asus PB278Q, LG W2243S-PF (Gaming / overclocked to 74Hz) Peripherals: Logitech G9x Laser, QPad MK-50, AudioTechnica ATH AD700


If that is the case then they are doing a pretty bad job. Isn't it just as bad on Nvidia cards as it is on AMD? You'd think they'd get it right the first time around if they were trying to cripple AMD. :P

 

Yup. Everyone who is hellbent on GameWorks being the issue is ignoring that, though. 

 

I am assuming that is the case but I can't give a definitive answer. All I do know is the game has packed a whole load of stuff into it that we don't usually see. That could account for the lack of high detail in some textures. (purposely vague :D)

 

Maybe. However, in Watch_Dogs, it was discovered that certain effects were turned off for PC. The modding community found these "hidden" files and "turned the effects back on", which neither hindered nor improved performance on most machines. It's odd that effects that would make the game look better overall would be turned off even though they had no impact on PC performance. The game still performed poorly on both vendors' cards, and the visuals did not justify the performance hit.

 

The same assumption could be made about AC:U.

 

I'll take that as a 'no' then? :P

I'm in the camp that says Nvidia isn't at fault; it's the team who made the poor decisions on the visuals. The way the game looks does not justify the performance hit. PCPer compared AC:U's performance to the Crysis (1) release, when apparently even dual Nvidia 8800s could not hold a 30 FPS minimum. What they fail to realize is that Crysis is a beautiful game, and one could argue it justified the performance hit, so turning settings down to make it playable was reasonable. Suggesting that we should just turn down the resolution on AC:U to make it playable is not an answer the PC community should accept. The game's visuals do not (again) justify the performance hit.

 

I do not have to be a programmer or a game developer to see this.


Doesn't playing the game offline double your fps now?

Muh rig: i7 4770k, Cooler Master Hyper 212 Evo, MSI Z87 G45, Kingston Hyper X Blu 8GB, Samsung 840 EVO 120 + WD Blue 1 TB, Asus GTX 770 2GB, Corsair 200r + 2x Corsair AF 120 Blue + 1x Stock corsair fan, Corsair TX650, LG 27EA33V IPS, Steelseries Sensei Raw + QCK mini, CM Quickfire Ultimate Blue.


The men in the video have lost all credibility and have proven they are not a trusted source of information.

 

Or maybe they're just wrong in this case...?


The game doesn't look that great. I have an i3, granted, but that's around what the new-gen consoles have, right? Plus, a 2GB GTX 760 shouldn't die at low settings on a 1200p monitor, and the difference between high and ultra only degrades performance a bit; so the CPU might be a bottleneck, but it shouldn't be. The devs went crazy with something like 5 different cloth simulations on every character, and that's what's bringing it down. They said themselves that the GPUs in the new consoles were fine but the CPUs were big bottlenecks, because they're doing way too much on an engine that isn't optimized yet.
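
To show why the CPU side adds up (every number here is an assumption for the sake of argument, not a Ubisoft figure):

```python
# Hypothetical budget math: per-NPC cloth simulation vs. a 30 fps frame budget.

FRAME_BUDGET_MS = 1000 / 30   # 33.3 ms per frame at 30 fps
npcs_on_screen = 1000         # assumed; Unity advertises crowds in the thousands
cloth_sims_per_npc = 5        # the "5 different cloth simulations" claim above
cost_per_sim_ms = 0.005       # assumed 5 microseconds per cloth update

cloth_ms = npcs_on_screen * cloth_sims_per_npc * cost_per_sim_ms
print(f"Cloth alone: {cloth_ms:.1f} ms of a {FRAME_BUDGET_MS:.1f} ms budget "
      f"({cloth_ms / FRAME_BUDGET_MS:.0%})")
# 25.0 ms of 33.3 ms (75%) eaten before AI, animation, draw calls, or physics
# get a slice -- tiny per-NPC costs explode at crowd scale.
```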

 

Ubisoft needs to fix a game that's disgustingly broken. I have a feeling the developers were warning that the game was far from finished but got pushed to release on time. Didn't they see the good that came from delaying Watch_Dogs? It had some bugs, but was nowhere near as broken as it would have been five months prior. Good thing I, um, borrowed my copy.


Well if you can't play at 1080p then maybe you should play at 900p. It is probably better than playing at 10-15fps. I think you are underestimating how much processing power goes into global illumination. Lighting is not an easy thing. Especially in an open world game.

 

 

Care to explain why?

 

 

You could drop the settings. If a console can play it, a half-decent PC can play it. Drop the res if you have to.

 

Funny thing is that even consoles are having issues with this. Good luck changing options on consoles.

Two revolutionary dance tones


Did anyone actually watch this video?

 

I'm sitting here wanting to punch my monitor because these guys don't get it. The way the game looks does not justify how badly the performance tanks. One of the co-hosts implied that players should just turn their resolution down from 1080p to 900p and it will run fine.

 

whatthefuckican'tevenrightnowplsstopthis

Yeah, playing at 720p runs fine on my computer, but for fuck's sake, I don't want a tiny window (or a badly pixelated full-screen image).


Funny thing is that even consoles are having issues with this. Good luck changing options on consoles.

Yeah; global illumination is taxing, but more so if you don't optimize it properly. If they'd spent time truly optimizing it, even just on consoles, they could have pushed even better graphics with better performance.

 

ALAS


Apart from all the already known issues...

 

Honestly, did they need that many NPCs on screen at once? Even cutting the count by a fifth could do some justice to CPU/GPU performance on both PC and console.

Plus, while other games use the same trick of reusing models in different clothing, they could have tried harder to make them more distinct, seeing as there are SO many NPCs in this game compared to others.
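
Here's a minimal sketch of the model-reuse trick being described (the mesh, outfit, and palette names are all made up for illustration; this is not Ubisoft's actual system):

```python
# Combinatorial NPC variety: a few base meshes x outfits x color palettes
# yields many distinct-looking NPCs while memory scales only with the meshes.
import itertools

body_models = ["male_a", "male_b", "female_a", "female_b"]   # assumed meshes
outfits = ["peasant", "merchant", "noble", "soldier", "clergy"]
palettes = ["brown", "grey", "blue", "red"]

variants = list(itertools.product(body_models, outfits, palettes))
print(f"{len(body_models)} meshes x {len(outfits)} outfits x {len(palettes)} "
      f"palettes = {len(variants)} distinct-looking NPCs")
# 80 visual variants from only 4 skinned meshes; pushing variety through the
# product instead of unique meshes is how crowds stay affordable.
```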

Maximums - Asus Z97-K /w i5 4690 Bclk @106.9Mhz * x39 = 4.17Ghz, 8GB of 2600Mhz DDR3,.. Gigabyte GTX970 G1-Gaming @ 1550Mhz

 


Did anyone actually watch this video?

 

I'm sitting here wanting to punch my monitor because these guys don't get it. The way the game looks does not justify how badly the performance tanks. One of the co-hosts implied that players should just turn their resolution down from 1080p to 900p and it will run fine.

 

whatthefuckican'tevenrightnowplsstopthis

 

I don't think you understand what they're saying. They're assuming that the game is optimized but current hardware is the bottleneck. In that case you may have to turn down your settings.

 

The internets are saying that the game is not optimized for PC (which I believe is true)... But in their argument they are assuming that the game is optimized for PC, because this is what they have been told by the game's developers. I don't think it's hard to understand.


I don't think you understand what they're saying. They're assuming that the game is optimized but current hardware is the bottleneck. In that case you may have to turn down your settings.

 

The internets are saying that the game is not optimized for PC (which I believe is true)... But in their argument they are assuming that the game is optimized for PC, because this is what they have been told by the game's developers. I don't think it's hard to understand.

 

"Oh, well just turn down your resolution" and "You're not a programmer or a game developer, so you don't know shit about their code".

 

I completely understand what they're saying.

 

I don't understand how one could assume the game is optimized, especially coming from the mouths of Ubisoft's PR.


I don't think you understand what they're saying. They're assuming that the game is optimized but current hardware is the bottleneck. In that case you may have to turn down your settings.

 

Ubisoft made this game in the future with future hardware. That explains everything. OK, time to go home, folks. /sarcasm


When you're interviewing these guys it probably sounds pretty genuine, and you would hope it is. He mentioned in his playthrough that he had no other issues, so this may have seemed a reasonable conclusion.

 

I'm sure they would be willing to admit they were wrong as all this new info comes out. 


Ubisoft made this game in the future with future hardware. That explains everything. OK, time to go home, folks. /sarcasm

 

What's wrong with pushing the limits of current tech (I'm not talking about AC here, but in general)? Have you heard of Crysis?

 

What point are you trying to make here?  


What's wrong with pushing the limits of current tech (I'm not talking about AC here, but in general)? Have you heard of Crysis?

 

What point are you trying to make here?  

Crysis was taxing because of the lush flora. All we have is speculation as to why AC runs so terribly. The PC version has worse textures than the Xbox One, and the game is "optimized" for the GTX 680 but still manages shit framerates on higher-end hardware.

 

How is Ubisoft pushing the envelope or the hardware here? What purpose does your white-knighting of Ubisoft serve? Why don't you direct your questions at them? You're not the only one who can ask questions.


Like I said above, I don't believe the game is optimized. I'm also not defending Ubisoft. 

 

Going by your post above: Why would the Crysis devs make such lush flora? Did they make it in the future with future hardware? Why would they do that? 

 

I also mentioned above (in the post you directly quoted) that the PCPer guys were assuming the code was OK and that current hardware was the bottleneck. They were assuming it was another Crysis situation. It has nothing to do with how poorly optimized the game actually is.

 

As for things that may push current hardware, they are mentioned in the article: 

 

  • There are tens of thousands of objects visible on-screen, casting and receiving shadows.
  • Paris is incredibly detailed. For example, Notre-Dame itself is millions of triangles.
  • The entire game world has global illumination and local reflections.
  • There is realistic, high-dynamic range lighting.
  • We temporally stabilized anti-aliasing.

 

Again, it doesn't help that these things have (probably) been poorly optimized. But if done well, I'm sure they could tax a decent rig.

 

Your sarcasm post was pointless...

