
Why Are Ubisoft Games Badly Optimized?

Khader87
4 hours ago, JZStudios said:

~3 years. It's been OC'd for maybe half that.

It could be time for an upgrade. You can get a minimum of $75.00 for it and a maximum of $125.00 if you're super patient. I saw one sell for $96.00 a week ago, so $100.00 is probably a good price to start with. Then you can put that $100, minus what it costs to ship it ($8.00-$13.00 or so), towards a better GPU. Maybe a 1060 3GB. I owned one for a year and a half and it's a 1080p BEAST. You can even dabble in 1440p with it. You can find a 1060 3GB VERY cheap right now USED. Only about $90.00-$130.00. I'd say that's a perfect trade-off in your situation. Swapping them out is such a breeze too.


On 10/18/2018 at 4:03 PM, Jurrunio said:

Denuvo and other anti-pirate stuff

It's not really that. Their developers just have more ambition than skill. People blame Denuvo because they want the problem to be something simple that they can change. Tests show that removing Denuvo has relatively little effect compared to the games simply being patched.

 

Make sure to quote or tag me (@JoostinOnline) or I won't see your response!

PSU Tier List  |  The Real Reason Delidding Improves Temperatures  |  "2K" does not mean 2560×1440


1 hour ago, A Random Dude said:

You can find a 1060 3GB VERY cheap right now USED. Only about $90.00-$130.00.

Probably a scam at that price. That's what a used 1050 should go for.



On 10/18/2018 at 10:49 PM, Khader87 said:

E.g. Assassin's Creed Odyssey is extremely badly optimized

Now that's not technically true. AC Odyssey isn't badly optimized; it just runs DRM software in the background, and that absolutely hammers the CPU.

Digital Foundry's testing suggests that even a Ryzen 7 1700X wasn't fast enough to avoid bottlenecking a GTX 1070, thanks to the DRM software.

They observed over 50% CPU usage on a modern 8-core/16-thread CPU while just standing on the beach and staring at the sea.
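For a sense of scale, here's a quick back-of-the-envelope sketch (my own arithmetic, not from Digital Foundry) of what that utilization figure implies:

```python
# Convert an aggregate CPU utilization percentage into the equivalent
# number of fully busy logical cores.
def busy_threads(total_utilization_pct: float, logical_cores: int) -> float:
    return total_utilization_pct / 100.0 * logical_cores

# 50% of a 16-thread CPU is the same amount of work as 8 logical cores
# pinned at 100% -- a lot for a scene where the player is standing still.
print(busy_threads(50.0, 16))  # 8.0
```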


24 minutes ago, black0utm1rage said:

Now that's not technically true. AC Odyssey isn't badly optimized; it just runs DRM software in the background, and that absolutely hammers the CPU.

Digital Foundry's testing suggests that even a Ryzen 7 1700X wasn't fast enough to avoid bottlenecking a GTX 1070, thanks to the DRM software.

They observed over 50% CPU usage on a modern 8-core/16-thread CPU while just standing on the beach and staring at the sea.

That doesn't mean it's because of DRM. It just points to bad coding.



8 hours ago, JoostinOnline said:

Probably a scam at that price. That's what a used 1050 should go for.

Nope, completely legit. They are USED and you bid on them before they reach higher prices. The $130.00 1050 you speak of is NEW.

 

https://www.ebay.com/itm/Gigabyte-GeForce-GTX-1060-Windforce-OC-3GB-GDDR5-Graphics-Card/132823940328?epid=8023644105&hash=item1eecec7ce8:g:Y-QAAOSwzc1bx6aD:rk:1:pf:0

 

https://www.ebay.com/itm/Gigabyte-GeForce-GTX-1060-Windforce-OC-3GB-GDDR5-Graphics-Card/132820083947?epid=16023510183&hash=item1eecb1a4eb:g:X-0AAOSwR~tbq8Nd:rk:3:pf:0

 

Check their feedback. They sell PC components every day.

 

Here's the PNY 1060 I saw that sold for $90.00 USED; it shows the serial number and everything...

 

https://www.ebay.com/itm/PNY-GeForce-GTX-1060-3GB-Graphics-Card-VCGGTX10603PB-/173588692983?epid=1381709076&hash=item286ab13ff7%3Ag%3AD3YAAOSwy~Jbw38V&nma=true&si=ajybMX%2Ba8pSYd6TJ6lty3jRxogI%3D&orig_cvip=true&nordt=true&rt=nc&_trksid=p2047675.l2557

 

Another one for $95.00... https://www.ebay.com/itm/EVGA-GeForce-GTX-1060-3GB-FTW/392144264978?hash=item5b4d9e9f12:g:RKkAAOSw6EZbu-6l

 

Another for $100.00... 

 

https://www.ebay.com/itm/ASUS-GeForce-GTX-1060-Dual-OC-3GB-Video-Card-/263898587102?epid=3019375128&hash=item3d7194dfde%3Ag%3AAvUAAOSwttlbgY5I&nma=true&si=ajybMX%2Ba8pSYd6TJ6lty3jRxogI%3D&orig_cvip=true&nordt=true&rt=nc&_trksid=p2047675.l2557

 

Another for $100.00, used for mining for 6 months, sold 2 days ago... 

 

https://www.ebay.com/itm/Gigabyte-Geforce-GTX-1060-3GB-Windforce-OC-Edition-Used-/123435081527?hash=item1cbd4ddf37%3Ag%3A62gAAOSwWotbrX9D&nma=true&si=ajybMX%2Ba8pSYd6TJ6lty3jRxogI%3D&orig_cvip=true&nordt=true&rt=nc&_trksid=p2047675.l2557

 

Want me to keep going until I get to $130.00, where they start becoming NEW? Any model you want sells inside the $90.00-$130.00 price range. Never underestimate the power of eBay.

 

Do you by chance want a 1070 for $90.00 off? Here's one for $300.00 BRAND SPANKING NEW with FREE shipping...

 

https://www.ebay.com/itm/ZOTAC-GeForce-GTX-1070-Mini-8GB-GDDR5-VR-Ready-Super-Compact-Gaming-Graphics-NEW/302925822019?epid=691332204&hash=item4687c93843:g:tgcAAOSwVx5byP8Q:rk:11:pf:0&LH_ItemCondition=1000

 

You can buy 5-6 of them if you want, they're available.

 

You can find pretty much anything on eBay with at least a 35-45% discount.

 

Not sure if you're interested but here ya' go... 

 

https://www.ebay.com/itm/Call-of-Duty-Black-Ops-4-2018-XBox-One-/323502999741?epid=245256094&hash=item4b524808bd%3Ag%3AdQEAAOSwiWZbxmwD&nma=true&si=ajybMX%2Ba8pSYd6TJ6lty3jRxogI%3D&orig_cvip=true&nordt=true&rt=nc&_trksid=p2047675.l2557

 

It arrived 1 week after the game was released. Retail is $64.80, so that's around 38-39% off, with FREE shipping. Imagine waiting just 1 month from now; the price would be even lower. It's the same story with other items. You can get a 1080 for just over $300 right now; you've just got to be the winning bidder or check eBay multiple times a day. There's also the magic of communicating with sellers who accept a best offer. You can talk them down soooo low.


10 hours ago, black0utm1rage said:

Now that's not technically true. AC Odyssey isn't badly optimized; it just runs DRM software in the background, and that absolutely hammers the CPU.

Digital Foundry's testing suggests that even a Ryzen 7 1700X wasn't fast enough to avoid bottlenecking a GTX 1070, thanks to the DRM software.

They observed over 50% CPU usage on a modern 8-core/16-thread CPU while just standing on the beach and staring at the sea.

I recently saw a video... maybe it was AdoredTV talking about how Nvidia basically bribed game companies. Anyway, Crysis 2 had the ocean constantly being rendered even underneath the city, where it's not visible at all, whereas most games would stop rendering it a few meters inland. They also had crazy high tessellation and geometry on things like concrete barriers, which ended up being kind of a big scandal. Maybe Far Cry is doing sort of the same thing.

https://techreport.com/review/21404/crysis-2-tessellation-too-much-of-a-good-thing/2

#Muricaparrotgang


13 hours ago, A Random Dude said:

It could be time for an upgrade. You can get a minimum of $75.00 for it and a maximum of $125.00 if you're super patient. I saw one sell for $96.00 a week ago, so $100.00 is probably a good price to start with. Then you can put that $100, minus what it costs to ship it ($8.00-$13.00 or so), towards a better GPU. Maybe a 1060 3GB. I owned one for a year and a half and it's a 1080p BEAST. You can even dabble in 1440p with it. You can find a 1060 3GB VERY cheap right now USED. Only about $90.00-$130.00. I'd say that's a perfect trade-off in your situation. Swapping them out is such a breeze too.

3 years is the lifespan for a modern GPU? Wow, PC gaming really is the master race.

Still works in every game other than Ubi titles though, which I have very little interest in, and a 3GB 1060 is frankly a stupid upgrade. It's barely better than a 960 at all, and I can't believe they even offer it in a 3GB flavor.



1 hour ago, JZStudios said:

3 years is the lifespan for a modern GPU?

It's not the lifespan that I'm talking about; you can get a good decade from a GPU provided you take good care of it. Overclocking could possibly take some of that lifespan away. I'm referring to your GPU keeping up with newer games. And yes, PC gaming is the master race, for pretty much every reason a console gamer can come up with.

1 hour ago, JZStudios said:

Still works in every game other than Ubi titles though

Sure it might work, but it won't produce the greatest performance.

1 hour ago, JZStudios said:

a 3gb 1060 is frankly a stupid upgrade. It's barely better than a 960 at all and I can't believe they even offer it in a 3gb flavor.

You'd be surprised just how much better a 1060 3GB is than a 960 2GB...

75% faster at ULTRA settings, to be exact. And this is just in Battlefield 1, a pretty recent and graphically demanding game. Lower the right settings (and I just so happen to have recently mastered BF1's graphics settings) and you're most likely clearing 100 fps. You won't get that from a 960 2GB. Never, ever underestimate the power of the GTX 1060 3GB. I absolutely adored that GPU for a year and a half. The only reason I sold it was that it was very high in value at the time; I got $285 for it and put that towards a BRAND NEW & SEALED 1070, which I got for a hell of a deal, only $245! Less than what I got for the USED 1060 3GB! Insane, right? Retail is for suckers.


44 minutes ago, A Random Dude said:

-Snip-

BF1 is actually decently optimized though. Note that it gets 45 fps on ultra, while GRWL and most other Ubi titles run on low for comparable results. The Ubi titles just aren't really pushing that much either. My entire point is that DICE, EA, Codemasters, CDPR, SEGA... I don't know, even Remedy with Quantum Break, all run better than the Ubi titles of the past 4-5 years.

GRWL goes so far as to remove EVERY shadow from the game, and it still runs like shit. There's no reason that this requires 4GB of VRAM.

 

[screenshot: GRWL at low settings]

 

I mean, those textures alone are total dogshit, and people have issues with textures loading in even on SSDs, which means their texture streaming implementation is hot garbage. That ground texture... I swear I've seen more texture in Super Mario 64.
[screenshot: Super Mario 64]

 

Ultra for reference (It's still not actually that great.)

[screenshot: GRWL at ultra settings]

 

Meanwhile the performance difference between Super Mario 64 and Ultra is as follows:

[benchmark chart: GRWL performance, low vs. ultra settings]

 

The game is just flat-out poorly optimized. They can't claim "Well, it's open world, so obviously it performs like a dumpster fire" when the foliage, textures, draw distance, and physics interactions are all worse than The Witcher 3 with HairWorks enabled, a game released 2 years prior.

You can't say "Well it looks good/better" because that shit obviously does not.

I linked someone to an article: https://rog.asus.com/articles/gaming/ghost-recon-wildlands-graphics-performance-guide/

And their rebuttal was some BS about how the game is CPU limited, or that it's using a 1080 so the performance doesn't change much, which is both mind-bogglingly stupid and further proof that the game is poorly optimized if it is CPU limited, since it frankly shouldn't be. There's nothing really going on that should make it so.



12 minutes ago, JZStudios said:

There's no reason that this requires 4gb of VRAM

It's not just the VRAM that it needs; you need a newer GPU. That's why I suggest a 1060 3GB, where you don't need 4GB of VRAM to run games better. I had GRWL when I had my 1060 3GB and I don't remember having any performance issues. Of course, I didn't play it for more than 10 minutes, but I do remember holding a steady 60 fps at high to near-ultra settings with vsync ON. I only had a 60 Hz monitor at the time, so I always used vsync to get a tear-free, locked 60 fps experience. I just wasn't into the game for some reason. I uninstalled it because it's a gigantic file size.


8 minutes ago, A Random Dude said:

-Snip-

I dunno. I'm almost inclined to redownload the 80GB of TW3 to get some screencaps at my mid-high settings compared to GRWL at low. It's the only game I have that "requires" 4GB of VRAM, and it's frankly bullshit.



1 hour ago, JZStudios said:

It's the only game I have that "requires" 4gb of VRAM and it's frankly bullshit.

It only requires 4GB at 1080p on high settings. At 720p on low settings, you only need 2GB of VRAM. To achieve 60 fps, that is. 

 

https://www.geforce.com/whats-new/articles/ghost-recon-wildlands-system-requirements

1 hour ago, JZStudios said:

I'm almost inclined to redownload the 80gb of TW3 to get some screencaps at my mid-high settings compared to GRWL at low.

These are two different games using different game engines developed by different developers. GRWL was also released almost 2 years after TW3. GRWL also uses the same game engine as Assassin's Creed, Prince of Persia, Shaun White Snowboarding, Assassin's Creed II, Prince of Persia The Forgotten Sands, Assassin's Creed Brotherhood, Assassin's Creed: Revelations, Assassin's Creed III, Assassin's Creed Liberation, Assassin's Creed Black Flag, Assassin's Creed Rogue, Assassin's Creed Unity, Assassin's Creed Syndicate, Tom Clancy's Rainbow Six Siege, Steep, For Honor, Assassin's Creed Origins and Assassin's Creed Odyssey. 

 

So if there's something to blame here, it could be the Anvil game engine. Almost all of these games have some sort of issue that you have to fix on your own.


Quote

The game is just flat out poorly optimized. They can't claim "Well it's open world, so obviously it performs like a dumpster fire." When the foliage, textures, draw distance, and physics interactions are all worse than The Witcher 3 with hairworks enabled and released 2 years prior.

The problem here is that texture resolution typically does not incur an appreciable performance penalty to begin with:


 

Spoiler

[benchmark charts: texture quality vs. performance in GTA V, Titanfall 2, Watch Dogs 2, Rise of the Tomb Raider, and Fallout 4]

 

 

Performance impact from higher texture quality is related more to how much VRAM you have than to how powerful your GPU is.
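A rough sketch of why that is (the helper and numbers below are my own illustration, not taken from the benchmarks above): a texture's cost is mostly its memory footprint, which scales with resolution and format, while the sampling work the GPU does per frame stays roughly constant.

```python
# Estimate the VRAM footprint of a single texture. A full mipmap chain
# adds roughly one third on top of the base level (1 + 1/4 + 1/16 + ... = 4/3).
def texture_vram_bytes(width: int, height: int,
                       bytes_per_texel: int = 4, mipmaps: bool = True) -> int:
    base = width * height * bytes_per_texel
    return base * 4 // 3 if mipmaps else base

# Doubling texture resolution quadruples VRAM use but adds essentially no
# shading work, so frame rate barely moves until VRAM runs out.
print(texture_vram_bytes(2048, 2048) / 2**20)  # ~21.3 MiB (uncompressed RGBA8)
print(texture_vram_bytes(4096, 4096) / 2**20)  # ~85.3 MiB
```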

 

Quote

You can't say "Well it looks good/better" because that shit obviously does not.

I linked someone to an article: https://rog.asus.com/articles/gaming/ghost-recon-wildlands-graphics-performance-guide/

And their rebuttal was some BS about how the game is CPU limited, or it's using a 1080 so the performance doesn't change much, which is both mind bogglingly stupid, and further proof that the game is poorly optimized if it is CPU limited, since it frankly shouldn't be. There's nothing really going on that should make it so.

One could argue that there's poor code design or some other issue with the software, but unless you actually have access to the source code, you can't really make that call.

 

Also, there are plenty of games that are CPU limited by design. Civilization is CPU limited by way of needing CPU power to calculate the computer players' turns. Ashes of the Singularity is CPU limited by way of needing to manage hundreds of units at once.

 

And unless you've done a thorough analysis of what the application is doing with the CPU, how can you even begin to say whether an application is poorly optimized? Because here's the definitive proof that the game isn't poorly optimized: it runs better on better systems than yours.

 

You want a game that is poorly optimized? The original Crysis.

 

16 hours ago, black0utm1rage said:

Now that's not technically true. AC Odyssey isn't badly optimized; it just runs DRM software in the background, and that absolutely hammers the CPU.

Digital Foundry's testing suggests that even a Ryzen 7 1700X wasn't fast enough to avoid bottlenecking a GTX 1070, thanks to the DRM software.

They observed over 50% CPU usage on a modern 8-core/16-thread CPU while just standing on the beach and staring at the sea.

Which doesn't imply there isn't a lot going on that requires the processor's attention. For instance, this is my CPU usage on 3DMark's Time Spy, just for the graphics tests:

[screenshot: CPU usage during 3DMark Time Spy graphics tests]

 

People like to point the finger at the DRM, but unless someone has benchmarks of a cracked or DRM-free version, or a CPU profile showing what spent time on the CPU, it's all just speculation at this point.

 

 


3 hours ago, A Random Dude said:

-Snip-

That's also been my point. The engine itself either doesn't work well, they don't optimize well, or both.

1 hour ago, M.Yurizaki said:

-Snip-

I understand that texture quality only really affects things once you run out of VRAM, but even still, GRWL seems to take a bigger hit than, say, DEMD. DEMD also doesn't require 3GB of VRAM for dogshit textures. Even in the benchmarks you posted, most games show an incremental, discernible difference from low to high. Just about every setting in GRWL, with many even being set to off, only has about a 1-2 fps impact each, if any change at all. Compare that to DEMD or TW3, which can both cripple my 960 yet be tailored to run very smoothly around 60 fps. That never really seems to happen with GR.

 

Just because it runs on more powerful hardware doesn't make it well optimized. Nor is that a valid line of thought. That's like saying "This square peg doesn't fit in this round hole" and responding with "Yes it will, get a bigger hammer." Brute-forcing something to "work better" doesn't mean it worked well to begin with. Crysis actually runs just fine on my PC, even if it is optimized like shit, and y'know, it also has more effects, physics calculations, and more intelligent NPCs than GR. If you can explain to me how making GR purposefully CPU limited is a good optimization tactic, I'll give you a million dollars.

Just out of curiosity, are Monster Hunter World and Saints Row 2 also well-optimized games? I'm just kind of curious, because you go on defending games that just run like shit. Or what about the 3 earlier Forza titles? Were those really perfect?

Of the games A Random Dude listed, I know that for most people the Assassin's Creed games have consistently performed poorly, as has For Honor. I haven't really looked into R6S, but I wouldn't be surprised if Ubisoft puts more resources into it, since it's still actively making money and getting additions. I do know that, like most Ubi titles, it received a MASSIVE downgrade, so. It also uses very condensed maps, which are genuinely easier to optimize for.



Rainbow Six Siege, from my experience, is excellently optimized. I don't know why I haven't played it in like a year. I should, since I haven't played it with my 1070 yet. As for the rest of these recent comments, it's getting a bit too technical for my expertise. Lots of valid points though.


Can't believe no one has mentioned that Assassin's Creed Origins/Odyssey use VMProtect on top of Denuvo. VMProtect runs the protected code in a virtual machine, so it's basically like playing Origins/Odyssey through an emulator, which is apparently what's causing people's CPUs to go crazy. 

 

Not sure about the other Ubisoft games; they always perform well for me. Sure, they use Denuvo, but a lot of games use it and it's not that bad.
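A toy illustration of why code virtualization is expensive (this is my own minimal sketch of the general technique, not VMProtect's actual design): instead of the CPU executing an operation directly, an interpreter loop fetches, decodes, and dispatches custom bytecode, paying several branches and memory accesses per original instruction.

```python
# Minimal stack-machine interpreter: each tuple is one bytecode instruction.
def run_vm(bytecode):
    stack = []
    for op, *args in bytecode:
        if op == "push":
            stack.append(args[0])
        elif op == "add":
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        else:
            raise ValueError(f"unknown opcode: {op}")
    return stack[-1]

# Natively this whole program would be a single ADD instruction; under the
# interpreter it costs a loop iteration, a dispatch, and stack traffic per step.
print(run_vm([("push", 2), ("push", 3), ("add",)]))  # 5
```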


13 hours ago, JZStudios said:

-Snip-

At this point I'm finding you're arguing based on your system, disregarding the experiences and reports everyone else is making. Just because an application runs like horse poop on your machine doesn't mean it's the fault of the application.

 

Besides that, optimization isn't a cut-and-dried thing. There are ways developers think they are optimizing when they're really not:

https://blogs.msdn.microsoft.com/oldnewthing/20041216-00/?p=36973 (Pointing out that reducing what looks like three assembly instructions down to one actually worsens performance due to how the processor works with the instruction chosen)

https://en.wikipedia.org/wiki/XOR_swap_algorithm#Reasons_for_avoidance_in_practice (article on the XOR swap, which in higher level languages looks "optimized" because you're not creating a temporary variable to swap two variables, but in practice, may be slower due to forced linearization of code)

https://blogs.msdn.microsoft.com/oldnewthing/20171117-00/?p=97416 (Explaining that trying to optimize one thing doesn't necessarily mean wide-reaching improvements)
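For reference, the XOR swap from the linked article looks like this (sketched in Python): it avoids the temporary variable, but each step depends on the previous one, so the three XORs can't execute in parallel, and on modern hardware a plain temporary-variable swap is usually at least as fast and far clearer.

```python
# XOR swap: exchanges two integers without a temporary variable.
def xor_swap(a: int, b: int) -> tuple[int, int]:
    a ^= b
    b ^= a  # b now holds the original a
    a ^= b  # a now holds the original b
    return a, b

print(xor_swap(3, 5))  # (5, 3)
```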

 

And there's the problem that if you try to maximize how optimized your code is, you'll end up with code that's even harder to maintain and debug. If it comes down to performance or maintainability, then as long as I'm meeting the performance requirement, I'd like my code to be maintainable. As one of the most renowned computer scientists, Donald Knuth, said:

Quote

"We should forget about small efficiencies, say about 97% of the time: premature optimization is the root of all evil. Yet we should not pass up our opportunities in that critical 3%"

 

The problem I'm seeing with this topic and topics like it is that all we have are black boxes to play with. There are obvious signs of poor optimization (or rather, I should say poor design, because "poor optimization" is an oxymoron), but in situations like this, no, there is no obvious sign. Sure, you can probably find people on Steam or wherever complaining about the performance of the game on their system, but then you have a lot of people who will say the game runs just fine on theirs. And the most damning thing is, if you can find other people's benchmark results, preferably close to your hardware configuration, that show something different from what you're getting, then maybe it's you, not them.

 

But yeah, all in all, topics like these make me think people want to find some root of all evil that makes their games perform worse than they supposedly should, rather than accept that maybe, just maybe, their system really isn't all that great and the developers have moved on from caring about what they have.

 

EDIT: Also, optimization implies there's a baseline of performance, and in almost all cases there is no baseline we can reference, except when a game patch claims improvements were made. For developers, however, there is one: a performance requirement. But we don't typically have access to those either.


3 hours ago, M.Yurizaki said:

-Snip-

You're missing the point that it's not just my system. A lot of Ubi games run like crap on a lot of systems. Even if it were solely my system, it's odd that almost every Ubi game runs like turds when everything else runs fine and requires fewer resources to do so. I understand how a long string of complex calculations... or rather one long, complex calculation... can run slower than a string of simple ones, but I don't even know why that was brought up.

What I can say, though, is that I do somewhat follow the development of BeamNG, and they group-selected and deleted somewhere around 40,000 lines of code to rewrite them for optimization, and it actually worked significantly well. What does either of these things have to do with Ubisoft games, though?

 

I know my system isn't great, but comparatively, it performs roughly equivalent to my brother's, which is both newer and more powerful by measurable standards. There are also lots of instances of other people with more powerful hardware where it just doesn't perform well. My 960 should be absofuckinglutely obliterated by a 1080 and a top-end CPU, yet somehow it manages to be shockingly close.

You just want to argue that there's no such thing as "poor optimization" or "bad design" like what's-his-nut said, but it's horseshit. When one person makes a program doing the same base function as another's, but it performs worse across all systems, the optimization fucking sucks. Or the implementation. Whatever the fuck you want to call it.

 

Ubisoft games just DO NOT perform at a respectable level for what the base function is compared to others in the field. It's not like there are other systems with 960s or 8350s that are getting a million fps and for some fucking reason I'm the only one in the world that's not. The games just require way more resources than the competitors' despite producing measurably worse results.

I don't give a flying FUCK what the code looks like. It's irrelevant. Does SpaceX give a shit what the internal components of its rockets are made of? No. Insider tip, Elon just says "Here's a literal fuck ton of money, get shit made." The end user doesn't give a shit what the fucking code looks like, he just wants it to take off, hit space, do whatever it was sent to do, then come back down and land in synchronous pairs to be reused. Who gives a shit that they use fucking cork for a heat shield? That's the engineers job.

Even if he said "Okay, that's expensive, see if we can do it for less money," it's STILL not his job; it's the engineers' job to figure out how to optimize materials for weight and code for calculation accuracy.

 

What you're doing is watching the Russian rocket blow up on the launchpad and going "Well it's the customers fault, because it blew up on his launch pad and doesn't work as well as Falcon9." Where the hell does that logic come from?

#Muricaparrotgang

Link to comment
Share on other sites

Link to post
Share on other sites

  • 3 months later...

It's odd. I'm playing this game currently on high settings at 1080p with a 1080 Ti. I'm trying to get a smooth frame rate since I have a 144Hz panel, and even with a 1080 Ti and an 8700K I can't get above 70fps in most scenarios. It's bad optimisation imo. Also, I have 16GB of dual-channel 2933 RAM, in case anyone was gonna ask.


Ubisoft, destroyer of grand game ideas… HoMM 6, brrrr… I do have to add that their swordfighting game is greatly optimized. I mean, even on my Xeon X5650 combined with a 1050 Ti it behaved admirably.


Hi

 

I have 3 or 4 Assassin's Creed games and GRWL.

I consider the Assassin's Creed games ambitious. I have powerful hardware, so even at 4K I have no problem running them at 60fps on ultra.

 

GRWL looks like crap and runs like crap. Turning up the detail equals more crap. I consider it lazy, visually at least. The rest of the game is not bad.

 

In 1992, 3 months after building my 486 50MHz rig, I had to tear it down and install a Pentium to get more frames in Doom. Nothing has changed: if you want performance you have to upgrade, and if you don't, you get left behind. With consoles you at least get a few years between versions.

 

As they say. It is what it is.  


RIG#1 CPU: AMD, R 7 5800x3D| Motherboard: X570 AORUS Master | RAM: Corsair Vengeance RGB Pro 32GB DDR4 3200 | GPU: EVGA FTW3 ULTRA  RTX 3090 ti | PSU: EVGA 1000 G+ | Case: Lian Li O11 Dynamic | Cooler: EK 360mm AIO | SSD#1: Corsair MP600 1TB | SSD#2: Crucial MX500 2.5" 2TB | Monitor: ASUS ROG Swift PG42UQ

 

RIG#2 CPU: Intel i9 11900k | Motherboard: Z590 AORUS Master | RAM: Corsair Vengeance RGB Pro 32GB DDR4 3600 | GPU: EVGA FTW3 ULTRA  RTX 3090 ti | PSU: EVGA 1300 G+ | Case: Lian Li O11 Dynamic EVO | Cooler: Noctua NH-D15 | SSD#1: SSD#1: Corsair MP600 1TB | SSD#2: Crucial MX300 2.5" 1TB | Monitor: LG 55" 4k C1 OLED TV

 

RIG#3 CPU: Intel i9 10900kf | Motherboard: Z490 AORUS Master | RAM: Corsair Vengeance RGB Pro 32GB DDR4 4000 | GPU: MSI Gaming X Trio 3090 | PSU: EVGA 1000 G+ | Case: Lian Li O11 Dynamic | Cooler: EK 360mm AIO | SSD#1: Crucial P1 1TB | SSD#2: Crucial MX500 2.5" 1TB | Monitor: LG 55" 4k B9 OLED TV

 

RIG#4 CPU: Intel i9 13900k | Motherboard: AORUS Z790 Master | RAM: Corsair Dominator RGB 32GB DDR5 6200 | GPU: Zotac Amp Extreme 4090  | PSU: EVGA 1000 G+ | Case: Streacom BC1.1S | Cooler: EK 360mm AIO | SSD: Corsair MP600 1TB  | SSD#2: Crucial MX500 2.5" 1TB | Monitor: LG 55" 4k B9 OLED TV

