
Watch Dogs 2 PC benchmarks

Watch Dogs 2 released on current-gen consoles 2.5 weeks ago; the PC version was delayed so Ubisoft could better tailor the game to PC players, with a long list of planned additions and improvements (see here). Today the PC version is finally out, so the Mr. Robot simulator is put to the test with benchmarks! *Do note that this is an Nvidia GameWorks title.*

 

First, we'll start with Guru3D, who tested the game with the latest Nvidia and AMD drivers: http://www.guru3d.com/articles-pages/watch-dog-2-pc-graphics-performance-benchmark-review.html

 

1080p

[benchmark chart]

1440p

[benchmark chart]

 

4K

[benchmark chart]

VRAM usage

[VRAM usage chart]

i7 5960X vs. FX 8370 

[two CPU comparison charts]

 

 

Now onto GamersNexus' benchmarks, also run with the latest Nvidia and AMD drivers: http://www.gamersnexus.net/game-bench/2700-watch-dogs-2-gpu-benchmark-11-video-cards

 

Performance scaling with the different visual presets

[preset scaling chart]

1080p Ultra

[benchmark chart]

1080p Very High

[benchmark chart]

 

1440p

[benchmark chart]

4K Very High

[benchmark chart]

If you are curious what the 1% and 0.1% Low numbers are about, they are largely self-explanatory: the average frame rate over the slowest 1% and 0.1% of frames, which exposes stutter that a plain average hides.
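For the curious, here is roughly how you would compute them from a frame-time log (my own sketch; GamersNexus' exact methodology may differ):

```python
import numpy as np

def fps_metrics(frame_times_ms):
    """Average FPS plus 1% and 0.1% lows from a frame-time log (ms).

    The 'lows' average the slowest 1% / 0.1% of frames, so a smooth
    run and a stuttery one no longer look identical on paper.
    """
    ft = np.sort(np.asarray(frame_times_ms, dtype=float))[::-1]  # slowest first
    avg_fps = 1000.0 / ft.mean()
    low_1 = 1000.0 / ft[: max(1, len(ft) // 100)].mean()
    low_01 = 1000.0 / ft[: max(1, len(ft) // 1000)].mean()
    return avg_fps, low_1, low_01
```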

 

Last up is TechPowerUp, who also used the latest drivers: https://www.techpowerup.com/reviews/Performance_Analysis/Watch_Dogs_2/

 

A video to drool over the sheer number of visual settings available [embedded video]

VRAM Usage

[VRAM usage chart]

 

1080p

[benchmark chart]

1440p 

[benchmark chart]

4K

[benchmark chart]

 

Disclaimer: As far as I know, all of these benches were done without Nvidia GameWorks features such as HFTS. The presets above will not enable GameWorks features unless you deliberately turn them on, and I believe these features are unavailable on the AMD side.

 

Apart from GameWorks features, there are even more options available past Ultra that can be used on both Nvidia and AMD hardware, as noted here by TechPowerUp (a quick sketch of the Pixel Density math follows the quote):

Quote

 

  • Field of View: Adjustable from 70° to 110°
  • Pixel Density: Lets you adjust the game's rendering resolution
  • Extra Details: Even at Ultra, you can push settings further with this slider, which controls the game engine's level-of-detail scaling with distance
  • Texture Resolution: Ultra is available only after downloading the free HD Texture Pack DLC
  • Temporal Filtering: Boosts performance greatly by rendering at half the resolution while the player is moving
  • MSAA: Available and can be combined with other post-process AA options, but comes with a huge performance hit
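To make the Pixel Density option concrete, here is a minimal sketch of how such a slider typically maps to render resolution (my assumption of a linear per-axis scale; Ubisoft's exact mapping may differ):

```python
def render_resolution(display_w, display_h, pixel_density):
    """Map a pixel-density slider to the internal render resolution.

    Assumes a linear per-axis scale, so shading cost grows with
    pixel_density squared: 0.5 at 4K shades about as many pixels
    as native 1080p.
    """
    return round(display_w * pixel_density), round(display_h * pixel_density)

# Renders at 4K internally, downsampled to a 1440p display:
print(render_resolution(2560, 1440, 1.5))  # (3840, 2160)
```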

 

 

In addition, the temporal AA the game uses is very similar to the variety used in Rainbow Six Siege: the game is rendered at half the selected resolution, then 2x MSAA is applied. The difference is negligible in still shots, but in motion it can look quite gnarly. Guru3D tested with Temporal Filtering disabled; for GamersNexus and TechPowerUp I am unsure.
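A quick back-of-the-envelope on why this is such a big win, assuming "half resolution + 2x MSAA" means half the pixels are shaded each frame while MSAA keeps full-resolution edge coverage (my reading of the technique, not confirmed by Ubisoft):

```python
def shaded_pixels(width, height, temporal_filtering=False):
    """Rough per-frame shading workload.

    With temporal filtering only half the pixels are shaded; 2x MSAA
    restores full-resolution coverage, and the rest is reconstructed
    from the previous frame, which is why artifacts show up in motion.
    """
    pixels = width * height
    return pixels // 2 if temporal_filtering else pixels

print(shaded_pixels(2560, 1440))        # 3,686,400
print(shaded_pixels(2560, 1440, True))  # 1,843,200 -> roughly half the shading cost
```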

 

Finally, something to note: the game uses an anti-cheat system called EasyAntiCheat that blocks RTSS (RivaTuner Statistics Server), even while offline. You can roll back to a very old RTSS version to get overlays working, or wait for EasyAntiCheat to whitelist RTSS. http://www.guru3d.com/news-story/update-watch-dogs-2-anti-cheat-system-blocks-rtss-overlay-software.html

This could also prevent modding, regardless of whether RTSS gets whitelisted.

 

My thoughts:

Seeing that this is a GameWorks title, AMD is pushed back a ways with Nvidia at the forefront. But the R9 Fury keeps up quite well! Interesting.

In general, though, this is a demanding title. Personally I think the game looks pretty slick given how demanding it is, but whether one can call it unoptimized... perhaps it is. I have looked around, and there is no news of a patch yet.

 

 


What? You need at least a 980 Ti for 60 FPS at Ultra? I'm glad I upgraded to a 1070 if future games are going to be this heavy.


[VRAM usage chart]

 

Wow, the Fury is capping it. Mind you, it seems the game is scaling badly, or rather has unnecessary VRAM usage: it uses up to 2 GB more VRAM on the 1060 when maxed, yet the Fury gets better FPS at all resolutions.

The ability to google properly is a skill of its own. 


2 minutes ago, Bouzoo said:

[VRAM usage chart]

 

Wow, the Fury is capping it. Mind you, it seems the game is scaling badly, or rather has unnecessary VRAM usage: it uses up to 2 GB more VRAM on the 1060 when maxed, yet the Fury gets better FPS at all resolutions.

I'd assume it's storing assets in VRAM instead of actually requiring 4+GB of VRAM in this case.
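Something like this toy cache, I would guess (purely illustrative, not the game's actual streaming code): keep assets resident until there is memory pressure, so "VRAM used" really means "VRAM used as cache".

```python
from collections import OrderedDict

class TextureCache:
    """Toy LRU texture cache: spare VRAM is filled with evictable
    assets, so reported usage exceeds what a frame strictly needs."""

    def __init__(self, budget_mb):
        self.budget_mb = budget_mb
        self.used_mb = 0
        self.resident = OrderedDict()  # name -> size_mb, oldest first

    def request(self, name, size_mb):
        if name in self.resident:
            self.resident.move_to_end(name)  # mark as recently used
            return
        # Evict least-recently-used assets only under actual pressure.
        while self.used_mb + size_mb > self.budget_mb and self.resident:
            _, freed = self.resident.popitem(last=False)
            self.used_mb -= freed
        self.resident[name] = size_mb
        self.used_mb += size_mb
```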

Check out my guide on how to scan cover art here!

Local asshole and 6th generation console enthusiast.


1 minute ago, Dan Castellaneta said:

I'd assume it's storing assets in VRAM instead of actually requiring 4+GB of VRAM in this case.

Well, it's evident that it doesn't require 4 GB+, but I don't see any upside to storing assets in this case.

The ability to google properly is a skill of its own. 


Just now, Bouzoo said:

Well, it's evident that it doesn't require 4 GB+, but I don't see any upside to storing assets. It's just forcing your card to work even harder.

Eh. I guess it'd be more useful on a slower hard drive?

Not sure, because if you look at the 1060's VRAM usage, it seems unusual at that point.

Check out my guide on how to scan cover art here!

Local asshole and 6th generation console enthusiast.


22 minutes ago, Bouzoo said:

Wow, the Fury is capping it. Mind you, it seems the game is scaling badly, or rather has unnecessary VRAM usage: it uses up to 2 GB more VRAM on the 1060 when maxed, yet the Fury gets better FPS at all resolutions.

Uhhh the Fury only comes with 4GB of VRAM.

 

36 minutes ago, huilun02 said:

As expected, the Ubisoft/Nvidia combination of doom.

A 980 Ti required to do 60 FPS @ 1080p Ultra... without GameShaft features, even.

I get better visuals and performance in BF1 64-player Conquest than this shit, with my old 290X.

They're back to digging, I guess...

When is this tired conspiracy stuff going to end?

 

8 minutes ago, Misanthrope said:

I think it's as we predicted: the best way to experience Watch Dogs 2 is to not buy it and install GTA V instead.

Because GTA 5 doesn't have its own performance problems.


25 minutes ago, ONOTech said:

I'm glad for you, but future games should NOT be this heavy. WD2 is just unoptimized crap and no dev team should follow suit. 

 

There should not be any game out right now, next year, or for the majority of the year after that requires at least a 980 Ti to maintain 60 FPS at 1080p. None.

 

You would think that after the crappy optimization of the original Watch Dogs, Ubisoft would have learned, but it seems they haven't.

We only have a few performance reviews right now. I'd wait a little longer than 8 hours after release for someone to actually show more data than what you can cram into a graph.

 

Besides, if Watch Dogs isn't your kind of game anyways, no sense in talking shit about it. Not saying you don't like the idea of the game, but I suspect a lot of the hate it gets comes from people who don't even like the games to begin with.

 

Annnnnd to respond to this one specific point....

29 minutes ago, ONOTech said:

There should not be any game out right now, next year, or for the majority of the year after that requires at least a 980 Ti to maintain 60 FPS at 1080p. None.

Why not? Do you not want video games to look better? Do you know how ignorant that is?


6 minutes ago, Kloaked said:

Uhhh the Fury only comes with 4GB of VRAM.

I know, that's why I said capping it. 

My point was that the game uses way more VRAM than needed, at all resolutions, for no FPS improvement.

The ability to google properly is a skill of its own. 


3 minutes ago, Bouzoo said:

I know, that's why I said capping it. 

My point was that the game uses way more VRAM than needed, for no FPS improvement.

Capping it?

 

And I don't understand your point, since VRAM isn't the primary bottleneck for performance when things work correctly (i.e., outside cases like the issues some people had with their 970).

 

Just now, huilun02 said:

There isn't one to begin with. You'd be an idiot not to notice the obvious pattern in GameWorks title performance, especially when made by Ubisoft. Every shit port has Nvidia's name on it.

They're laughing right now as people keep defending them and making GPU purchases that would otherwise be unnecessary.

Except there are games you probably haven't paid attention to that have Nvidia technology in them and run perfectly fine. This has nothing to do with GameWorks, and we don't even know for sure whether WD2 actually has performance problems relative to how it looks.


I remember trying to play Watch Dogs 1 with 2x 6990s in CrossFire...

This doesn't surprise me.

CPU: Amd 7800X3D | GPU: AMD 7900XTX


GTX 980 Ti needed for 1080p Ultra at >60fps.

 

[laughing GIF]

Project White Lightning (My ITX Gaming PC): Core i5-4690K | CRYORIG H5 Ultimate | ASUS Maximus VII Impact | HyperX Savage 2x8GB DDR3 | Samsung 850 EVO 250GB | WD Black 1TB | Sapphire RX 480 8GB NITRO+ OC | Phanteks Enthoo EVOLV ITX | Corsair AX760 | LG 29UM67 | CM Storm Quickfire Ultimate | Logitech G502 Proteus Spectrum | HyperX Cloud II | Logitech Z333

Benchmark Results: 3DMark Firestrike: 10,528 | SteamVR VR Ready (avg. quality 7.1) | VRMark 7,004 (VR Ready)

 

Other systems I've built:

Core i3-6100 | CM Hyper 212 EVO | MSI H110M ECO | Corsair Vengeance LPX 1x8GB DDR4  | ADATA SP550 120GB | Seagate 500GB | EVGA ACX 2.0 GTX 1050 Ti | Fractal Design Core 1500 | Corsair CX450M

Core i5-4590 | Intel Stock Cooler | Gigabyte GA-H97N-WIFI | HyperX Savage 2x4GB DDR3 | Seagate 500GB | Intel Integrated HD Graphics | Fractal Design Arc Mini R2 | be quiet! Pure Power L8 350W

 

I am not a professional. I am not an expert. I am just a smartass. Don't try and blame me if you break something when acting upon my advice.

...why are you still reading this?


Just now, goodtofufriday said:

play Watch Dogs 1 with 2x 6990s in CrossFire...

The problem isn't Watch Dogs.

 

Just now, huilun02 said:

Tweeted that to Dice. They'll probably find it amusing. 

How sweet. Thanks, pumpkin. Dice had performance issues with BF1 and still does, and it's only a first-person shooter. Watch Dogs is set in a free-roam city with a lot more going on.

 

You know Dragon Age: Inquisition uses the Frostbite engine? It also had performance issues in super-intensive scenes. It's almost like when you have to render a lot of stuff at once, you lose performance. Shocker.


Ubisoft did a good job with The Division. Then again, it wasn't an Nvidia shitstain.

This game? Like the first, it's a GameWorks game, and as usual with Ubi/Nvidia titles, it runs like shit. There seems to be huge latency in steering, even at 30+ FPS; in fact, even at 50-ish FPS.

Micro stutter seems to be everywhere, and sometimes it even feels like steering is synced to the frame rate (but only sometimes).

They did a lot with the PC settings, which is nice. Too bad they yet again used Nvidia shit and forgot to make steering responsive and free of micro stutter.

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


18 minutes ago, Kloaked said:

Because GTA 5 doesn't have its own performance problems.

I never play online, but fair enough: the few times I tried, I did grow to hate skybox loading screen simulator V.

-------

Current Rig

-------


Just now, Misanthrope said:

I never play online, but fair enough: the few times I tried, I did grow to hate skybox loading screen simulator V.

I don't know if you enabled them, but there are graphics options in GTA 5 under the "Advanced" menu or some such that you should turn off; they add nothing significant and do nothing but tank your performance. Enabling MFAA through the Nvidia Control Panel also helps if you have an Nvidia video card (you have to set MSAA x2 or x4 as your anti-aliasing method in the in-game graphics options as well).

 

Load times are something I never figured out a solution to, and I don't think anyone has.


3 minutes ago, Kloaked said:

Capping it?

 

And I don't understand your point, since VRAM isn't the primary bottleneck for performance when things work correctly (i.e., outside cases like the issues some people had with their 970).

Term used quite often for hitting something to the max. 

Well, to be fair, I remember many people having FPS issues with the 970 in, for instance, Dying Light at the exact moment VRAM usage crossed a certain bar above 3.5 GB; there are multiple videos on the subject as well. But I'm not here to go into that again.

My point is that the game is using a ridiculous amount of VRAM for absolutely no reason (as it seems to me). I'm not talking about how some cards will use 8 GB at 4K if they can. Sure, there were games that ran better on a 390 than on a Fury X for VRAM reasons at 4K and 5K, but if this game can run better with 4 GB at 4K than a 6 GB card can, why is it using the same amount of VRAM at 1080p? It is using the Ultra texture pack, sure, but it is demanding the same from the card at 1080p as at 4K, so to speak.
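Some rough numbers on why resolution barely moves the total when textures dominate (a back-of-the-envelope with plausible made-up buffer counts, not measured from the game):

```python
def render_target_mb(width, height, targets=5, bytes_per_px=4):
    """Approximate render-target memory, e.g. a deferred G-buffer of
    ~5 RGBA8 targets plus depth. Only this part scales with resolution."""
    return (targets + 1) * width * height * bytes_per_px / 1024**2

TEXTURES_GB = 3.0  # hypothetical HD texture pack: same size at every resolution
for name, (w, h) in {"1080p": (1920, 1080), "4K": (3840, 2160)}.items():
    print(f"{name}: ~{render_target_mb(w, h):.0f} MB of targets + {TEXTURES_GB} GB of textures")
# 1080p: ~47 MB, 4K: ~190 MB -- the texture pack dwarfs the resolution difference
```

So if the engine keeps the whole texture pack resident regardless of output resolution, 1080p and 4K usage will look nearly identical.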

The ability to google properly is a skill of its own. 


22 minutes ago, goodtofufriday said:

I remember trying to play Watch Dogs 1 with 2x 6990s in CrossFire...

You were using dual-GPU cards in CrossFire, technically running four Cayman GPUs. Let that sink in.

The ability to google properly is a skill of its own. 


11 minutes ago, huilun02 said:

BF1 has *only* 64 players simultaneously running around causing mayhem, in sync with high-tick-rate servers, while looking a hell of a lot better than WD2, which doesn't look much different from the first game.

DAI has been extensively improved with patches and runs better now. I'll even toss in BF4, which was a disaster at launch and yet is the polar opposite in its current state.

How do Arkham Knight and AC:U run today?

That's fair to say of the 64-man servers, and BF1 does objectively look "better" in terms of graphical fidelity. But again, it doesn't have as many objects on screen at the same time, and if you look at BF1's models closely, even though the graphics look good, the polygon counts are optimized very heavily. Because the lighting and post-processing are so good, you don't really pay attention to this, and I don't think anyone should care. I'm only pointing it out because that's how they get the performance they do with how the game looks, and of course they have smart people working on the engine. That's just my observation, and I could be wrong.

 

In WD2's case, I will be able to play it for myself when I get home from work, and I will report back my findings. I have a semi-newly built PC with a 6600K (@4.5 GHz) and an EVGA SC 980 (no manual OC, just whatever it came with). I'll even record a video if you'd like.

 

DA:I runs pretty much the same for me as it did at launch: the areas I had FPS issues in still have FPS issues, but that's because it's rendering so much at such high fidelity that my system can't hold a constant 60 FPS at the settings I use.

 

I don't know about Arkham Knight, but AC:U was in the same boat as DA:I. Whatever FPS fixes they introduced didn't fix a whole lot for me.


3 minutes ago, Kloaked said:

In WD2's case, I will be able to play it for myself when I get home from work, and I will report back my findings. I have a semi-newly built PC with a 6600K (@4.5 GHz) and an EVGA SC 980 (no manual OC, just whatever it came with). I'll even record a video if you'd like.

It seems to run (almost) the same on a 5960X and an 8370; curious how it'll run on the 6600K.

The ability to google properly is a skill of its own. 


2 minutes ago, Bouzoo said:

Term used quite often for hitting something to the max.

Well, to be fair, I remember many people having FPS issues with the 970 in, for instance, Dying Light at the exact moment VRAM usage crossed a certain bar above 3.5 GB; there are multiple videos on the subject as well. But I'm not here to go into that again.

My point is that the game is using a ridiculous amount of VRAM for absolutely no reason (as it seems to me). I'm not talking about how some cards will use 8 GB at 4K if they can. Sure, there were games that ran better on a 390 than on a Fury X for VRAM reasons at 4K and 5K, but if this game can run better with 4 GB at 4K than a 6 GB card can, why is it using the same amount of VRAM at 1080p? It is using the Ultra texture pack, sure, but it is demanding the same from the card at 1080p as at 4K, so to speak.

I know what cap means, but you're basically being redundant when you mention it. Anyways, though...

 

I've seen people who didn't experience any issues with their 970 show that even though Dying Light and other games pushed the card past 3.5 GB, their FPS did not suffer anything significant like the horror stories a minority of users were reporting.
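For context, the 970's last 0.5 GB sits on a much slower path (per Nvidia's own disclosure), so whether crossing 3.5 GB hurts depends on what actually gets spilled there. Roughly:

```python
# GTX 970 memory segments, per Nvidia's published figures.
FAST_GB, FAST_BW = 3.5, 196.0   # GB, GB/s
SLOW_GB, SLOW_BW = 0.5, 28.0    # the slow upper segment

def blended_bandwidth(used_gb):
    """Naive blended bandwidth if allocations spill past 3.5 GB; the
    real impact depends on how hot the spilled data is."""
    if used_gb <= FAST_GB:
        return FAST_BW
    spill = min(used_gb - FAST_GB, SLOW_GB)
    return (FAST_GB * FAST_BW + spill * SLOW_BW) / (FAST_GB + spill)

print(blended_bandwidth(3.5))  # 196.0
print(blended_bandwidth(4.0))  # 175.0 -- worse if hot assets land in the slow pool
```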

 

Shadow of Mordor has an ultra texture pack that they recommend 6 GB of VRAM for. I ran it on my 980 and it ran fine; the one issue I had was occasional random stuttering that I didn't have before. Aside from that, everything was fine. This was at 1080p.


35 minutes ago, Kloaked said:

The problem isn't Watch Dogs.

5~10 FPS. It mostly was Watch Dogs; single-card performance was also under 20 FPS on most AMD cards.

CPU: Amd 7800X3D | GPU: AMD 7900XTX


2 minutes ago, huilun02 said:

If you want object count, you can try these two "AMD-biased" games: Hitman and Ashes of the Singularity. No GameWorks in sight. No Dice magic. Great performance on both AMD and Nvidia hardware.

DirectX 12, you mean? The new API that very few developers know how to use, or even bother to use yet? I wouldn't call those AMD-biased at all, as I'm not a shill; I would call them well-running games. I am also aware that AMD beats the equivalent Nvidia video cards in both of those games. I don't own those games, and I no longer own an AMD video card in a second system (it was sold), so I cannot make those comparisons anymore. But I am confident that GameWorks is not the root issue in these games, if there even is an issue, in the same way that a few assmonkeys wearing Tapout shirts doesn't make everyone who wears a Tapout shirt bad.


Just now, goodtofufriday said:

5~10 FPS. It mostly was Watch Dogs; single-card performance was also under 20 FPS on most AMD cards.

No it wasn't, and the AMD issue was alleviated as best it could be. They ran fine.

