
Watch Dogs 2 PC benchmarks

10 minutes ago, Kloaked said:

No it wasn't, and the AMD issue was alleviated as best it could be. They ran fine.

And any AMD dual-GPU card that was out at that time, and there were many, had issues that persisted well beyond all the patches.

CPU: Amd 7800X3D | GPU: AMD 7900XTX


HAHAHAHA 46 fps at 1440p with a GTX 1070, which not long ago was essentially a TITAN X lmaooooooooooooooooooo!!! WHY UBISOFT WHY? Why do you keep digging your grave?

 

Another game I'll never buy and will wait to torrent. Maybe not even worth torrenting if I have to lower settings on a brand-new top-of-the-line GPU lol so sad


4 minutes ago, goodtofufriday said:

And any AMD dual-GPU card that was out at that time, and there were many, had issues that persisted well beyond all the patches.

It's not the game.

 

3 minutes ago, Zeeee said:

HAHAHAHA 46 fps at 1440p with a GTX 1070, which not long ago was essentially a TITAN X lmaooooooooooooooooooo!!! WHY UBISOFT WHY? Why do you keep digging your grave?

 

Another game I'll never buy and will wait to torrent. Maybe not even worth torrenting if I have to lower settings on a brand-new top-of-the-line GPU lol so sad

LOL XDDDD It's almost like, if you take one single source instead of comparing tests from different sources, since they all have different testing methods and report different information, you'll get different results. ECKDEE !!! xD ELEMAYYOH #sw3g


5 minutes ago, Kloaked said:

It's not the game.

If it weren't the game, then custom user-made patches wouldn't have worked to resolve the issues... I understand the troubles of dual-GPU cards very well, but any game can be made to utilize the cards properly...



2 minutes ago, goodtofufriday said:

If it weren't the game, then custom user-made patches wouldn't have worked to resolve the issues... I understand the troubles of dual-GPU cards very well, but any game can be made to utilize the cards properly...

Can you show me?

 



3 minutes ago, goodtofufriday said:

Can you show me the part where a mod fixed the FPS issues for your old GPU, or did it just allow the game to boot? 

 

These games from Ubisoft are mostly console ports, so they're going to have problems on older and unsupported PC hardware.


1 minute ago, Kloaked said:

Can you show me the part where a mod fixed the FPS issues for your old GPU, or did it just allow the game to boot? 

 

These games from Ubisoft are mostly console ports, so they're going to have problems on older and unsupported PC hardware.

The second link fixed the FPS and stuttering. Was it 60 FPS? No, but I could comfortably play the game.

 


Always makes me shake my head when a new game is released and people complain that it requires a substantial GPU to run it well. Games are only going to get more demanding over time, even at 1080p. They're cramming more and more detail into these virtual worlds. It's a crazy amount of data and detail to render. 

 

It doesn't matter that 1080p is not considered that high a resolution these days. If there are substantially more objects and details to calculate and render in any given scene, it's going to be more demanding, period.

 

Got a copy of this game free with the 1070 I just ordered. Once it comes in, I'll be curious to test it out. 

 

I can understand that some games really are crap, like the whole Arkham Knight and AC Unity situations, but some games are simply demanding.

My Systems:

Main - Work + Gaming:

Spoiler

Woodland Raven: Ryzen 2700X // AMD Wraith RGB // Asus Prime X570-P // G.Skill 2x 8GB 3600MHz DDR4 // Radeon RX Vega 56 // Crucial P1 NVMe 1TB M.2 SSD // Deepcool DQ650-M // chassis build in progress // Windows 10 // Thrustmaster TMX + G27 pedals & shifter

F@H Rig:

Spoiler

FX-8350 // Deepcool Neptwin // MSI 970 Gaming // AData 2x 4GB 1600 DDR3 // 2x Gigabyte RX-570 4G's // Samsung 840 120GB SSD // Cooler Master V650 // Windows 10

 

HTPC:

Spoiler

SNES PC (HTPC): i3-4150 @3.5 // Gigabyte GA-H87N-Wifi // G.Skill 2x 4GB DDR3 1600 // Asus Dual GTX 1050Ti 4GB OC // AData SP600 128GB SSD // Pico 160XT PSU // Custom SNES Enclosure // 55" LG LED 1080p TV  // Logitech wireless touchpad-keyboard // Windows 10 // Build Log

Laptops:

Spoiler

MY DAILY: Lenovo ThinkPad T410 // 14" 1440x900 // i5-540M 2.5GHz Dual-Core HT // Intel HD iGPU + Quadro NVS 3100M 512MB dGPU // 2x4GB DDR3L 1066 // Mushkin Triactor 480GB SSD // Windows 10

 

WIFE'S: Dell Latitude E5450 // 14" 1366x768 // i5-5300U 2.3GHz Dual-Core HT // Intel HD5500 // 2x4GB RAM DDR3L 1600 // 500GB 7200 HDD // Linux Mint 19.3 Cinnamon

 

EXPERIMENTAL: Pinebook // 11.6" 1080p // Manjaro KDE (ARM)

NAS:

Spoiler

Home NAS: Pentium G4400 @3.3 // Gigabyte GA-Z170-HD3 // 2x 4GB DDR4 2400 // Intel HD Graphics // Kingston A400 120GB SSD // 3x Seagate Barracuda 2TB 7200 HDDs in RAID-Z // Cooler Master Silent Pro M 1000w PSU // Antec Performance Plus 1080AMG // FreeNAS OS

 


what about multi-GPU?

CPU: Intel i7 5820K @ 4.20 GHz | Motherboard: MSI X99S SLI PLUS | RAM: Corsair LPX 16GB DDR4 @ 2666MHz | GPU: Sapphire R9 Fury (x2 CrossFire)
Storage: Samsung 950Pro 512GB // OCZ Vector150 240GB // Seagate 1TB | PSU: Seasonic 1050 Snow Silent | Case: NZXT H440 | Cooling: Nepton 240M
FireStrike // Extreme // Ultra // 8K // 16K

 


1 minute ago, DXMember said:

what about multi-GPU?

I don't believe there are any SLI/crossfire profiles included in the latest drivers. 


4 minutes ago, Technicolors said:

I don't believe there are any SLI/crossfire profiles included in the latest drivers. 

not dx12???

fail gaime

 


3 minutes ago, Technicolors said:

there was a small news item saying the game will have DX12: http://www.pcgamer.com/ubisoft-confirms-directx-12-support-for-watch-dogs-2/

at least some kind of scaling should be present even without profiles; profiles should only optimize it further


24 minutes ago, DXMember said:

what about multi-GPU?

 

20 minutes ago, Technicolors said:

I don't believe there are any SLI/crossfire profiles included in the latest drivers. 

For Nvidia they were released yesterday and they include SLI support. I have it installed myself. AFAIK the latest AMD driver only has optimization tweaks.

The ability to google properly is a skill of its own. 


The 1060 6GB smashes the 970; they have officially started to gimp performance on older cards, everyone! :D

Btw it's not related to VRAM, because if it were, the FPS would tank and the 1% and 0.1% framerates from GamersNexus would show a big-ass drop, which isn't the case.
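For context on the figures cited above: 1% / 0.1% "low" numbers come from frame-time captures, taking the worst 1% (or 0.1%) of frame times and converting their average back to FPS. A minimal sketch of that calculation, using made-up sample frame times rather than any real benchmark data:

```python
def low_fps(frame_times_ms, fraction):
    """Average the worst `fraction` of frame times and express it as FPS."""
    worst_n = max(1, int(len(frame_times_ms) * fraction))
    # Largest frame times are the slowest frames (the stutters).
    worst = sorted(frame_times_ms, reverse=True)[:worst_n]
    avg_ms = sum(worst) / len(worst)
    return 1000.0 / avg_ms

# Mostly 60 FPS frames (16.7 ms) with a handful of 30 FPS spikes (33.3 ms).
frame_times = [16.7] * 990 + [33.3] * 10

print(round(low_fps(frame_times, 0.01), 1))   # 1% low  -> 30.0
print(round(low_fps(frame_times, 0.001), 1))  # 0.1% low -> 30.0
```

This is why the lows are a better stutter detector than the average: ten bad frames out of a thousand barely move the mean FPS but completely dominate the 1% figure.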

If you want my attention, quote meh! D: or just stick an @samcool55 in your post :3

Spying on everyone to fight against terrorism is like shooting a mosquito with a cannon


3 minutes ago, samcool55 said:

The 1060 6GB smashes the 970; they have officially started to gimp performance on older cards, everyone! :D

Btw it's not related to VRAM, because if it were, the FPS would tank and the 1% and 0.1% framerates from GamersNexus would show a big-ass drop, which isn't the case.

The 1060 was talked about as the 980's replacement, and besting it. Even reviews show this, so it's nothing new.

 

BUT MUH PLANNED OBSOLESCENCE AND GIMPING AMIRITE XDDDDD


5 minutes ago, Kloaked said:

The 1060 was talked about as the 980's replacement, and besting it. Even reviews show this, so it's nothing new.

 

BUT MUH PLANNED OBSOLESCENCE AND GIMPING AMIRITE XDDDDD

Tbh that's even worse.

That means the 980 lost half its value as soon as the 1060 released...

If I owned a 980 (thank god I don't) I would feel ripped off, a lot.

550 euros for a card that's worth less than 275 just over 2 years later (if you got it at launch).

If you got one a year ago, that would mean you lost 275 euros in a year, just in depreciation of a graphics card...

We know PC components don't hold value well, but that's just terrible.

If you can come up with an AMD equivalent between a 3XX card and an RX 4XX card, let me know :D
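The arithmetic above can be sanity-checked with a toy sketch using the euro figures quoted in the post (550 at launch, 275 about two years later); the steady yearly-rate model is my own assumption, not the poster's:

```python
# Figures quoted in the post above (euros).
launch_price = 550.0
value_after_2y = 275.0

total_loss = launch_price - value_after_2y  # 275 euros over ~2 years

# If value decayed at a constant yearly rate r over 2 years:
# value_after_2y = launch_price * r**2, so r = sqrt(275 / 550).
yearly_retention = (value_after_2y / launch_price) ** 0.5

print(total_loss)                                  # 275.0
print(round((1 - yearly_retention) * 100, 1))      # 29.3 (% lost per year at a steady rate)
```

Note that the post's "lose 275 euros in a year" figure implicitly assumes a 980 bought a year earlier still cost near launch price; a steady-rate model spreads the loss more evenly across both years.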


12 minutes ago, samcool55 said:

Tbh that's even worse.

That means the 980 lost half its value as soon as the 1060 released...

If I owned a 980 (thank god I don't) I would feel ripped off, a lot.

550 euros for a card that's worth less than 275 just over 2 years later (if you got it at launch).

If you got one a year ago, that would mean you lost 275 euros in a year, just in depreciation of a graphics card...

We know PC components don't hold value well, but that's just terrible.

If you can come up with an AMD equivalent between a 3XX card and an RX 4XX card, let me know :D

That's what happens when technology gets better. I own a 980, but I have more than two working brain cells so I know how things work. What, you don't want things to get better?


43 minutes ago, samcool55 said:

Tbh that's even worse.

That means the 980 lost half its value as soon as the 1060 released...

If I owned a 980 (thank god I don't) I would feel ripped off, a lot.

550 euros for a card that's worth less than 275 just over 2 years later (if you got it at launch).

If you got one a year ago, that would mean you lost 275 euros in a year, just in depreciation of a graphics card...

We know PC components don't hold value well, but that's just terrible.

If you can come up with an AMD equivalent between a 3XX card and an RX 4XX card, let me know :D

You don't like progress? You're weird, man. Better tech at better prices, I thought everyone wanted that. You say it like you want turtle-pace tech like Intel is doing.


Well, I guess it will get better over time.

But 4K 60 FPS gaming? I mean, looking at this, 1080p maxed at a stable 60 FPS is the goal xD

| Ryzen 7 7800X3D | AM5 B650 Aorus Elite AX | G.Skill Trident Z5 Neo RGB DDR5 32GB 6000MHz C30 | Sapphire PULSE Radeon RX 7900 XTX | Samsung 990 PRO 1TB with heatsink | Arctic Liquid Freezer II 360 | Seasonic Focus GX-850 | Lian Li Lancool III | Mousepad: Skypad 3.0 XL / Zowie GTF-X | Mouse: Zowie S1-C | Keyboard: Ducky One 3 TKL (Cherry MX-Speed-Silver) | Beyerdynamic MMX 300 (2nd Gen) | Acer XV272U | OS: Windows 11 |


3 hours ago, MEC-777 said:

Always makes me shake my head when a new game is released and people complain that it requires a substantial GPU to run it well. Games are only going to get more demanding over time, even at 1080p. They're cramming more and more detail into these virtual worlds. It's a crazy amount of data and detail to render. 

 

It doesn't matter that 1080p is not considered that high a resolution these days. If there are substantially more objects and details to calculate and render in any given scene, it's going to be more demanding, period.

 

Got a copy of this game free with the 1070 I just ordered. Once it comes in, I'll be curious to test it out. 

 

I can understand that some games really are crap, like the whole Arkham Knight and AC Unity situations, but some games are simply demanding.

WOW, aren't you a genius? No shit games are going to get more demanding, we all get that and we are all OKAY WITH THAT. But considering we know exactly what Watch Dogs 2 looks like based on a TON OF GAMEPLAY footage, we are saying the game clearly shouldn't be as demanding as it is... it's terrible... you need a $1,200 Titan XP to play over 60 FPS at 1440p, are you f***** kidding me rn? The GTX 1070 does phenomenally in better-looking games and gets over 60 in all 1440p games that look better than Watch Dogs 2, so that leads us all to believe THAT IT'S A SHITTY OPTIMIZATION JOB.

 

I don't know how DICE does it, obviously I'm no dev, but other companies should learn from them. They make games in which they limit all the BS rendering that does nothing to improve graphical fidelity and focus on the things that do, which makes their games, aka BF1, look amazing and run even better than anyone can imagine. Think about this: COD IW is a terrible-looking game compared to BF1, with smaller maps and even smaller campaign maps, and yet it runs way worse than BF1... that shows you what proper optimization is vs bullshit.


Damnit Ubisoft. I was able to play Watch Dogs OG at Ultra or Very High at 40fps at 1080p and now you're telling me that I can't with Watch_Dogs 2?

 

Fuck off. I'm not gonna buy the game until the performance issues are fixed.

Judge a product on its own merits AND the company that made it.

How to setup MSI Afterburner OSD | How to make your AMD Radeon GPU more efficient with Radeon Chill | (Probably) Why LMG Merch shipping to the EU is expensive

Oneplus 6 (Early 2023 to present) | HP Envy 15" x360 R7 5700U (Mid 2021 to present) | Steam Deck (Late 2022 to present)

 

Mid 2023 AlTech Desktop Refresh - AMD R7 5800X (Mid 2023), XFX Radeon RX 6700XT MBA (Mid 2021), MSI X370 Gaming Pro Carbon (Early 2018), 32GB DDR4-3200 (16GB x2) (Mid 2022)

Noctua NH-D15 (Early 2021), Corsair MP510 1.92TB NVMe SSD (Mid 2020), beQuiet Pure Wings 2 140mm x2 & 120mm x1 (Mid 2023),


1 hour ago, AluminiumTech said:

Damnit Ubisoft. I was able to play Watch Dogs OG at Ultra or Very High at 40fps at 1080p and now you're telling me that I can't with Watch_Dogs 2?

 

Fuck off. I'm not gonna buy the game until the performance issues are fixed.

I've been playing it for the last few hours. It actually looks way better than the first one and I don't even have it maxed out.


4 hours ago, Zeeee said:

WOW, aren't you a genius? No shit games are going to get more demanding, we all get that and we are all OKAY WITH THAT. But considering we know exactly what Watch Dogs 2 looks like based on a TON OF GAMEPLAY footage, we are saying the game clearly shouldn't be as demanding as it is... it's terrible... you need a $1,200 Titan XP to play over 60 FPS at 1440p, are you f***** kidding me rn? The GTX 1070 does phenomenally in better-looking games and gets over 60 in all 1440p games that look better than Watch Dogs 2, so that leads us all to believe THAT IT'S A SHITTY OPTIMIZATION JOB.

 

I don't know how DICE does it, obviously I'm no dev, but other companies should learn from them. They make games in which they limit all the BS rendering that does nothing to improve graphical fidelity and focus on the things that do, which makes their games, aka BF1, look amazing and run even better than anyone can imagine. Think about this: COD IW is a terrible-looking game compared to BF1, with smaller maps and even smaller campaign maps, and yet it runs way worse than BF1... that shows you what proper optimization is vs bullshit.

Calm down, jeez. 

 

I haven't looked at any gameplay footage or screenshots. I'm going in with zero expectations, having heard nothing about the game. I also only played the first game on Xbox One, for maybe half an hour. Watching captured or streamed gameplay footage is not the same as playing it and seeing it on your own monitor in front of you; the image quality is degraded. I'm not saying this game shouldn't be demanding. I'm saying I don't know, as I haven't played it myself, and I will withhold that judgement until then.

