
3090 Not Able To Maintain 60 FPS At 1080p In Watch Dogs Legion

Hi everyone! So today the patch was released that improved PC performance in Watch Dogs Legion. I'm playing at maximum settings. It definitely did improve performance: prior to today, GPU usage on the 3090 was about 80% at best; now it's 96%, which is a great improvement, albeit still not perfect (99% would be perfect). In the screenshot attached below, I'm getting 51 FPS (yesterday it was in the low 40s). So bravo on the performance uplift! However, now that the GPU is almost fully utilized: 51 FPS at 1080p at 96% usage? Pushing to 99% usage would only yield 1 or 2 more FPS, if that, so we'd be looking at a theoretical max of about 53 FPS at 1080p (see the quick sanity check below the questions). I know, I know, it's an Ubisoft game and they're never optimized and performance is crap, but I fully expected to power through all that with a 3090 (with a +80 MHz OC on the core and +400 MHz on the VRAM). So, questions:

 

1) Anyone else seeing the same or similar performance on a 3090 at max settings?

2) Was I expecting too much in hoping to power through any optimization issues with a 3090 at 1080p?
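
A quick sanity check of the utilization extrapolation above: a rough sketch assuming FPS scales linearly with GPU usage, which real workloads only approximate.

```python
# Rough linear extrapolation: if 96% GPU usage yields 51 FPS, estimate
# the ceiling at 99% usage. Linear scaling is an assumption, not a law.
measured_fps = 51
measured_util = 0.96  # reported GPU usage
target_util = 0.99    # "perfect" utilization

ceiling = measured_fps * (target_util / measured_util)
print(f"Estimated ceiling at 99% usage: {ceiling:.1f} FPS")  # ~52.6 FPS
```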

 

[Screenshots: in-game scene with performance overlay showing 51 FPS]

3 minutes ago, SAVE12HK said:

Who made Watch Dogs?

Ubisoft.

Issue solved.

I addressed that already:

5 minutes ago, jerubedo said:

I know, I know, it's an Ubisoft game and they're never optimized and performance is crap, but I fully expected to power through all that with a 3090 (with a +80 MHz OC on the core and +400 MHz on the VRAM)

 


I'm getting better results than that on my 2080 Ti with RTX on... do you have RTX on? DLSS on? Can you post your benchmark scores from the game?

PSU Tier List Thread

Please make sure to Quote me or @ me to see your reply!

 

"White Ice"

Ryzen 7 3700x | Asus Crosshair VIII Hero (Wi-Fi) | EVGA RTX 2080ti | Ballistix 32gb 16-18-16-36 3600mhz | Custom Water Cooling Loop | 1tb Samsung 970 Evo

2tb Crucial MX500 SSD | 2x 3tb Seagate Drive | Fractal Design Meshify S2 |  EVGA G2 750w PSU | 3x Corsair LL140 | 3x Corsair LL120

 

Dedicated Streaming Rig

 Ryzen 7 1800x | Asus B450-F Strix | 32gb Gskill Flare X 3000mhz | Corsair RM550x | EVGA GTX 1060 3gb | 250gb 860 Evo m.2

Phanteks Enthoo Evolv |  Elgato HD60 Pro | Elgato 4k60 Pro mk.2 | Avermedia 4k GC573 Capture Card

 

16 minutes ago, jerubedo said:

I addressed that already:

 

Yes, but it's still an Ubisoft game and you're attempting to play at fully maxed settings. Turn some of them down to see which one is causing the issue. Most settings are trash and don't actually add much.

7 minutes ago, Skiiwee29 said:

I'm getting better results than that on my 2080 Ti with RTX on... do you have RTX on? DLSS on? Can you post your benchmark scores from the game?

Yes: RTX on Ultra, FOV 90, 4x headlights, 100% on the Extra Details slider, no DLSS (but even with it on, the results are the same). Everything is maxed (beyond Ultra). Here are the benchmark results, which really aren't indicative of actual gameplay:

 

[Screenshots: in-game benchmark results]


True, and the game doesn't even look too hot (ignoring the waifu character)

RYZEN 5 3600 | GIGABYTE 3070 VISION OC | 16GB CORSAIR VENGEANCE LPX 3200 DDR4 | MSI B350M MORTAR | 250GB SAMSUNG EVO 860 | 4TB TOSHIBA X 300 | 1TB TOSHIBA SSHD | 120GB KINGSTON SSD | WINDOWS 10 PRO | INWIN 301| BEQUIET PURE POWER 10 500W 80+ SILVER | ASUS 279H | LOGITECH Z906 | DELL KB216T | LOGITECH M185 | SONY DUALSHOCK 4

 

LENOVO IDEAPAD 510 | i5 7200U | 8GB DDR4 | NVIDIA GEFORCE 940MX | 1TB WD | WINDOWS 10 GO HOME 


Give this a watch. This could help you dial in settings to get a playable experience.

[embedded video]


1 minute ago, jerubedo said:

RTX on Ultra

You know they said the card doesn't work well in 1080p scenarios, right?

 

So I don't know how true that is, but you can easily check it out.

 

You have an NVIDIA card: turn DSR on, set the game to 1440p or even 4K, and see if that improves anything (besides the blurriness, I mean; just the frames)


Just now, Skiiwee29 said:

Give this a watch. This could help you dial in settings to get a playable experience.

[embedded video]

I mean, sure, but I really wasn't looking to turn down any settings on a $1,500 card at 1080p. I could understand it at 1440p or 4K, but not at 1080p. It's not even a CPU issue: with 96% usage on the GPU, it's simply that the GPU can't hold 60. CPU usage is around 50-60%, and no thread is capped either; the highest single-thread usage is 80%.

16 hours ago, Mark Kaine said:

You know they said the card doesn't work well in 1080p scenarios, right?

So I don't know how true that is, but you can easily check it out.

You have an NVIDIA card: turn DSR on, set the game to 1440p or even 4K, and see if that improves anything (besides the blurriness, I mean; just the frames)

No, it gets worse at both 1440p and 4K. At 1440p it drops to 47 FPS in the same scene (with 99% usage), and at 4K it drops to 45 FPS (still with 99% usage).

 

EDIT: 35 FPS at 4K, not 45.
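
As an aside, one rough way to read those numbers (using the corrected 35 FPS figure for 4K) is to compute pixels rendered per second at each resolution. If raw GPU shading were the only limit at 1080p, throughput would be roughly flat across resolutions; instead it climbs steeply, which hints that something other than the GPU caps the 1080p result despite the high reported usage.

```python
# Pixel throughput at each tested resolution, from the figures above.
results = {
    "1080p": (1920 * 1080, 51),
    "1440p": (2560 * 1440, 47),
    "4K":    (3840 * 2160, 35),  # corrected figure from the EDIT above
}

for name, (pixels, fps) in results.items():
    print(f"{name}: {pixels * fps / 1e6:.0f} Mpx/s")
# 1080p: 106 Mpx/s, 1440p: 173 Mpx/s, 4K: 290 Mpx/s
```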

1 minute ago, jerubedo said:

I mean sure, but I really wasn't looking to turn down any settings on a $1500 card at 1080p. I could understand at 1440p or 4k, but not 1080p. It's not even like this is a CPU issue. With 96% usage on the GPU, it's simply that the GPU can't hold 60. CPU usage is around 50-60% and no thread is capped, either. The highest thread usage % is 80%.

It's touted as a 4K card, so try that out as outlined above.

 

Also, DLSS should at least do 'something' at higher resolutions.


1 minute ago, jerubedo said:

No, it gets worse at both 1440p and 4K. At 1440p it drops to 47 FPS in the same scene (with 99% usage), and at 4K it drops to 45 FPS (still with 99% usage).

Hmm, OK, then something might be wrong. But I think it's probably just badly optimized, tbh...

Do other games work OK?


6 minutes ago, Mark Kaine said:

It's touted as a 4K card, so try that out as outlined above.

See my previous response:

7 minutes ago, jerubedo said:

No, it gets worse at both 1440p and 4K. At 1440p it drops to 47 FPS in the same scene (with 99% usage), and at 4K it drops to 45 FPS (still with 99% usage).

 

6 minutes ago, Mark Kaine said:

Also, DLSS should at least do 'something' at higher resolutions.

Yes, at 4K DLSS brings me back up to 51 FPS, but that's it. In that same scene, of course.

Just now, Mark Kaine said:

Hmm, OK, then something might be wrong. But I think it's probably just badly optimized, tbh...

Do other games work OK?

Oh yeah, other games are fine, but this worries me about upcoming releases. I play a lot of Ubisoft games; Valhalla is next for me after this, lol. It will probably be the same :(. I really hoped the 3090 would power through dev issues and bull crap.

1 minute ago, jerubedo said:

Oh yeah, other games are fine, but this worries me about upcoming releases. I play a lot of Ubisoft games; Valhalla is next for me after this, lol. It will probably be the same :(. I really hoped the 3090 would power through dev issues and bull crap.

Maybe it does... but you have to consider that some settings are often just not worth using, and may even be implemented to bring any card down. Like someone else said, you need to find those settings and turn them down or off.

It's quite possible some settings do exactly nothing except use up power (volumetric lighting is often such a setting).

Also, you have to consider that RTX doesn't exactly improve performance either.

Basically, if you want to play at "max" max settings, you also need to investigate a little into which games are actually optimized.

I don't think this is a problem with the card after all.


2 minutes ago, Mark Kaine said:

Maybe it does... but you have to consider that some settings are often just not worth using, and may even be implemented to bring any card down. Like someone else said, you need to find those settings and turn them down or off.

It's quite possible some settings do exactly nothing except use up power (volumetric lighting is often such a setting).

Also, you have to consider that RTX doesn't exactly improve performance either.

Basically, if you want to play at "max" max settings, you also need to investigate a little into which games are actually optimized.

I don't think this is a problem with the card after all.

Yeah, I've already identified that the two biggest settings are, of course, RTX and Extra Details. Unfortunately, I see a WORLD of difference in reflections between RTX off and on, and the Extra Details setting is noticeable as well: at 100%, background objects are super sharp, but at 0% they're blurry as hell, albeit then I get 60 FPS. 50% is a good compromise, but it's still not 60 FPS (only 57), so my mentality is: if it's still not 60, I might as well roll with 51 instead of 57 and get all the goodies.
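
For what it's worth, converting those three options to frame times (just the standard 1000/FPS conversion, nothing more) shows how the trade-off weighs out: the jump from 51 to 57 FPS saves about 2.1 ms per frame, while 57 to 60 would save under 1 ms more.

```python
# Frame time (ms) for each of the three options being weighed.
for fps in (51, 57, 60):
    print(f"{fps} FPS = {1000 / fps:.1f} ms per frame")
# 51 FPS = 19.6 ms, 57 FPS = 17.5 ms, 60 FPS = 16.7 ms
```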


No personal experience, so you can stop reading here if you want, but from what I've heard, the 3090 is so far beyond a mainstream gaming card that unless you're gaming at 4K or 8K you shouldn't bother.

7 minutes ago, Shabba said:

No personal experience, so you can stop reading here if you want, but from what I've heard, the 3090 is so far beyond a mainstream gaming card that unless you're gaming at 4K or 8K you shouldn't bother.

That would only hold true if the GPU weren't being utilized close to 100%. If a 1080p load can put 96% utilization on the GPU and still not maintain 60 FPS, then that GPU, whatever it is, can't handle the load being asked of it. Red Dead Redemption 2 is a good counter-example: with everything maxed, it pulls about 75 FPS at 1080p in the most demanding scenes (the snowy areas at the beginning, for example) at 99% GPU utilization, and right around 60 FPS in those same scenes at 4K. That makes it just fine for 144 Hz 1080p or 60 Hz 4K. A side note: most people assume Saint Denis is the most demanding area, but it's not; those snowy areas are the worst-case load.


I would try RTX Medium with DLSS Quality and Extra Details off, and see if that is a happy medium.

CPU: Intel i9 9900K | Motherboard: ASRock B365M Phantom Gaming 4 | RAM: 4x8GB 2666MHz HyperX Fury RGB | GPU: ZOTAC GAMING GeForce RTX 3080 AMP Holo | PSU: Corsair RM750x (2018) | Case: ASUS TUF Gaming GT301 | Cooler: Cooler Master Hyper 212 Black Edition | SSD: Samsung 970 Evo 500GB | SSD: PNY XLR8 CS3030 2TB | SSD: Samsung 860 Evo 2TB | HDD: Seagate BerraCuda 8TB | Monitor: GIGABYTE G27QC

5 minutes ago, Sycoblackburn said:

I would try RTX Medium with DLSS Quality and Extra Details off, and see if that is a happy medium.

RTX Ultra with DLSS off and Extra Details off actually does the trick. I guess I can try it for a bit.

4 minutes ago, jerubedo said:

RTX Ultra with DLSS off and Extra Details off actually does the trick. I guess I can try it for a bit.

Kinda weird, but if it works, it works. Extra Details seems to be even higher LOD and geometry at a distance, IIRC.



Something's wrong. I maxed out everything with ray tracing at Ultra and DLSS Quality, and it averages over 60 FPS at 3440x1440 on a 3080.

AMD 5900X / Gigabyte X570 Auros Pro / 64GB @ 3600c16 / 1TB Samsung 980 Pro 4.0x4 / 4TB total Inland TLC 3.0x4 / EVGA FTW3 3080 / Corsair RM750x /Thermaltake View71

Custom water loop EK Vector AM4, D5 pump, Coolstream 420 radiator

1 hour ago, ewitte said:

Something's wrong. I maxed out everything with ray tracing at Ultra and DLSS Quality, and it averages over 60 FPS at 3440x1440 on a 3080.

At the scene I was at? I can largely get 60 FPS or over as well, but at certain spots, like the one pictured, it stays at a constant 51. What average do you get on the benchmark run?

15 hours ago, jerubedo said:

No, it gets worse at both 1440p and 4K. At 1440p it drops to 47 FPS in the same scene (with 99% usage), and at 4K it drops to 45 FPS (still with 99% usage).

Sounds like a classic CPU limitation. Increase the resolution (4K is a lot higher than 1440p!) and FPS only goes down by 2.

Did you check the correct CPU usage %?

Please remember there are TWO CPU usage measurements. Some programs report the first one, *max thread usage*: if a single thread is loaded at 99% and all the other threads are at 5%, usage is reported as 99%. The second measurement is all the threads combined. On an 8-core/16-thread processor, if one thread were pegged at 99% and the other 15 sat at about 5%, the total CPU usage would be around 11% (the average across all 16 threads).
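
A minimal sketch of the difference between the two readings, using the hypothetical 16-thread example above:

```python
# Two ways to summarize per-thread CPU load: the max across threads
# (what some tools report as "CPU usage") vs. the average of all
# threads combined. Hypothetical 8-core/16-thread example.
per_thread = [99] + [5] * 15  # one pegged thread, fifteen near idle

max_thread = max(per_thread)               # 99%
total = sum(per_thread) / len(per_thread)  # ~10.9%

print(f"Max single-thread usage: {max_thread}%")
print(f"Total CPU usage: {total:.1f}%")
```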

 

I have the RTSS overlay (via MSI Afterburner and HWiNFO64) set up to show me both the max single-thread usage and the total CPU usage.

That being said, the 'CPU Usage' option in MSI Afterburner should default to the correct total CPU usage.

