Assassin's Creed Origins high CPU usage

smeek14

Hey, so I just started playing Assassin's Creed Origins and my CPU usage is pegged at 100% in cities. Is this normal? I have an i7-4790. I'm not getting a solid 60 FPS either; I'd get something like 70% GPU usage, 50 FPS, and 100% CPU usage.

It's normal. The game is very CPU intensive.

CPU: Intel i7 6700k  | Motherboard: Gigabyte Z170x Gaming 5 | RAM: 2x16GB 3000MHz Corsair Vengeance LPX | GPU: Gigabyte Aorus GTX 1080ti | PSU: Corsair RM750x (2018) | Case: BeQuiet SilentBase 800 | Cooler: Arctic Freezer 34 eSports | SSD: Samsung 970 Evo 500GB + Samsung 840 500GB + Crucial MX500 2TB | Monitor: Acer Predator XB271HU + Samsung BX2450

It's normal. The 4790 is pretty old, and modern games push past its limits easily.

Going by what CYRI (Can You Run It) says, I'd say even the Recommended spec is just for getting the game installed and running at the lowest settings.

i7-4790 & what graphics card...?

I frequently edit any posts you may quote; please check for anything I 'may' have added.

Did you test boot it before you built it into the case?
WHY NOT...?!

Just now, Eighjan said:

Going by what CYRI (Can You Run It) says, I'd say even the Recommended spec is just for getting the game installed and running at the lowest settings.

i7-4790 & what graphics card...?

2080 Ti. I'm getting an i9-10900K installed in the next few weeks. I run at 4K max.

7 minutes ago, 5x5 said:

It's normal. The 4790 is pretty old, and modern games push past its limits easily.

Even to the point I can't get 60 fps?

2 minutes ago, smeek14 said:

2080 Ti. I'm getting an i9-10900K installed in the next few weeks. I run at 4K max.

2080 Ti will thank you...  😃

28 minutes ago, smeek14 said:

Even to the point I can't get 60 fps?

Yes, this is the game that pushed me to upgrade to an 8600k and then 9900k just to eliminate all CPU bottlenecks.

Even with a 2080 Ti I think it may dip below 60fps in the cities (from what I've seen in benchmarks); they are VERY CPU and GPU intensive.

Router: Intel N100 (pfSense) WiFi6: Zyxel NWA210AX (1.7Gbit peak at 160MHz)
WiFi5: Ubiquiti NanoHD OpenWRT (~500Mbit at 80MHz) Switches: Netgear MS510TXUP, MS510TXPP, GS110EMX
ISPs: Zen Full Fibre 900 (~930Mbit down, 115Mbit up) + Three 5G (~800Mbit down, 115Mbit up)
Upgrading Laptop/Desktop CNVIo WiFi 5 cards to PCIe WiFi6e/7

7 minutes ago, Alex Atkin UK said:

Yes, this is the game that pushed me to upgrade to an 8600k and then 9900k just to eliminate all CPU bottlenecks.

Even with a 2080 Ti I think it may dip below 60fps in the cities (from what I've seen in benchmarks); they are VERY CPU and GPU intensive.

Yeah, going by my GPU usage I can tell there are a few areas where I might dip under 60.
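Rough math, for what it's worth (a sketch assuming GPU load scales roughly linearly with frame rate while the CPU is the cap; the numbers are the ones from my first post):

```python
# Hedged estimate: the FPS the GPU could deliver if the CPU got out of the way,
# assuming GPU utilization scales ~linearly with frame rate.

def gpu_bound_ceiling(current_fps: float, gpu_utilization: float) -> float:
    return current_fps / gpu_utilization

print(gpu_bound_ceiling(50, 0.70))  # ~71 FPS ceiling; any scene where the
                                    # implied ceiling falls under 60 will dip
```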

1 hour ago, smeek14 said:

Yeah, going by my GPU usage I can tell there are a few areas where I might dip under 60.

Honestly though, I do quite well on a 2080 at 4K in Odyssey; if only my TV were the model up with G-Sync I'd be fine, as it spends a lot of time at 60. The benchmarks that show drops use max settings, I think; turning a few things down might get you there on a Ti.

It's interesting: even after upgrading the CPU, going from 2400MHz RAM to 3200MHz dramatically reduced the occurrence of stutters. The Assassin's Creed games just hit everything really hard.
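For rough scale (nominal spec math, not a measurement): dual-channel DDR4 peak bandwidth is transfer rate × 8 bytes × 2 channels, so 2400MHz to 3200MHz is about a 33% bump in headroom.

```python
# Nominal dual-channel DDR4 peak bandwidth: MT/s * 8 bytes/transfer * 2 channels.
def ddr4_dual_channel_gb_s(mt_per_s: float) -> float:
    return mt_per_s * 1e6 * 8 * 2 / 1e9

print(ddr4_dual_channel_gb_s(2400))  # ~38.4 GB/s
print(ddr4_dual_channel_gb_s(3200))  # ~51.2 GB/s, ~33% more
```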

9 hours ago, smeek14 said:

2080 Ti. I'm getting an i9-10900K installed in the next few weeks. I run at 4K max.

Why on earth would you get a 10900K... terrible value. 

 

Welcome to Ubisoft's DRM application, btw.

Dirty Windows Peasants :P ?

10 hours ago, smeek14 said:

Even to the point I can't get 60 fps?

My 4.7GHz 4790K with 4.4GHz L3 cache and CL11-12-12 2200MHz memory can dip to around 65fps, and can occasionally see sub-60 in the cities. Without that OC tuning, at stock (all-core 4.2GHz / 1866 DDR3), the cities were more like 42-45fps at the lows with near-full CPU usage.

My OC-tuned setup can still run into 80-90% usage in those city environments, but my dips are now 55-60fps with high CPU usage.

The locked 4790 just doesn't have the frequency (and is possibly paired with slower 1600MHz DDR3 on a non-Z board) for the demands ACO puts on the CPU.
The game is 8-thread aware and was built to run at 30FPS; on PC it takes legit fast frequency plus many-core brute force to hold 60FPS+ minimums.
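As a sanity check on those numbers (a sketch assuming CPU-bound lows scale linearly with core clock, which is optimistic):

```python
# If the 42 fps stock lows scaled purely with core clock, 4.2 -> 4.7 GHz
# alone would give roughly:
stock_low_fps, stock_ghz, oc_ghz = 42, 4.2, 4.7
print(stock_low_fps * oc_ghz / stock_ghz)  # ~47 fps from frequency alone
# The observed 55-60 fps OC lows imply the 1866 -> 2200 memory/cache tuning
# carries the rest -- ACO leans on memory about as hard as raw clocks.
```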

https://youtu.be/8c9ePsmEVzU?t=72

^Early City Environment

At 4m:20s, 90-95% CPU usage out in the wild in some scenes.

Maximums - Asus Z97-K w/ i5 4690, BCLK @106.9MHz x39 = 4.17GHz, 8GB of 2600MHz DDR3, Gigabyte GTX970 G1-Gaming @ 1550MHz

 

I thought this game came with Denuvo and VMProtect, which hammer your CPU usage? Did they remove those?

| Intel i7-3770@4.2GHz | Asus Z77-V | Zotac 980 Ti Amp! Omega | DDR3 1800MHz 4GB x4 | 300GB Intel DC S3500 SSD | 512GB Plextor M5 Pro | 2x 1TB WD Blue HDD |
| Enermax NAXN82+ 650W 80Plus Bronze | Fiio E07K | Grado SR80i | Cooler Master XB HAF EVO | Logitech G27 | Logitech G600 | CM Storm Quickfire TK | DualShock 4 |

5 hours ago, SkilledRebuilds said:

My 4.7GHz 4790K with 4.4GHz L3 cache and CL11-12-12 2200MHz memory can dip to around 65fps, and can occasionally see sub-60 in the cities. Without that OC tuning, at stock (all-core 4.2GHz / 1866 DDR3), the cities were more like 42-45fps at the lows with near-full CPU usage.

My OC-tuned setup can still run into 80-90% usage in those city environments, but my dips are now 55-60fps with high CPU usage.

The locked 4790 just doesn't have the frequency (and is possibly paired with slower 1600MHz DDR3 on a non-Z board) for the demands ACO puts on the CPU.
The game is 8-thread aware and was built to run at 30FPS; on PC it takes legit fast frequency plus many-core brute force to hold 60FPS+ minimums.

That's exactly my setup; I've got 1600MHz RAM. I'm doing a huge upgrade soon though.

6 hours ago, Lord Vile said:

Why on earth would you get a 10900K... terrible value. 

 

Welcome to Ubisoft's DRM application, btw.

I want something that will last a really long time.

8 minutes ago, smeek14 said:

I want something that will last a really long time.

It will, lol. People will point to the 3900X with more cores for the same or slightly lower price, but it's only two more, and they're only useful if you actually need more cores. A 10c/20t chip will still hold up plenty fine. My 6950X is happily chomping away at games, and that's a 10c/20t chip from 2016; the 10900K is that plus a solid IPC and beefy clock headroom bump. Should keep chugging for a damn good while 👌. Just make sure to get appropriate cooling if you plan to manually OC; at stock they're pretty docile compared to past CPUs.

Intel HEDT and Server platform enthusiasts: Intel HEDT Xeon/i7 Megathread

Main PC
CPU: i9 7980XE @4.5GHz/1.22v/-2 AVX offset
Cooler: EKWB Supremacy Block - custom loop w/360mm + 280mm rads
Motherboard: EVGA X299 Dark
RAM: 4x8GB HyperX Predator DDR4 @3200MHz CL16
GPU: Nvidia FE 2060 Super/Corsair HydroX 2070 FE block
Storage: 1TB MP34 + 1TB 970 Evo + 500GB Atom30 + 250GB 960 Evo
Optical Drives: LG WH14NS40
PSU: EVGA 1600W T2
Case & Fans: Corsair 750D Airflow - 3x Noctua iPPC NF-F12 + 4x Noctua iPPC NF-A14 PWM
OS: Windows 11
Display: LG 27UK650-W (4K 60Hz IPS panel)
Mouse: EVGA X17
Keyboard: Corsair K55 RGB
Mobile/Work Devices: 2020 M1 MacBook Air (work computer) - iPhone 13 Pro Max - Apple Watch S3
Other Misc Devices: iPod Video (Gen 5.5E, 128GB SD card swap, running Rockbox), Nintendo Switch

1 hour ago, Zando Bob said:

It will, lol. People will point to the 3900X with more cores for the same or slightly lower price, but it's only two more, and they're only useful if you actually need more cores. A 10c/20t chip will still hold up plenty fine. My 6950X is happily chomping away at games, and that's a 10c/20t chip from 2016; the 10900K is that plus a solid IPC and beefy clock headroom bump. Should keep chugging for a damn good while 👌. Just make sure to get appropriate cooling if you plan to manually OC; at stock they're pretty docile compared to past CPUs.

Well, 2 more cores, an actual upgrade path, PCIe 4.0, and all for less money. Or you could get the 3700X for half the price and get exactly the same performance, because 4K = GPU bottleneck.

1 minute ago, Lord Vile said:

Well, 2 more cores, an actual upgrade path, PCIe 4.0, and all for less money. Or you could get the 3700X for half the price and get exactly the same performance, because 4K = GPU bottleneck.

And a 2080 Ti gets 'necked by PCIe 3.0 x8 by around 3%, and you can run PCIe 3.0 x16, meaning you'd need a GPU pushing roughly twice the bandwidth a 2080 Ti needs in games before PCIe 3.0 starts choking. 4.0 is handy if your board supports splitting down to more 3.0 lanes (and you actually need more lanes, else it's useless), or if you need the raw read/write speeds PCIe 4.0 NVMe SSDs offer. Otherwise it has little to offer, and it'll likely be a while before it's anywhere close to needed.
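For scale, the nominal link numbers (spec transfer rates with 128b/130b encoding; a sketch, not measured game throughput):

```python
# Nominal usable PCIe bandwidth per direction:
# lanes * GT/s * encoding efficiency / 8 bits per byte.

def pcie_gb_s(lanes: int, gt_per_s: float, encoding: float = 128 / 130) -> float:
    return lanes * gt_per_s * encoding / 8

print(pcie_gb_s(8, 8.0))    # PCIe 3.0 x8:  ~7.9 GB/s (the ~3% hit case)
print(pcie_gb_s(16, 8.0))   # PCIe 3.0 x16: ~15.8 GB/s
print(pcie_gb_s(16, 16.0))  # PCIe 4.0 x16: ~31.5 GB/s
```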

OP specifically said they're getting a 10900K because they "want something that will last a really long time". So I doubt an upgrade path is a massive concern if you buy a CPU for the express purpose of not needing to upgrade for a while. Given that my 10c/20t chip is still damn solid, I think it'll be a while before one 5 years newer starts choking. 

Less money is a valid point if the OP cares about that cost difference. If not, then it isn't really a factor. 

31 minutes ago, Zando Bob said:

And a 2080 Ti gets 'necked by PCIe 3.0 x8 by around 3%, and you can run PCIe 3.0 x16, meaning you'd need a GPU pushing roughly twice the bandwidth a 2080 Ti needs in games before PCIe 3.0 starts choking. 4.0 is handy if your board supports splitting down to more 3.0 lanes (and you actually need more lanes, else it's useless), or if you need the raw read/write speeds PCIe 4.0 NVMe SSDs offer. Otherwise it has little to offer, and it'll likely be a while before it's anywhere close to needed.

OP specifically said they're getting a 10900K because they "want something that will last a really long time". So I doubt an upgrade path is a massive concern if you buy a CPU for the express purpose of not needing to upgrade for a while. Given that my 10c/20t chip is still damn solid, I think it'll be a while before one 5 years newer starts choking. 

Less money is a valid point if the OP cares about that cost difference. If not, then it isn't really a factor. 

But if you want it to last into the future, PCIe 4.0 is a good thing.

 

Well, the 10900K is on a dead platform with PCIe 3.0, which will likely be a bottleneck within the next 2 GPU releases, so...

 

What's the point in paying 2x more for the same performance? 4K will always be GPU bottlenecked unless you're playing something like CSGO, and even then you're pushing 700FPS; does the extra 100 really matter?

4 hours ago, Lord Vile said:

But if you want it to last into the future, PCIe 4.0 is a good thing.

 

Well, the 10900K is on a dead platform with PCIe 3.0, which will likely be a bottleneck within the next 2 GPU releases, so...

 

What's the point in paying 2x more for the same performance? 4K will always be GPU bottlenecked unless you're playing something like CSGO, and even then you're pushing 700FPS; does the extra 100 really matter?

Hmm, yeah, I'll be getting the 2022 release of GPUs; I upgrade them every 4 years. Will I get more performance with PCIe 4.0 in the future? From what I'm hearing, AMD Ryzen offers it but Intel doesn't?

AC:O is really strange.

If I have my character at one of the gates of Athens, my GPU is at 2070MHz looking into the city, but if I look out to the countryside it is only running at 1920MHz. The frame rate is the same at 60 since it is synced for HDR. I get 75 to 85% GPU usage when doing this, and only hit 98% if my character is standing on the shore looking out to sea.

Sometimes I think I am CPU bound and other times I think I am GPU bound, but since the frame rate is the same and it is a totally smooth experience, I don't think it matters.

RIG#1 CPU: AMD R7 5800X3D | Motherboard: X570 AORUS Master | RAM: Corsair Vengeance RGB Pro 32GB DDR4 3200 | GPU: EVGA FTW3 ULTRA RTX 3090 Ti | PSU: EVGA 1000 G+ | Case: Lian Li O11 Dynamic | Cooler: EK 360mm AIO | SSD#1: Corsair MP600 1TB | SSD#2: Crucial MX500 2.5" 2TB | Monitor: ASUS ROG Swift PG42UQ

RIG#2 CPU: Intel i9 11900k | Motherboard: Z590 AORUS Master | RAM: Corsair Vengeance RGB Pro 32GB DDR4 3600 | GPU: EVGA FTW3 ULTRA RTX 3090 Ti | PSU: EVGA 1300 G+ | Case: Lian Li O11 Dynamic EVO | Cooler: Noctua NH-D15 | SSD#1: Corsair MP600 1TB | SSD#2: Crucial MX300 2.5" 1TB | Monitor: LG 55" 4K C1 OLED TV

RIG#3 CPU: Intel i9 10900kf | Motherboard: Z490 AORUS Master | RAM: Corsair Vengeance RGB Pro 32GB DDR4 4000 | GPU: MSI Gaming X Trio 3090 | PSU: EVGA 1000 G+ | Case: Lian Li O11 Dynamic | Cooler: EK 360mm AIO | SSD#1: Crucial P1 1TB | SSD#2: Crucial MX500 2.5" 1TB | Monitor: LG 55" 4K B9 OLED TV

RIG#4 CPU: Intel i9 13900k | Motherboard: AORUS Z790 Master | RAM: Corsair Dominator RGB 32GB DDR5 6200 | GPU: Zotac Amp Extreme 4090 | PSU: EVGA 1000 G+ | Case: Streacom BC1.1S | Cooler: EK 360mm AIO | SSD: Corsair MP600 1TB | SSD#2: Crucial MX500 2.5" 1TB | Monitor: LG 55" 4K B9 OLED TV

5 hours ago, smeek14 said:

Hmm, yeah, I'll be getting the 2022 release of GPUs; I upgrade them every 4 years. Will I get more performance with PCIe 4.0 in the future? From what I'm hearing, AMD Ryzen offers it but Intel doesn't?

Intel most likely won't offer it until next year. AMD 4000 chips are coming, though, so you might wanna hold fire for them.

13 hours ago, Lord Vile said:

Intel most likely won't offer it until next year. AMD 4000 chips are coming, though, so you might wanna hold fire for them.

Yeah, I think that's what I've decided to do. Sadly I'll have to wait 3 more months :(
