
I've found a purely gaming situation that uses more than 8 cores' worth of Zen 3

YoungBlade

I've been doing various things with games to see how much usage I can push out of my 5900X. In most situations, the total usage doesn't break 33% according to HWiNFO64, which fits with the understanding that 4c/8t CPUs are still okay for a lot of games (33% of a 12-core chip is about 4 cores' worth). When it does go over that in more CPU-demanding titles, it generally doesn't break 50%, so the idea that "6c/12t is enough for modern gaming" seems to hold true. Even the Ashes of the Singularity: Escalation CPU benchmark usually sits at just 55% and only sometimes climbs into the mid 60s, which implies a 5800X would perform nearly identically. If even that benchmark can't break 67% (8 cores' worth), then surely 8c/16t is more than enough for gamers.

 

To be clear, it's not that some cores are seeing 0% usage - all of the cores get some usage in all of these scenarios - but the total level of CPU headroom afforded by a 5600X is basically never exceeded in games, and the headroom of the 5800X is straight up never exceeded, which is what I expected to find. No game should need more than that.

 

Except...

 

When loading Metro Exodus Enhanced Edition from my NVMe drive, my 5900X not only stayed above 50% the whole time, but it briefly reached 70.8% total usage. That is 8.5 cores' worth of work (including SMT), which means that, if I had a 5800X, it would have been maxed out at 100%. The loading would have taken longer on an 8c/16t CPU of the same architecture.
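
For anyone who wants to sanity-check the math, here's a quick back-of-the-envelope sketch in Python (my own illustration, nothing official) of how a total-usage percentage translates into cores' worth of work on a 12-core part:

```python
# Rough conversion of HWiNFO-style total CPU usage % into "cores' worth".
# Assumes the load could be consolidated evenly, so 100% = all 12 cores
# (24 threads) fully busy.

PHYSICAL_CORES = 12  # Ryzen 9 5900X

def cores_worth(total_usage_pct: float, cores: int = PHYSICAL_CORES) -> float:
    """Translate a total-usage percentage into equivalent busy cores."""
    return total_usage_pct / 100 * cores

for pct in (33.0, 50.0, 67.0, 70.8):
    print(f"{pct:5.1f}% of {PHYSICAL_CORES} cores ~ {cores_worth(pct):.1f} cores' worth")
# 70.8% of 12 cores ~ 8.5 -> more than an 8-core 5800X can offer.
```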

 

I didn't have anything else going on in the background, so this is just the game and Windows using over 8 cores' worth of Zen 3. Not a lot more, but any amount over that was unexpected for me.

 

Granted, this is not happening in-game (actual gameplay doesn't break 40% usage for me), but loading up a game is part of playing it, and I definitely notice a huge difference in loading time compared to my previous CPU, a 9600K that absolutely did max out when loading that game. I used to get up and go get a drink or something while the game loaded, because it took so long. To be fair, I'm guessing I wouldn't notice a difference between this and a 5800X without timing it, but this is technically a situation where a game alone wants more than 8 cores from a modern CPU.
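
(If anyone wants to watch this without staring at HWiNFO, below is roughly the script I'd use - a minimal sketch using the psutil library, which you'd need to install first; the cores'-worth math mirrors the sketch above.)

```python
# Minimal CPU-usage logger: start it, load the game, Ctrl+C to stop.
# Requires psutil (pip install psutil).
import psutil

peak = 0.0
try:
    while True:
        # Blocks for 0.5s and returns average total usage over that window.
        total = psutil.cpu_percent(interval=0.5)
        peak = max(peak, total)
        print(f"total: {total:5.1f}%  (peak so far: {peak:5.1f}%)")
except KeyboardInterrupt:
    # Fall back to logical count if physical count is unavailable.
    cores = psutil.cpu_count(logical=False) or psutil.cpu_count()
    print(f"\npeak {peak:.1f}% ~ {peak / 100 * cores:.1f} cores' worth on {cores} cores")
```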


I've been doing similar stuff trying to push my 12 cores to the limit, but the game I found was a bit different. Minecraft, of all things, is what actually brought my CPU to its knees, granted not under normal gameplay: I tried to spawn in a fireball with a radius of 1000, and it hit a peak of 97% usage across the entire CPU. That's not a regular in-game load, but it shows even Minecraft can saturate a 12-core chip.
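
That tracks: the number of blocks an explosion touches grows roughly with the cube of its radius, so radius 1000 is astronomically more work than a default fireball. A rough Python illustration (a sphere-volume estimate of my own, not Minecraft's actual ray-based explosion algorithm):

```python
# Why a radius-1000 fireball hurts: affected volume grows as r^3.
# Back-of-the-envelope only; Minecraft's real explosion code is ray-based.
from math import pi

for r in (1, 10, 100, 1000):
    blocks = 4 / 3 * pi * r**3  # volume of a sphere of radius r
    print(f"radius {r:>4}: ~{blocks:,.0f} block positions to consider")
# radius 1000 -> ~4.2 billion positions; no wonder 12 cores hit 97%.
```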


46 minutes ago, YoungBlade said:

so the idea that "6c/12t is enough for modern gaming" seems to hold true.

It always has been, ever since Ryzen launched and Vulkan and DX12 finally gained steam in many titles.



1 hour ago, YoungBlade said:

To be fair, I'm guessing I wouldn't notice a difference between this and a 5800X without timing it, but this is technically a situation where a game alone wants more than 8 cores from a modern CPU.

I wonder how it would all fare if you went from your 2060-series card to a 3080/3090, since more frames = higher CPU usage, and the Nvidia scheduler behind the scenes also pushes CPU usage higher when more frames are needed. (You can simulate this a little with 720p-900p/1080p DLSS, but that can skew draw calls, which shift in value with resolution changes in some engines.)

I've seen Cyberpunk use 60-75% (spikes) of my 10 cores and lean heavily into HT, but I forced it with settings: a 10850K at 5.1GHz with 4266MHz CL17 RAM, chasing above 120FPS with a 2080 Ti at High details and 1080p DLSS. Like you, I wanted to see how the CPU fared when tasked with providing a high amount of frames while still using High details for draw calls, since lowering details would shift the results again.
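
(To put rough numbers on the "more frames = higher CPU usage" point, here's a tiny Python sketch of my own - just arithmetic, not profiler data - showing how the CPU's per-frame time budget shrinks as the FPS target climbs:)

```python
# Per-frame CPU time budget at various frame-rate targets.
# Chasing 120+ FPS halves the time the CPU gets per frame vs 60 FPS,
# so simulation and draw-call submission must finish in half the time.

for fps in (60, 120, 144, 240):
    budget_ms = 1000 / fps
    print(f"{fps:>3} FPS target -> {budget_ms:5.2f} ms of CPU time per frame")
```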


 


Cyberpunk 2077 easily scales to 12c/24t. That's more the exception than the rule right now, but I expect games to start really making use of more cores within the next year or two. So much of this is driven by the console market, because that's the closest thing to a design target that exists. There are so many variations of PC hardware that you can't really design a game around any particular configuration. You can, however, design a game for a specific piece of console hardware and then port it to PCs of roughly equivalent performance. This is the first console generation where devs have had more to play with than 7 relatively slow threads (the PS5 and Series X have 8 Zen 2 cores with SMT), so as they start making use of that newfound performance, it's going to spill over into PC game development as well.



8 minutes ago, SkilledRebuilds said:

I wonder how it would all fare if you went from your 2060-series card to a 3080/3090, since more frames = higher CPU usage, and the Nvidia scheduler behind the scenes also pushes CPU usage higher when more frames are needed.

That is true. The 2060 Super isn't exactly weak, but yes, the GPU is holding the CPU back. It's at 98%+ in all the modern titles, especially because I'm playing at 1440p.

 

Once graphics card prices are sane again and I can upgrade, I'm sure I'll do more experiments to see how far I can push things.

