Apps not utilizing full CPU potential?

mao91

See signature for system specs. 

Basically, almost no matter what I'm doing, I'm not seeing full CPU utilization during media encoding (video, audio, whatever) in any app, whether it's Screencast-o-matic, anything from Adobe, Pro Tools, Reaper, etc. Whenever I'm encoding I'm always going from one SSD to another, so there shouldn't be a big bottleneck there either (Resource Monitor shows disk utilization averaging under 10%). In this instance, I'm encoding a tutorial video from Screencast-o-matic for work. CPU utilization fluctuates between 16% and 25%, but doesn't go much higher (see screenshot). This has been happening ever since I built my PC, and I'm not sure how to get it to really use its full potential.

Does anyone know why this may be?
 

[Screenshot attachment: cpu utilization 1.png]

If what I'm posting has already been posted, I'm sorry.


WELCOME

 

To the reason why CPUs like the FX-6300 don't game well, and the general reason why apps are slow in the first place!

 

Let's take the 6300 and look at its raw compute performance:

6 cores at 4 GHz is 24 billion cycles per second in total; assuming roughly 4 floating point operations per cycle (don't quote me here), that gives a total of about 96 GFLOPS.

 

That means, in theory, the 6300 can do 96 BILLION floating point operations per second.

 

But software can't actually make use of all that, hence the issues we all deal with.
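The back-of-the-envelope arithmetic above can be sketched quickly; note the 4 FLOPs-per-cycle figure is the poster's own assumption, not a measured number:

```python
# Rough theoretical peak throughput: cores * clock * FLOPs per cycle.
# FLOPs-per-cycle varies by architecture; 4 is just an assumption here.
def peak_gflops(cores: int, clock_ghz: float, flops_per_cycle: int = 4) -> float:
    return cores * clock_ghz * flops_per_cycle

print(peak_gflops(6, 4.0))  # FX-6300-style: 6 cores at 4 GHz -> 96.0 GFLOPS
```

Real-world encoders rarely get anywhere near a theoretical peak like this.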

 


1 minute ago, Stardar1 said:

WELCOME

 

To the reason why CPUs like the FX-6300 don't game well, and the general reason why apps are slow in the first place!

 

Let's take the 6300 and look at its raw compute performance:

6 cores at 4 GHz is 24 billion cycles per second in total; assuming roughly 4 floating point operations per cycle (don't quote me here), that gives a total of about 96 GFLOPS.

 

That means, in theory, the 6300 can do 96 BILLION floating point operations per second.

 

But software can't actually make use of all that, hence the issues we all deal with.

 

I understand the concept, but I've seen other systems that don't have this issue. I'm wondering if there's a certain way I should configure my system on the software side (perhaps a codec issue) to work around this.


1 minute ago, B.Toast said:

I understand the concept, but I've seen other systems that don't have this issue. I'm wondering if there's a certain way I should configure my system on the software side (perhaps a codec issue) to work around this.

I'm sure there is, but I wouldn't know.


I had a friend with similar specs who bought Creative Cloud. He called Adobe and they essentially replied, "We know it's an issue. We're not doing anything about it at this time."


Generally speaking you should not have to set up any kind of special system configuration. Out of the box, both my i5 and i7 CPUs hit 100% utilization during a render workload. Make sure all drivers are up to date and working correctly, and perhaps check your BIOS settings to make sure all your cores are enabled. You have a 5820K, correct?
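As a quick sanity check along these lines, the standard library can at least report how many logical processors the OS sees (a minimal sketch; the 12-thread figure assumes a stock 5820K with Hyper-Threading enabled):

```python
import os

# Number of logical processors the OS can schedule work on.
logical = os.cpu_count()
print(f"Logical processors visible to the OS: {logical}")

# A 5820K has 6 physical cores / 12 threads with Hyper-Threading on,
# so a value below 12 on that chip suggests Hyper-Threading (or some
# cores) is disabled in the BIOS.
if logical is not None and logical < 12:
    print("Fewer than 12 logical processors - worth checking BIOS settings.")
```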


1 minute ago, Codyman125 said:

I had a friend with similar specs who bought Creative Cloud. He called Adobe and they essentially replied, "We know it's an issue. We're not doing anything about it at this time."

I've seen it use 100% on the 4790k system at my work, so I don't think this is it.

 

Just now, GrissmIN said:

Generally speaking you should not have to set up any kind of special system configuration. Out of the box, both my i5 and i7 CPUs hit 100% utilization during a render workload. Make sure all drivers are up to date and working correctly, and perhaps check your BIOS settings to make sure all your cores are enabled. You have a 5820K, correct?

All cores are enabled; see the screenshot above. Hyper-Threading is being a pain in the neck to get running as well, though. I'm sure there's a particular encoding codec I should be using, but I wouldn't know.


When rendering video you shouldn't need to use a specific codec. Codecs like H.264 are built and designed for streaming video or creating distributed content. AVCHD is another lossy codec, used for capturing video on devices like a DSLR or camcorder. Then there's something like Apple ProRes, which is as close as you can get to RAW video being stored on something like an SSD. My point is that the codec you are using should not affect how much your CPU is being used in any way.


If we go down the list of potential bottlenecks, it doesn't appear you really have one. Do you have a GPU installed that may be taking up some of the slack with GPU acceleration enabled? That decreases my CPU usage significantly where it applies on my 6700K. Perhaps the 5820K is spreading the work out across all its threads so it doesn't register as a high usage percentage, especially when paired with a GPU? Just thoughts that I'm throwing down.

 

