
Are these "OK" temps for a Ryzen 7 5800X?

WeedIsMySin

I previously made a post asking how to reduce the default power limit to 105 watts, and thanks to you guys I think I know how. I haven't done it yet, though.

 

Everything is stock. All I've done is enable XMP in the BIOS.

 

Full list of specs:

Asus Tuf RTX 3070 OC

Ryzen 7 5800X cooled with a Cooler Master ML240L RGB V2

32GB (2x16GB) HyperX Fury at 3200MHz

MSI X570 Gaming Plus

750W Cooler Master PSU

 

It's built in an Asus TUF Gaming GT501 and has a total of six fans: one 140mm in the back, three in the front, and two mounted on top.

 

I'm pretty sure my airflow is good, considering my GPU doesn't go over 65°C at 99% usage, no matter how long I play.

 

In none of the games I've played (RDR2, Shadow of the Tomb Raider, Cyberpunk 2077) have I ever seen the CPU reach 80°C. Usually it sits in the high 60s to mid 70s, though it got close a few times.

 

So I thought to myself: what's the most CPU-demanding game that I own? Assassin's Creed Origins.

 

Thanks to Ubisoft's atrocious DRM, it's quite heavy on the CPU. I've had issues with GPU usage in this game on the previous two systems I owned, and this one is no exception.

Only this time I measured the CPU temp to see if it spiked when the GPU usage dropped. And yes, it did. The highest I've seen is 84°C, but only for a second or two before it settles back to the mid 70s.
This also only happens during the first few minutes of gameplay. I don't know why.
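If you want to check whether those spikes actually line up with the GPU-usage drops, one option is to log sensors to CSV with HWiNFO and scan the log afterwards. A rough sketch (the column names here are placeholders; match them to whatever your logging tool actually writes):

```python
import csv
import io

def find_temp_spikes(csv_text, temp_col="CPU (Tctl/Tdie) [°C]", threshold=84.0):
    """Return (time, temp) rows from a sensor CSV log where the CPU
    temperature meets or exceeds the threshold."""
    spikes = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        try:
            temp = float(row[temp_col])
        except (KeyError, ValueError):
            continue  # skip malformed or non-numeric rows
        if temp >= threshold:
            spikes.append((row.get("Time", "?"), temp))
    return spikes

# Tiny inline sample standing in for a real log export
sample = """Time,CPU (Tctl/Tdie) [°C]
12:00:01,72.5
12:00:02,84.3
12:00:03,76.1
"""
print(find_temp_spikes(sample))  # [('12:00:02', 84.3)]
```

Timestamps of the spikes can then be compared against the GPU-usage column from the same log to see if the two events coincide.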

 

I've added some screenshots of an AIDA64 stress test that ran for about 15-20 minutes. Idle temp was 36°C. Under 100% load it reached as high as 91°C. I know the AMD rep said that this chip is designed to run at temps up to 90°C.

 

I found the exact quote:

"Yes. I want to be clear with everyone that AMD views temps up to 90C (5800X/5900X/5950X) and 95C (5600X) as typical and by design for full load conditions. Having a higher maximum temperature supported by the silicon and firmware allows the CPU to pursue higher and longer boost performance before the algorithm pulls back for thermal reasons," Hallock said.

 

I also added a Cinebench R23 multi-core benchmark result. I think it was 14970.

 

Is this something I should be worried about?

 

 

 

The company that built this computer for me is super chill about upgrading and replacing parts, so I could just swap some parts out if I wanted to.

 

Also, I read online that the default power limit is actually 142 watts, instead of the 105-watt TDP the chip is rated at, and that I should expect cooler results if I capped it at 105W.

So I will probably do that.
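For context on where those two numbers come from: on stock AM4 parts, AMD's package power limit (PPT) is set to 1.35x the rated TDP, which is exactly the 105W-to-142W relationship. A quick sanity check:

```python
# AMD's stock PPT (Package Power Tracking) limit on AM4 is 1.35x the rated TDP.
tdp_watts = 105
ppt_watts = tdp_watts * 1.35   # 141.75, quoted as "142 W"
print(round(ppt_watts))  # 142
```

So capping the socket at 105W really is a cut of roughly a quarter of the stock power budget, which is why people report noticeably cooler results.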

 

But what are your opinions on these stock temps at full load?

I'd also like to hear what you think about the AC Origins thing. Thanks, guys.

 

Also, if you think I should replace something in this PC, feel free to let me know.

 

 

 

 

 

 

Schermopname (6).png

Schermopname (12).png

Schermopname (8).png

Schermopname (15).png

Schermopname (16).png


I think you need a better cooler and more airflow. My 5800X at full load just breaks 82°C after looping Cinebench R23. That's with a -20 all-core undervolt using the curve optimizer and with PBO enabled. It maintains 4.7GHz and 147W according to HWiNFO.

 

Using an EK 360 AIO in the top of an O11D Mini as exhaust.

5800x/3090


I run mine at 105W, everything else on auto, with a Noctua NH-D15S, and the max I see is 84°C, usually 74-80°C at most when gaming. At the default 142W it hit 92-93°C; I wasn't comfortable with that.

Ryzen 7 5800X w/Noctua NH-D15S / Gigabyte RTX 2070 Super / 32GB Vengeance 3200 (4x8) / MSI X470

Sabrent Rocket 512GB m.2 / 2x1TB T-Force Vulcan SATA SSD / Seasonic Focus GX-750

Dell S3220DGF - curved 32" 1440p 165Hz

 

Ryzen 5 3600 w/Stealth / MSI RTX 2060 Gaming Z / 16GB Vengeance 2666 / MSI X470

Intel 660p 1TB / 1TB T-Force Vulcan SATA SSD / Thermaltake Smart 600W 80+

HP 32Q - 32" 1440p 60Hz


40 minutes ago, rickeo said:


Yeah, a better cooler might help. However, I have to disagree on the airflow part. Everything else in my system runs perfectly cool.


Tbh, as long as it isn't regularly hitting 90°C you should be fine.

 

Personally I prefer 80°C and under, but that's just because I don't want my system to crash, since I like overclocking the sht out of my PC.


There are two schools of thought about temperature: the old school and the new school. The old school says heat is the enemy: you worry about anything running above 75°C for a protracted period, and the cooler the better, because heat wears out solid-state parts. Then there's the new school, which says "run it as high as it will run and don't worry about it."

The argument for the new school, even using old-school thinking, is that while heat may wear the chip, the thing is going to be obsolete inside 10 years no matter what (and a laptop isn't even going to last that long), so worrying about temps to make it last 30 years is pointless. Are there people that do better? Probably. You're not thermal throttling though, so it becomes a question of whether you want to make the thing last a really long time. If the answer is no, there isn't a lot of point.

My machine was put together about 6 years ago and I followed the old-school thought, plus I never got the OC I wanted, which meant I had too much cooler. So when I built it I was seeing torture-test numbers in the high 60s, maybe the occasional 71°C. In that time my run temps have climbed a bit; I saw a 73°C recently. So there is still a bit of wear from 2014 to 2021, even at those low temps. If you're getting a 91°C max, you've got what, 4°C before you start to have issues? It took me 6 years to see not even that much rise. You're starting higher than me, though.

Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.


1 hour ago, Bombastinator said:


Thanks for replying. I'm not planning on using this CPU for any longer than 2 years tops, probably 1 year. You're saying I'm not thermal throttling, which is good.

 

But if you don't mind me asking, what's your opinion on the AC Origins temp spike? I saw these GPU-usage drops on my other systems as well.

 

Do you think the GPU usage dropped BECAUSE the CPU spiked to 84°C? Do you think they're related? Because the AMD rep said that up to 90°C was fine.

 

This is the only game that does this. To be fair, I knew this particular game was going to run hotter than others, because I remember the DRM debate when it launched in 2017. The DRM apparently takes up 30-40% CPU usage.

 

Thanks for your time.


4 minutes ago, WeedIsMySin said:


Not a clue. I'm not even sure what you mean.

Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.


  • 1 month later...
On 9/9/2021 at 6:15 AM, WeedIsMySin said:


The GPU usage may be dropping simply because the CPU gets more load, leaving it less headroom to "feed" the graphics card. It doesn't have to be the processor throttling; it may just be that the CPU is hit harder in some moments. Basically, your GPU may be bottlenecked by the CPU in this specific game, which doesn't mean it will behave the same in other games or programs.

