Heavily underclocking CPU boosts FPS by a lot ???

Hello good people. I have discovered this weird thing and I hope that you guys can help me understand.

Namely, I have two profiles in Ryzen Master: one for gaming, and one with low clock speeds and low voltage for lower temperatures. I switch between them depending on what I'm doing.

Profile 1: 2825MHz @ 0.975V
Profile 2: 4475MHz @ 1.25V
One time I forgot to switch profiles while gaming. At that specific moment and place I had 85FPS, but when I switched to the gaming profile, the FPS dropped to 75!
Does anyone have any idea how a HEAVILY UNDERCLOCKED CPU actually boosted my FPS by 10?

Here's a YouTube video I've uploaded as proof:

System config:
GPU: RTX 3080
CPU: R5 5600X
MOBO: B450
RAM: 2x3200MHz CL16
OS installed on SATA SSD
Games installed on NVMe SSD
WinVer 21H1





Edit: I can confirm this just happened again in another game, dropping from 93 to 80.

Thank you in advance!


Just a theory, but I can see this. Vertical sync turned off = higher FPS, but with higher CPU load. Underclocking the CPU keeps it cooler, so it can handle that extra load more consistently. Likely the CPU was doing some moderate thermal throttling before being underclocked, which would explain the FPS increase that came with the resulting cooler temps.

Edited by An0maly_76
Revised, more info

I don't badmouth others' input, I'd appreciate others not badmouthing mine. *** More below ***

 

MODERATE TO SEVERE AUTISTIC, COMPLICATED WITH COVID FOG

 

Due to the above, I've likely revised posts <30 min old, and do not think as you do.

THINK BEFORE YOU REPLY!


Notice the GPU usage also went down as FPS increased. The game seems to be scaling with reduced hardware capability. I don't recognize the game, so I'm not sure.

CPU: i7-2600K 4751MHz 1.44V (software) --> 1.47V at the back of the socket Motherboard: Asrock Z77 Extreme4 (BCLK: 103.3MHz) CPU Cooler: Noctua NH-D15 RAM: Adata XPG 2x8GB DDR3 (XMP: 2133MHz 10-11-11-30 CR2, custom: 2203MHz 10-11-10-26 CR1 tRFC:230 tREFI:14000) GPU: Asus GTX 1070 Dual (Super Jetstream vbios, +70(2025-2088MHz)/+400(8.8Gbps)) SSD: Samsung 840 Pro 256GB (main boot drive), Transcend SSD370 128GB PSU: Seasonic X-660 80+ Gold Case: Antec P110 Silent, 5 intakes 1 exhaust Monitor: AOC G2460PF 1080p 144Hz (150Hz max w/ DP, 121Hz max w/ HDMI) TN panel Keyboard: Logitech G610 Orion (Cherry MX Blue) with SteelSeries Apex M260 keycaps Mouse: BenQ Zowie FK1

 

Model: HP Omen 17 17-an110ca CPU: i7-8750H (0.125V core & cache, 50mV SA undervolt) GPU: GTX 1060 6GB Mobile (+80/+450, 1650MHz~1750MHz 0.78V~0.85V) RAM: 8+8GB DDR4-2400 18-17-17-39 2T Storage: HP EX920 1TB PCIe x4 M.2 SSD + Crucial MX500 1TB 2.5" SATA SSD, 128GB Toshiba PCIe x2 M.2 SSD (KBG30ZMV128G) gone cooking externally, 1TB Seagate 7200RPM 2.5" HDD (ST1000LM049-2GH172) left outside Monitor: 1080p 126Hz IPS G-sync

 

Desktop benching:

Cinebench R15 Single thread:168 Multi-thread: 833 

SuperPi (v1.5 from Techpowerup, PI value output) 16K: 0.100s 1M: 8.255s 32M: 7m 45.93s


The Afterburner overlay there isn't particularly helpful. There just isn't enough information. Can you add the per thread utilization, so we can see how the threads of the CPU are being leveraged? In theory, if the downclock is working properly, we should see increased utilization vs with Profile 2, because the cores have to use more of their available resources, as the available resources are now decreased.

 

Also, I think you're missing the significance of having only 2 cores at 2825MHz while the rest are at 600MHz. That's a potentially confounding variable in your testing.

 

My hypothesis is that, since you limited boosting to your two best cores, the game might have its resources redirected to those two and only those two. This would mean that the game's thread(s) are no longer bouncing between cores, which can incur a performance penalty. So if for Profile 2, the world thread is constantly switching between the 6 cores, the overhead associated with task switching, as the game's data is pushed in and out of the L1 and L2 caches of the relevant cores, could result in worse performance.

 

Another factor I'm considering here is the Nvidia driver, which uses the CPU for scheduling and is itself multi-threaded. It could be that in your Profile 1 configuration, the Nvidia driver is forced onto the slower 600MHz cores, which is still enough resources for the driver, but now that driver is also not having to deal with task switching.

 

Basically, just like how turning off HT/SMT can sometimes boost gaming performance, how it was common to see improvements from AMD's "Game Mode" that would disable cores to give you a single coherent L3 cache back in the early Zen days, and how using Process Lasso to lock a process to certain cores can sometimes help, it's possible that it's actually beneficial for this particular game to soft-limit it to just 2 cores and 4 threads. The other cores are still there and available, so when the game needs additional resources, it can still use them, but it's less likely to task switch.
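To make the affinity idea concrete, here's a minimal sketch of pinning a process to specific cores, the rough equivalent of what Process Lasso does. It's illustrative only: `pin_to_cores` is a made-up helper name, and `os.sched_setaffinity` is Linux-only (on Windows you'd use Process Lasso itself or `start /affinity`).

```python
import os

def pin_to_cores(cores):
    """Restrict the calling process (pid 0 = self) to the given logical
    CPUs, then read the mask back to confirm the kernel accepted it."""
    os.sched_setaffinity(0, set(cores))
    return os.sched_getaffinity(0)

# e.g. pin_to_cores([0, 1]) keeps every thread of this process on the first
# two logical CPUs, so the game's threads stop migrating between cores and
# their working set stays warm in those cores' L1/L2 caches.
```

Fewer migrations means fewer cold-cache penalties, which is exactly the effect described above.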


1 hour ago, An0maly_76 said:

Just a theory, but I can see this. Vertical sync turned off = higher FPS, but with higher CPU load. Underclocking the CPU keeps it cooler, so it can handle that extra load more consistently. Likely the CPU was doing some moderate thermal throttling before being underclocked, which would explain the FPS increase that came with the resulting cooler temps.

Gotcha. One small problem though - I was switching from the low profile to the high profile. If I am running a low profile at low voltage and low clocks, resulting in low temperatures, how can switching to the gaming profile make the CPU throttle straight away? Shouldn't it take time for it to reach some sort of power limit before it starts throttling?

 

(I am not at all saying your theory is bad, just trying to understand all this and connect the dots.)


1 hour ago, Jurrunio said:

Notice the GPU usage also went down as FPS increased. The game seems to be scaling with reduced hardware capability. I don't recognize the game, so I'm not sure.

The game in the video is Assassin's Creed Valhalla, but I tested it afterwards with Rise of the Tomb Raider and the results were the same. I will have to check some settings when I come back from work.


1 hour ago, Jeppes said:

Weird results usually come from bad testing.

Define bad testing?

I have used cinebench r20 and r23 for stability, results were very consistent. Care to enlighten me about my mistake and what I should have done differently?


Just now, JaYZaaR said:

Define bad testing?

I have used cinebench r20 and r23 for stability, results were very consistent. Care to enlighten me about my mistake and what I should have done differently?

Comparing against stock is the first thing to do, as bad overclocks can hurt performance. The second thing is to test with something that is consistent; some games have varying FPS by design.
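The consistency point can be sketched with a few lines of Python. The FPS numbers and the 2x-noise rule of thumb below are made up for illustration; the idea is simply that a difference between two profiles only means something if it exceeds the benchmark's own run-to-run spread.

```python
from statistics import mean, stdev

def differs(runs_a, runs_b):
    """Crude check: treat a gap between two sets of FPS runs as real only
    if it is larger than twice the noisier set's standard deviation."""
    gap = abs(mean(runs_a) - mean(runs_b))
    noise = max(stdev(runs_a), stdev(runs_b))
    return gap > 2 * noise

# A repeatable benchmark: three tight runs per profile, 10 FPS apart.
print(differs([85, 86, 84], [75, 76, 74]))  # True: the gap is real
# A noisy run (free play in a varying scene): similar averages, but the
# spread swamps the gap.
print(differs([85, 70, 95], [75, 90, 60]))  # False: could just be noise
```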


1 hour ago, JaYZaaR said:

Gotcha. One small problem though - I was switching from the low profile to the high profile. If I am running a low profile at low voltage and low clocks, resulting in low temperatures, how can switching to the gaming profile make the CPU throttle straight away? Shouldn't it take time for it to reach some sort of power limit before it starts throttling?

 

(I am not at all saying your theory is bad, just trying to understand all this and connect the dots.)

I think you're getting my theory backwards. I was referring to the difference in CPU performance when thermal throttling versus NOT thermal throttling (cooler = faster basically). Stay with me here...

 

Vertical synchronization caps FPS to stabilize for low refresh rates.

When vertical synchronization is turned off, your FPS will increase, but the CPU gets loaded more to do it.

Overclocking the CPU makes it faster, but kicks temps up. Therefore, underclocking will make it run cooler.

Cooler temps = better performance, because the processor doesn't have to throttle back for thermals.

 

Therefore, when you underclocked your CPU, that lowered the temp...

Which reduced the need for thermal throttling...

Which enabled the CPU to better handle the load of vertical synchronization.

 

Think of it like this. Vertical synchronization caps your FPS to compensate for displays that can't handle over 60 FPS. BUT...

 

For those displays that CAN handle over 60, you can turn it off, which kicks some of the graphics rendering load over to the CPU. Ergo, you get higher FPS, but it increases your CPU usage. However, a CPU that is overclocked, under high load, or otherwise already thermal throttling won't be able to help much, because it's trying to protect itself. But when you underclocked, you brought the temp down and gave it some room to do so. More or less, the underclock 'supercharged' the CPU's ability to handle vertical synchronization.

 


1 hour ago, YoungBlade said:

The Afterburner overlay there isn't particularly helpful. There just isn't enough information. Can you add the per thread utilization, so we can see how the threads of the CPU are being leveraged? In theory, if the downclock is working properly, we should see increased utilization vs with Profile 2, because the cores have to use more of their available resources, as the available resources are now decreased.

 

Also, I think you're missing the significance of having only 2 cores at 2825MHz while the rest are at 600MHz. That's a potentially confounding variable in your testing.

 

My hypothesis is that, since you limited boosting to your two best cores, the game might have its resources redirected to those two and only those two. This would mean that the game's thread(s) are no longer bouncing between cores, which can incur a performance penalty. So if for Profile 2, the world thread is constantly switching between the 6 cores, the overhead associated with task switching, as the game's data is pushed in and out of the L1 and L2 caches of the relevant cores, could result in worse performance.

 

Another factor I'm considering here is the Nvidia driver, which uses the CPU for scheduling and is itself multi-threaded. It could be that in your Profile 1 configuration, the Nvidia driver is forced onto the slower 600MHz cores, which is still enough resources for the driver, but now that driver is also not having to deal with task switching.

 

Basically, just like how turning off HT/SMT can sometimes boost gaming performance, how it was common to see improvements from AMD's "Game Mode" that would disable cores to give you a single coherent L3 cache back in the early Zen days, and how using Process Lasso to lock a process to certain cores can sometimes help, it's possible that it's actually beneficial for this particular game to soft-limit it to just 2 cores and 4 threads. The other cores are still there and available, so when the game needs additional resources, it can still use them, but it's less likely to task switch.

First of all, thank you very much for your time to write all this, I appreciate it.

I will give you more info when I am back from work. Right now I can only explain some things regarding custom profiles.

Leaving the CPU on stock, it boosts itself to 4650MHz @ 1.4V, and I REALLY don't like anything above 1.3V; that is too much voltage for my liking, not to mention it overheats the CPU with no benefit (a negligible boost, such as 1FPS for 10°C more, quite literally). That is why I created Profile 2, which has practically the same performance but with A LOT lower temperature. Of course, with a custom profile enabled, the voltage no longer bounces from 0.9 to 1.4 and stays constantly at 1.25, as mentioned in the original post.

Profile 1 was created because I don't need extra voltage and heat when I am browsing the internet or the PC is sitting idle playing music. So I created that lowered one with 0.975V for the idle times, and the gaming profile for self-explanatory reasons.

Since it is for idle and light work, I can put my cores to sleep, no? Was that a mistake? I tested with just 2 and with all 6 cores at 2825MHz, and FPS was the same with or without the "sleeping" cores. If you have any words of advice, I am happy to listen.


14 minutes ago, Jeppes said:

Comparing against stock is the first thing to do, as bad overclocks can hurt performance. The second thing is to test with something that is consistent; some games have varying FPS by design.

Right. Stock settings give me 4650MHz but at 1.4V and that is too much voltage for my liking, especially because my temps go too high. I don't overclock, I just undervolt to get the same performance with less heat and less voltage, hence why I lowered my clock to 4475MHz at 1.25V, and temps are 10°C lower with negligible drop in performance (~1-2 FPS). I used The Witcher 3, Rise of the Tomb Raider and AC Valhalla for testing.


2 minutes ago, JaYZaaR said:

Right. Stock settings give me 4650MHz but at 1.4V and that is too much voltage for my liking, especially because my temps go too high. I don't overclock, I just undervolt to get the same performance with less heat and less voltage, hence why I lowered my clock to 4475MHz at 1.25V, and temps are 10°C lower with negligible drop in performance (~1-2 FPS). I used The Witcher 3, Rise of the Tomb Raider and AC Valhalla for testing.

And limiting clocks to under 3GHz gives you better performance than stock in all games? Under 80C is just fine for a CPU.


14 minutes ago, An0maly_76 said:

I think you're getting my theory backwards. I was referring to the difference in CPU performance when thermal throttling versus NOT thermal throttling (cooler = faster basically). Stay with me here...

 

Vertical synchronization caps FPS to stabilize for low refresh rates.

When vertical synchronization is turned off, your FPS will increase, but the CPU gets loaded more to do it.

Overclocking the CPU makes it faster, but kicks temps up. Therefore, underclocking will make it run cooler.

Cooler temps = better performance, because the processor doesn't have to throttle back for thermals.

 

Therefore, when you underclocked your CPU, that lowered the temp...

Which reduced the need for thermal throttling...

Which enabled the CPU to better handle the load of vertical synchronization.

 

Think of it like this. Vertical synchronization caps your FPS to compensate for displays that can't handle over 60 FPS. BUT...

 

For those displays that CAN handle over 60, you can turn it off, which kicks some of the graphics rendering load over to the CPU. Ergo, you get higher FPS, but it increases your CPU usage. However, a CPU that is overclocked or otherwise already thermal throttling won't be able to help much, because it's trying to protect itself. But when you underclocked, you brought the temp down and gave it some room to do so. More or less, the underclock 'supercharged' the CPU's ability to handle vertical synchronization.

 

Alright, I understand it completely, that was quite detailed, much appreciated. Now, the only question I have left is: at what temp does the CPU start to throttle? Because in the video I had 51°C and it "throttled", dropping 10FPS, compared to 48°C with the lowered clock. Trust me, it is hard to wrap my brain around the fact that 3°C made that much of a difference for an already very cool CPU (I consider 70°C getting hot, but 50 is chill imo) 😵


4 minutes ago, Jeppes said:

And limiting clocks to under 3GHz gives you better performance than stock in all games? Under 80C is just fine for a CPU.

My CPU temps never exceed 70°C; I keep them lower at all times. I understand that clocks depend on temperature; they drop to produce less heat and protect the chip. But since I am already keeping it cool, and in the video the temp is 51°C, I fail to understand why I lose performance. I haven't tested all the games I own, just 3 before I went to work, but I will confirm as soon as I am home.


2 minutes ago, JaYZaaR said:

My CPU temps never exceed 70°C; I keep them lower at all times. I understand that clocks depend on temperature; they drop to produce less heat and protect the chip. But since I am already keeping it cool, and in the video the temp is 51°C, I fail to understand why I lose performance. I haven't tested all the games I own, just 3 before I went to work, but I will confirm as soon as I am home.

Temps are not the problem. It's your testing methods, with 99.9999999999999% probability.


24 minutes ago, JaYZaaR said:

Alright, I understand it completely, that was quite detailed, much appreciated. Now, the only question I have left is: at what temp does the CPU start to throttle? Because in the video I had 51°C and it "throttled", dropping 10FPS, compared to 48°C with the lowered clock. Trust me, it is hard to wrap my brain around the fact that 3°C made that much of a difference for an already very cool CPU (I consider 70°C getting hot, but 50 is chill imo) 😵

I can't offer specifics, but I think this will vary by CPU model, as differences in cores and threads will certainly vary heat, TDP, and average / max safe operating temp. For example, I run a 5900X. AMD is on record as saying the 5950(X), 5900(X), 5800(X) are safe to 90C under full load, and that the 5600(X) is safe to 95C under full load. I've found info from a 5900X test indicating an average idle temp of 40C-45C, averaging 60C under nominal load, peaking under full load around 70C-72C, if memory serves. No info on case / cooling, etc. used.

 

Now, for my build, I used a Corsair RM850x PSU, Corsair 4000X iCUE RGB case (which included three front-mount 120mm fans), Asus Tuf B550 Plus board, Ryzen 5900X, Corsair iCUE Commander Core XT, three additional 120mm fans, Scythe Mugen 5 CPU cooler, Asus KO RTX3060ti 8GB OC, 2x16 Crucial Ballistix DDR4-3200, with a 1TB WD Blue SN570 M.2 primary and WD Black 6TB HDD for secondary. I also run a custom fan curve, ramping 70%-100% between 30C and 80C, controlling RGB color by temp to alert me to any issues. My scheme is blue to 66, yellow 67-77, red 78 and up. And those fans stay blue (66 or less) all day.
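A ramp like that (70% at 30C up to 100% at 80C, clamped outside the range) is just linear interpolation. Here's a toy sketch; `fan_duty` and its parameter names are made up for illustration, not anything iCUE actually exposes:

```python
def fan_duty(temp_c, t_lo=30.0, t_hi=80.0, d_lo=70.0, d_hi=100.0):
    """Duty cycle (%) for a linear fan curve: d_lo at or below t_lo,
    d_hi at or above t_hi, straight-line ramp in between."""
    if temp_c <= t_lo:
        return d_lo
    if temp_c >= t_hi:
        return d_hi
    return d_lo + (temp_c - t_lo) * (d_hi - d_lo) / (t_hi - t_lo)

print(fan_duty(25))  # 70.0  (below the ramp, floor speed)
print(fan_duty(55))  # 85.0  (halfway up the ramp)
print(fan_duty(90))  # 100.0 (above the ramp, full speed)
```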

Result? My temps are a bit cooler than the cited test -- Armory Crate shows 29C-39C idle, 47C-57C under load, peak of 64C. So a bit cooler than published median testing temps I've found, and this thing SCREAMS! Very stable and very consistent, apart from rendering issues when recording 4K video, though I'm learning a few tricks to work around that courtesy of some other members here. Cinebench even rated this machine about 3%-5% faster than a typical 5900X machine. I was amazed. So temps really do make a difference, especially with vertical synchronization.

 

As far as machine-reported temps, try shooting various components with one of those laser-pointer temp guns after testing it on a surface of known temp to gauge accuracy or offset. I found that Asus' Armory Crate reports my actual CPU temp a bit lower than the typical 'package' temp normally reported by the system.


21 minutes ago, Jeppes said:

Temps are not the problem. It's your testing methods, with 99.9999999999999% probability.

Again, define bad testing? I am not overclocking, only undervolting. Profile 2 is still an undervolt and a slight underclock compared to stock out-of-the-box settings. And EVEN STOCK is giving me less performance than Profile 1. Speaking of "bad testing", how can stock out of the box, with no testing and no changes made, be worse than heavily underclocked and undervolted settings?


30 minutes ago, An0maly_76 said:

I can't offer specifics, but I think this will vary by CPU model, as differences in cores and threads will certainly vary heat, TDP, and average / max safe operating temp. For example, I run a 5900X. AMD is on record as saying the 5950(X), 5900(X), 5800(X) are safe to 90C under full load, and that the 5600(X) is safe to 95C under full load. I've found info from a 5900X test indicating an average idle temp of 40C-45C, averaging 60C under nominal load, peaking under full load around 70C-72C, if memory serves. No info on case / cooling, etc. used.

 

Now, for my build, I used a Corsair RM850x PSU, Corsair 4000X iCUE RGB case (which included three front-mount 120mm fans), Asus Tuf B550 Plus board, Ryzen 5900X, Corsair iCUE Commander Core XT, three additional 120mm fans, Scythe Mugen 5 CPU cooler, Asus KO RTX3060ti 8GB OC, 2x16 Crucial Ballistix DDR4-3200, with a 1TB WD Blue SN570 M.2 primary and WD Black 6TB HDD for secondary. I also run a custom fan curve, ramping 70%-100% between 30C and 80C, controlling RGB color by temp to alert me to any issues. My scheme is blue to 66, yellow 67-77, red 78 and up. And those fans stay blue (66 or less) all day.

Result? My temps are a bit cooler than the cited test -- Armory Crate shows 29C-39C idle, 47C-57C under load, peak of 64C. So a bit cooler than published median testing temps I've found, and this thing SCREAMS! Very stable and very consistent, apart from rendering issues when recording 4K video, though I'm learning a few tricks to work around that courtesy of some other members here. Cinebench even rated this machine about 3%-5% faster than a typical 5900X machine. I was amazed. So temps really do make a difference, especially with vertical synchronization.

 

As far as machine-reported temps, try shooting various components with one of those laser-pointer temp guns after testing it on a surface of known temp to gauge accuracy or offset. I found that Asus' Armory Crate reports my actual CPU temp a bit lower than the typical 'package' temp normally reported by the system.

Leaving settings out of the box with no changes made, I get 4650MHz at 1.4V, and that STILL gives me 75FPS compared to the heavily cut profile underclocked to 2825MHz and undervolted to 0.975V.

My mind is full of WAT.


28 minutes ago, JaYZaaR said:

Leaving settings out of the box with no changes made, I get 4650MHz at 1.4V, and that STILL gives me 75FPS compared to the heavily cut profile underclocked to 2825MHz and undervolted to 0.975V.

My mind is full of WAT.

Hey, if you don't want the FPS, send it to me, I can use all I can get when recording 4K. 🤣

 

I did a little poking around and can't find much official, but some mention seeing signs of throttling around 80C with a 5600X. That said, another site claims AMD's official statement is 90C without throttling.

 

I could be wrong, but I call BS, I hardly think AMD would have a processor only begin throttling back at 5C below its max safe temp. Way too close to the danger zone for logic. 🧐

 

Strictly conjecture on my part, but in a perfect world, I would say my 5900X, having a max safe temp of 90C under full load, would start throttling back hard around 75C, but could start around 65-70C. Your 5600X, having a claimed max safe temp of 95C, would probably start throttling back around 70C. Again, strictly conjecture, and I didn't design them, so I'm not in a position to preach this as gospel.

 

I didn't see your cooler type / style mentioned. I run a Scythe Mugen 5 with six 120s, three up front, two up top, one to the rear, ramping 70%-100% between 30C and 80C. In a temp-controlled room at 68F (20C), my idle temps range 29C-39C, nominal load around 46C-57C, have yet to peak over 64C-65C.

 

Fun experiment -- check your in-game settings and see if any mention is made of VSync or Vertical Synchronization -- if so, see if it is enabled or not. It certainly sounds like it is disabled, and enabling will cap your FPS to 60 and put you right back where you were.


34 minutes ago, An0maly_76 said:

Hey, if you don't want the FPS, send it to me, I can use all I can get when recording 4K. 🤣

 

I did a little poking around and can't find much official, but some mention seeing signs of throttling around 80C with a 5600X. That said, another site claims AMD's official statement is 90C without throttling.

 

I could be wrong, but I call BS, I hardly think AMD would have a processor only begin throttling back at 5C below its max safe temp. Way too close to the danger zone for logic. 🧐

 

Strictly conjecture on my part, but in a perfect world, I would say my 5900X, having a max safe temp of 90C under full load, would start throttling back hard around 75C, but could start around 65-70C. Your 5600X, having a claimed max safe temp of 95C, would probably start throttling back around 70C. Again, strictly conjecture, and I didn't design them, so I'm not in a position to preach this as gospel.

 

I didn't see your cooler type / style mentioned. I run a Scythe Mugen 5 with six 120s, three up front, two up top, one to the rear, ramping 70%-100% between 30C and 80C. In a temp-controlled room at 68F (20C), my idle temps range 29C-39C, nominal load around 46C-57C, have yet to peak over 64C-65C.

 

Fun experiment -- check your in-game settings and see if any mention is made of VSync or Vertical Synchronization -- if so, see if it is enabled or not. It certainly sounds like it is disabled, and enabling will likely put you right back where you were.

Yeah, man, as far as I am concerned, I am HAPPY to discover 10FPS from nothing! But now I have to question why I upgraded from a 2600X if I can't even use the full potential of the 5600X. I mean, I am "stuck" at 2.8GHz instead of the 4.8GHz that is its max OC. Did I literally "waste" money on 2GHz that I can't use?

 

I am playing at UWQHD (3440x1440), and the higher the resolution, the less important the CPU is, meaning I would maybe see a big difference at 1080p, but at almost 4K, basically nope.

 

I have Arctic Freezer 34 Esports Duo, 0% until 40°C, 45% at 55°C and 70% at 70°C, but I never even reach 70°C so I just keep it quiet that way.

 

Case is MS Industrial Black Widow with 2 fans at front and 1 at back. Clean straight line with intake and exhaust through the CPU fan.

 

I 100% agree with your idea of what is hot and when the throttle should begin, hence why this thing bugs me out since my temp is 51°C in the video.

 

About V-Sync, I keep it disabled because G-Sync is active. Also, my monitor has a 144Hz refresh rate, so I don't know if it would have an effect, since I can shoot for much higher than 60. Do you recommend disabling G-Sync and playing with V-Sync instead?


10 minutes ago, JaYZaaR said:

Yeah, man, as far as I am concerned, I am HAPPY to discover 10FPS from nothing! But now I have to question why I upgraded from a 2600X if I can't even use the full potential of the 5600X. I mean, I am "stuck" at 2.8GHz instead of the 4.8GHz that is its max OC. Did I literally "waste" money on 2GHz that I can't use?

 

I am playing at UWQHD (3440x1440), and the higher the resolution, the less important the CPU is, meaning I would maybe see a big difference at 1080p, but at almost 4K, basically nope.

 

I have Arctic Freezer 34 Esports Duo, 0% until 40°C, 45% at 55°C and 70% at 70°C, but I never even reach 70°C so I just keep it quiet that way.

 

Case is MS Industrial Black Widow with 2 fans at front and 1 at back. Clean straight line with intake and exhaust through the CPU fan.

 

I 100% agree with your idea of what is hot and when the throttle should begin, hence why this thing bugs me out since my temp is 51°C in the video.

 

About V-Sync, I keep it disabled because G-Sync is active. Also, my monitor has a 144Hz refresh rate, so I don't know if it would have an effect, since I can shoot for much higher than 60. Do you recommend disabling G-Sync and playing with V-Sync instead?

Quiet is nice, don't get me wrong, but I'll take a little light noise for cooler temps and better performance. It doesn't bother me either way because I have a respiratory condition that requires air-conditioning even in the winter, so the A/C makes more noise than any of it. You might try a more aggressive fan curve as a test just to see if you get any further performance / FPS gains.

 

I doubt you'll get much better cooling with that case unless the fans are tiny and you can go bigger, which doesn't appear to be the case -- those look like 120s up front. My system doesn't top 63 with three of the six 120s I have shut down.

 

About V-sync, no, just the opposite. Leave it disabled. It loads your CPU more, but increases FPS. I would, however, recommend you rework your fan control profile to be a bit more aggressive, at least as a test. You might just be able to get the extra performance you paid for...


30 minutes ago, An0maly_76 said:

Quiet is nice, don't get me wrong, but I'll take a little light noise for cooler temps and better performance. It doesn't bother me either way because I have a respiratory condition that requires air-conditioning even in the winter, so the A/C makes more noise than any of it. You might try a more aggressive fan curve as a test just to see if you get any further performance / FPS gains.

 

I doubt you'll get much better cooling with that case unless the fans are tiny and you can go bigger, which doesn't appear to be the case -- those look like 120s up front. My system doesn't top 63 with three of the six 120s I have shut down.

 

About V-sync, no, just the opposite. Leave it disabled. It loads your CPU more, but increases FPS. I would, however, recommend you rework your fan control profile to be a bit more aggressive, at least as a test. You might just be able to get the extra performance you paid for...

Top priority is low temperature and that stands above all - but my temps are already low. I will however try out all fans at max speed, 2x120 at front, 1x120 at back and 2x120 from CPU cooler, and see if that helps with this or changes anything for that matter. That won't be for another 6 hours until I get back home, but I will keep you updated.


I truly wasn't expecting mine to run as cool as it does. Knowing of the 90C max safe temp, I figured on a median of around 60-65.


2 hours ago, An0maly_76 said:

which kicks some of the graphics rendering load over to the CPU. Ergo, you get higher FPS, but it increases your CPU usage.

This is false; the CPU is not fast enough at graphical calculations for offloading some of the work to be worth it. What is actually happening is that the CPU has to work harder to supply the GPU with things to work on if the GPU is chewing through those things faster. This is the same reason a CPU can bottleneck a GPU: a CPU that is too slow cannot send information fast enough to keep up with the GPU.
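The point that the CPU feeds the GPU rather than rendering frames itself can be captured in a toy model (the millisecond figures below are illustrative, not measurements): frame time is set by whichever stage is slower.

```python
def fps(cpu_ms, gpu_ms):
    """Toy pipeline model: each frame must be both prepared by the CPU
    (cpu_ms) and rendered by the GPU (gpu_ms); the slower stage wins."""
    return 1000.0 / max(cpu_ms, gpu_ms)

# GPU-bound: speeding up the CPU changes nothing.
print(fps(5, 12))   # ~83.3 FPS
print(fps(8, 12))   # ~83.3 FPS, identical
# CPU-bound: the GPU waits for work, so its utilization drops below 100%,
# matching the reduced GPU usage seen in the video.
print(fps(16, 12))  # 62.5 FPS
```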

