mrhollygoddamn

Member
Hello and good afternoon. I'm new to the forum, so if I've posted this in the wrong spot I'm sorry; just let me know and I'll move it to the appropriate place. I thought I had a grip on conventional wisdom when it came to understanding clock speeds and CPU/GPU usage and how they correlate to gaming performance, but I've stumbled across something by accident that has thrown me for a loop.

For starters, I have an all-AMD build: currently a Ryzen 7 3800X and a 5700 XT. The CPU is not overclocked 90 percent of the time; I leave the power setting on High Performance (which is the main topic of my confusion), and the CPU will boost to 4.45 GHz on average and stay there. That's nice, as it keeps everything running smooth and fast while gaming or doing my productivity work.

The part that has me confused: last night I was just watching YouTube, so I switched the power setting to Power Saver to let the chip drop to 2.18 GHz and run cooler; it's just YouTube, I don't need full power. I keep monitoring software running full-time on one of my monitors, just for fun, because I like keeping an eye on things. Anyway, I fired up a game today (COD Warfare, in this instance), played a few matches, and it ran smooth as normal, averaging well over my monitor's 144 Hz refresh rate, and I play at 1440p. When I went to check temperatures and see how everything was going, I noticed a few things: the CPU was still running at 2.18 GHz, utilization was at the ~28% I normally see, the chip was running really cool at around 40 °C because of the low clock speed, and my GPU was boosting higher than normal, maintaining around 2100 MHz (overclocked). The confusing part, and the crux of this post, is that if I turn the CPU back up to High Performance, I get essentially no FPS boost, just literally a few extra frames.
So everyone always talks about how the faster the core speed, the more FPS you will achieve, right? Why is it that with my chip running quite literally half as fast, I see no lost performance, with no lag spikes or ill effects of any kind? In this specific case, is there any benefit to running the CPU faster? If I'm losing literally 2 or 3 frames on average, with no dips in FPS (it never went below 150 FPS as a minimum), then what's happening here? I'm currently trying other games which I know are more demanding (like Battlefield, for example) and finding a loss of FPS there, but nothing dramatic; BF lost around 15 to 20 FPS, which is bigger, but still totally playable. Can someone help explain this to me, please? I thought the faster the CPU speed, the better. P.S. The games were run almost maxed out all the way, minus AA, because 1440p. Thanks!
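The behavior described above matches the usual bottleneck intuition, which can be sketched as a toy model (my own illustration, not a benchmark; all the millisecond figures below are made-up assumptions): per-frame time is roughly dominated by whichever of the CPU or GPU takes longer, so once the GPU is the slower part, halving the CPU clock barely moves the FPS.

```python
def fps(cpu_frame_ms: float, gpu_frame_ms: float) -> float:
    """Approximate FPS when CPU and GPU work overlap per frame:
    the frame is gated by the slower of the two."""
    return 1000.0 / max(cpu_frame_ms, gpu_frame_ms)

# Hypothetical numbers: the GPU needs 6 ms per 1440p frame; the CPU
# needs 2 ms at full boost, ~4 ms when its clock is roughly halved.
full_speed = fps(cpu_frame_ms=2.0, gpu_frame_ms=6.0)  # CPU at ~4.45 GHz
half_speed = fps(cpu_frame_ms=4.0, gpu_frame_ms=6.0)  # CPU at ~2.18 GHz

# Both cases land on the same ~167 FPS: the GPU is the limit either way.
print(round(full_speed), round(half_speed))

# A heavier CPU-bound game (say 12 ms of CPU work per frame at low
# clocks) would finally show a real FPS drop, like Battlefield did.
print(round(fps(cpu_frame_ms=12.0, gpu_frame_ms=6.0)))
```

Under this sketch, the few-frame difference in COD and the 15-20 FPS loss in Battlefield are just the two regimes: GPU-bound versus starting to become CPU-bound.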