Majestic

Member
  • Posts

    8,232
  • Joined

  • Last visited

Reputation Activity

  1. Agree
    Majestic got a reaction from DJ46 in Why can't Motherboard manufacturers get Fancontrol right?   
    CPU temperature is a terrible datapoint to use for fan speeds, especially today. Packages get smaller, which means the temperatures become more erratic.
    Hence you need to create a TDP-based model from multiple factors (inside-case temp, GPU temp, CPU temp), average it, and put a hysteresis/delta on it to prevent it from spiking. A minimal sketch of the idea is below.
     
    This is something you can do in SpeedFan, and something you can't do using Fan Xpert or the BIOS software. Did I mention SpeedFan is free?
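     
    To make that concrete, here is a minimal sketch of the idea in Python: blend several temperatures into one fan target and only react once a reading moves past a hysteresis band. The weights, temperature mapping and function names are made up for illustration; this is not SpeedFan's actual logic.

```python
# Minimal sketch: one fan duty target derived from several weighted sensors,
# plus a hysteresis band so small temperature wiggles don't change the speed.
# All weights and breakpoints here are illustrative, not SpeedFan's values.

def fan_target(cpu_c, gpu_c, case_c, weights=(0.5, 0.3, 0.2)):
    """Blend several temperatures into one 0-100% fan duty estimate."""
    blended = weights[0] * cpu_c + weights[1] * gpu_c + weights[2] * case_c
    # Map roughly 30 C -> 20% duty and 80 C -> 100% duty, clamped to that range.
    duty = 20 + (blended - 30) * (100 - 20) / (80 - 30)
    return max(20.0, min(100.0, duty))

def apply_hysteresis(prev_temp_c, new_temp_c, delta_c=3.0):
    """Only accept the new reading once it has moved more than delta_c degrees."""
    return new_temp_c if abs(new_temp_c - prev_temp_c) >= delta_c else prev_temp_c
```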
  2. Agree
    Majestic got a reaction from DJ46 in Why can't Motherboard manufacturers get Fancontrol right?   
    So I recently upgraded my Haswell system to a new Ryzen platform, using a 2600X and a B450i ROG STRIX motherboard from ASUS. And to my dismay I found out that SpeedFan does not support this chipset or its sensor data.
    I've been using that utility for years, since it was free and offered a feature set that even paid software couldn't remotely rival. I liked it so much I've sent the creator a few donations over the years.
     
    Now, sadly, I was stuck using the ROG STRIX BIOS for fan control, and oh dear... motherboard partners haven't changed or improved a single damn thing since my previous board, a 2013 Z87i Gaming AC from MSI. To illustrate just how lacking their software is:
     
    Let me list some of the features of SpeedFan 4.52, and then what the ROG Strix (and yes, this includes their AI Suite 3) offers.
    - Set static fan speeds.
    - Set up fan control for each separate fan port, based on whatever sensor data you have available (e.g. set up case fans based on GPU or mainboard temperature, not just the CPU).
    - Set up fan control that takes the MAX or SUM of speeds on the same fan port from multiple sensor streams (e.g. set up case fans based on CPU + mainboard + GPU, whichever is hottest, or combined for cooling based on predicted TDP: 12% for the CPU, 15% for the GPU, 10% for the mainboard = 37%).
    - Set up an accurate 16-point response curve for whichever fan control point you want, including all of the above, with a specific hysteresis per sensor datapoint (so you can set 5 degrees for the CPU and 2 degrees for the GPU on the same fan). Hysteresis is a setting whereby the temperature has to move by a certain delta before the RPM changes, so the fans won't constantly wobble between two datapoints.
     

     
    - Use delta-stepping for control curves, whereby the program follows the curve you've set up via a moving average filter. You can, for example, set it to a 2%/sec delta, meaning that every update interval, instead of directly following the curve set for that specific fan+sensor, it slowly follows it over time. So if your CPU fan curve is set to 30% at 40 degrees and 70% at 80 degrees, and the CPU goes to 70 degrees for 2 seconds, the fan output only changes by 2*2% = 4% max, instead of ramping up and down between 30% and 65% within 2 seconds (see the sketch after this list). Constant ramping is not only annoying to hear in the background, it also wears out fans quickly, as you get precession forces in the bearings, which are their biggest wear factor (other than plain use, of course).
    - Temperature warnings for every datapoint, and a failsafe whereby fans ramp to 100% if something overheats.
    - Does not have arbitrary minimum fan RPM blocks to prevent you from setting it too low and risking overheating something. As in, it gives you the warning at the start but respects your intelligence from that point on. 
    - Change fan control from DC to PWM if applicable, meaning you can revert it back to the BIOS setting, or use a DC or PWM controller (e.g. for setting AIO pump speeds, which are predominantly DC controlled). 
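     
    To make the delta-stepping behaviour concrete, here is a minimal sketch in Python of a slew-rate-limited fan curve, using the example numbers above (30% at 40 degrees, 70% at 80 degrees, 2% per update). This is just the behaviour as described, not SpeedFan's actual implementation.

```python
# Rough sketch of delta-stepping: the fan output chases the curve target by at
# most `max_step` percent per update instead of jumping straight to it.
# Curve points and step size come from the example above; the rest is made up.

CURVE = [(40, 30), (80, 70)]  # (temperature in C, fan duty in %)

def curve_target(temp_c):
    """Linearly interpolate between the two curve points, clamped at the ends."""
    (t0, d0), (t1, d1) = CURVE[0], CURVE[-1]
    if temp_c <= t0:
        return d0
    if temp_c >= t1:
        return d1
    return d0 + (temp_c - t0) * (d1 - d0) / (t1 - t0)

def step_toward(current_duty, temp_c, max_step=2.0):
    """Move at most max_step percent toward the curve target per update."""
    target = curve_target(temp_c)
    delta = max(-max_step, min(max_step, target - current_duty))
    return current_duty + delta

# Example: the CPU spikes to 70 C for two 1-second updates, starting at 30% duty.
duty = 30.0
for _ in range(2):
    duty = step_toward(duty, 70)
print(duty)  # 34.0 -- the fan moved only 2*2% = 4% instead of jumping to the curve
```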
     
    Now let's look at what this $150 motherboard shipped with....

     
    A 3-point, CPU-only barebones response curve that offers no delta stepping, delays or hysteresis in its update cycle. So every time the CPU does anything and its temperature rises (even with the flat section I've added to alleviate this), the fans go with it. An absolutely unacceptable lack of decent software. Instead of turning every motherboard into a Christmas tree, they should ask Alfredo Milani Comparetti from Almico how to properly provide their customers with fan control software.
     
    Cuz...

  3. Funny
    Majestic got a reaction from r2724r16 in Why can't Motherboard manufacturers get Fancontrol right?   
    It is.

    *fans ramp up in the background*
    I've looked at that; it's part of their AI Suite, and it doesn't offer much more than this. Certainly no delta steps or multi-sensor fan curves.
     
  4. Like
    Majestic got a reaction from Constantin in gpu boost not activating in csgo only (rtx 2080)   
    It doesn't require that much GPU compute, so it clocks down to save power.
  5. Funny
  6. Like
    Majestic got a reaction from Cyberspirit in 144Mhz Monitor G-Sync vs Freesync help   
    Anything that is not at the native refresh rate will get some tearing. But compared to 60Hz panels the tear stays on screen far shorter, plus you have a bigger chance of it not tearing at all due to the shorter scan times. 
     
    So yes, you will get tearing, but less frequently, and the tear doesn't stay up as long, so you notice it less. The shorter time between scans also means the delta between the torn images is less severe, meaning the top half and bottom half will not be as far apart. Rough numbers below.
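     
    To put rough numbers on that (a quick back-of-the-envelope calculation, not from the original post):

```python
# How long one scan lasts at each refresh rate -- an upper bound on how long a
# single torn frame can stay on screen before the next scan replaces it.
for hz in (60, 144):
    print(f"{hz} Hz -> one scan every {1000 / hz:.1f} ms")
# 60 Hz -> one scan every 16.7 ms
# 144 Hz -> one scan every 6.9 ms
```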
  7. Informative
    Majestic got a reaction from H1ry09 in Underperforming hardware   
    This is never the start of a productive topic.
  8. Agree
    Majestic got a reaction from Ektelion in Asus Strix RTX 2080 Ti OC random huge fps drops and stutters? what!?   
    No, I think it just runs like a turd on all machines. Just EA not giving a fuck.
     

  9. Like
    Majestic got a reaction from KingBlue72 in Is G-sync worth the price?   
    No, not really. At 144Hz, tbh, it's hard to spot the tearing. And due to the refresh rate there is much less chance of tearing anyway.
    I explain it in a single image in my signature.
     
    Get one with FreeSync, so you at least have the option later down the line. Though every 144Hz display probably has FreeSync. 
  10. Like
    Majestic got a reaction from Slottr in Help for Stuttering on Assassin's creed Odyssey   
    I love how all of these posts have like one OP post, and then just stop responding. Do they make these to troll people or expect some magical easy fix?
  11. Agree
    Majestic got a reaction from Hiya! in 1440p 144hz IPS w/ Freesync vs 1440p 144hz w/ G-Sync   
    My experience is that with 144Hz monitors, tearing is very hard to spot at higher framerates, since the time to refresh is much shorter and thus the time between the segmented frames is shorter. I'll draw something to illustrate shortly.
     
    @huilun02 Fast Sync is not meant to be used when you only barely hit the native refresh rate. It's designed for when you hit well over twice the refresh rate, as at lower framerates, skipping a frame can result in poor frametime consistency.
  12. Like
    Majestic got a reaction from jeffmeyer5295 in HELP! Why is my fps lower now?   
    They probably changed it, yes. Never compare with settings you didn't yourself put in place.
  13. Informative
    Majestic got a reaction from xg32 in Max temp of the gtx 1070 while playing shadow of the tomb raide   
    Yes. If the noise bothers you, try my undervolting topic:
    https://linustechtips.com/main/topic/792666-pascal-gpu-boost-30-topic-something-every-pascal-owner-should-look-at/
     
    Though it's somewhat of an advanced user thing.
     
    while = although
     
    And technically it's not throttling until it drops under the base clock; anything above that is a boost clock. It operates under GPU Boost 3.0, which takes power and temperature into consideration to determine the voltage and frequency used.
  14. Agree
    Majestic got a reaction from Brooksie359 in What CPU should I get for my Upgrade! ( I have a GTX 1080 FE)   
    Not really worth going from a 3930K to a 2600 with such slow memory and a cheap board. Save up for at least a 2700X with 3GHz+ RAM. 
  15. Agree
    Majestic got a reaction from EarthWormJM2 in What CPU should I get for my Upgrade! ( I have a GTX 1080 FE)   
    Not really worth going from a 3930K to a 2600 with such slow memory and a cheap board. Save up for at least a 2700X with 3GHz+ RAM. 
  16. Informative
    Majestic got a reaction from bleedblue in Poor gaming performance on good PC   
    Not necessarily. Lowering settings also lowers things like physics interactions with objects, the amount of physics particles on screen, the geometry complexity of the scene/objects, and draw distance (and therefore also geometry). Yes, you run the risk of lowering GPU load and therefore shifting the distribution towards the CPU, but lowering settings does not increase CPU load, quite the opposite. It's one of those memes/myths of PC gaming that is impossible to get rid of, so I'm not necessarily blaming you for it.
     
    Check my topic on the subject here, post the results in your start post and tag me afterwards:
    https://linustechtips.com/main/topic/894384-stutters-framedrops-lag-how-to-provide-detailed-information-with-your-bottleneck-question-as-well-as-a-few-solutions/
     
  17. Like
    Majestic got a reaction from r2724r16 in Stutters, framedrops, "lag". How to provide detailed information with your "bottleneck" question, as well as a few solutions.   
    Intro
     
    As I see this question a lot, and the general way of solving the problem is time-consuming, a lot of members flat out ignore these types of questions, much to the, no doubt, dismay of the victims of said microstutter.
    I'm making this thread for myself to put in my signature, but feel free to paste it whenever someone is having these particular issues. Also feel free to ignore this in "newly posted" if it doesn't apply to you. 
     
    Terminology.
     
    Let's set something straight first, and that is the terminology used. A lot of people conflate "lag", "stutter" and "microstutter". Lag, or latency, is terminology usually reserved (in IT) for network-related issues, which I'm not going to delve into in this topic. I will only refer to "input latency", so if that is what you mean by "lag", as in the perceived delay from input to output, refer to it as such. I will refer to "stutter" as engine halts or noticeable frametime spikes in Afterburner (more on that later), and "microstutter" as mostly undetectable frametime/framepacing inconsistency usually only picked up by FCAT. Something like a hardcoded double-buffer v-sync that outputs 16.67ms - 33.33ms - 16.67ms - 33.33ms frametimes like the new Deus Ex: you will see a stable 60fps, but the image appears jittery.
     
    How GPUs output frames.
     
    This is important to understand when you're experiencing stutter or microstutter. Most of the time it's v-sync related. Without delving too deep into the specifics of v-sync, here is the gist of it, from the perspective of a static refresh rate panel at 60Hz. If you want a really in-depth explanation, I suggest reading this excellent article from AnandTech:
     
    https://www.anandtech.com/show/2794
     
    The short version is that the GPU writes drawn frames into the framebuffer (a portion of the VRAM) that is currently designated to output to the screen. The monitor 'scans' images at 60Hz, meaning every 16.67 milliseconds it scans a new frame from the GPU's buffer. This all works well as long as the GPU can output a new frame within 16.67ms. When the GPU cannot maintain 60fps, however, and starts outputting some frames later than this window, in traditional v-sync operation the framebuffer will not swap and the previously drawn frame will remain in the front buffer. This means the monitor will scan the same image again, and as a result you will be shown the same image twice (or even three times, depending on how long the GPU takes to render the new frame). This visually appears as a microstutter or stutter to the player, because the image is frozen for 33.33ms or even 50ms. 
     

     
    As you can see, frames 1 and 3 are drawn within the window, but frame 2 is drawn outside of it. The monitor will scan the same image twice, creating a stutter. As v-sync is usually turned on by default, most players who experience stuttering are most likely just experiencing double scans. The way to solve this is twofold: either turn off v-sync or invest in an adaptive refresh rate monitor. Turning off v-sync will get rid of the stuttering, but it will leave you with so-called tears across the screen. Because the GPU no longer synchronizes its output with the scan times of the monitor, it will swap the front and back buffer whenever a frame is completed, regardless of whether the monitor is currently scanning an image, resulting in the displayed image consisting of 2 or more frames. 
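     
    As a rough illustration of the double-scan behaviour described above, here is a toy simulation in Python. The frametimes are made up, and this is a simplified model of the behaviour as described, not how a driver or display actually works internally.

```python
# Toy model: a 60Hz monitor scans every 16.67 ms; with v-sync, a frame that is
# not finished by the next scan means the previous frame gets shown again.

SCAN_MS = 1000 / 60            # 16.67 ms per scan at 60Hz
frametimes = [14, 25, 15]      # made-up render times for frames 1, 2, 3 (ms)

# Work out when each frame finishes rendering.
ready_times, t = [], 0.0
for ft in frametimes:
    t += ft
    ready_times.append(t)      # frame 1 at 14 ms, frame 2 at 39 ms, frame 3 at 54 ms

scan_time, shown = 0.0, 0      # 'shown' = newest frame finished before this scan
for scan in range(1, 5):
    scan_time += SCAN_MS
    while shown < len(ready_times) and ready_times[shown] <= scan_time:
        shown += 1
    print(f"scan {scan} at {scan_time:5.1f} ms shows frame {shown}")
# Frame 2 misses the 33.33 ms window, so frame 1 is shown on scans 1 and 2: a double scan.
```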
     

     
    Note that disabling v-sync will not make the screen appear more fluid, as the scan times and frame draws are unsynchronized. It should, however, remove the double scans and result in less severe stuttering. Some other solutions exist, but those are constantly in flux/development, and once you've established it's a synchronization issue the topic can be dealt with swiftly and specifically (as the solution depends on the application, adaptive/fast sync, etc.). If this didn't solve the issue, something else is wrong, and the members here are going to need information to determine the problem.
     
    How to fix this?
     
    You can disable v-sync in the in-game options, and you can also force v-sync on/off in your graphics manufacturer's control panel under the 3D graphics settings. Try using the options provided by the graphics driver instead. What usually works well is Adaptive V-Sync for Nvidia (or Enhanced Sync for AMD), or, when the game runs at a really high framerate, something like Fast Sync. Make sure the game is set to "fullscreen" though; they won't work in windowed or borderless modes.
     
    Online games disclaimer: when determining where the problem lies, make sure you also test offline games. In online games, your dodgy internet could be causing (actual) lag that can be misinterpreted as stutters. 
     
    Collecting the information.
    First, start by collecting your system information if you have not done so already. You can do this by pressing Start and typing in DXDIAG.exe. It will collect your system information, which you can store like this:
     
     
    Now open the .txt file, press CTRL-A and CTRL-C, and paste it in the thread.
     
    When pasting this information into the thread, make sure you put [ spoiler] [ /spoiler] around it (without the spaces), or select it and press the "spoiler" button (eye icon) in the editor, as it will otherwise quickly flood the page.
     
    As system halts or game engine halts can have a plethora of causes, a selection of data is required to determine which one applies. For this example I'm going to be using MSI Afterburner, downloadable here:

    http://download.msi.com/uti_exe//vga/MSIAfterburnerSetup.zip
     
    Install both Afterburner and the RivaTuner Statistics Server from the executable and launch MSI Afterburner. It should launch like this:
     
     
    Press the highlighted cog and let's set a few options. I think the new skin is very convoluted, so I'm first going to make it intelligible by going to "user interface" and selecting this option.
     
    To get back to the settings on the new skin: the cog has been replaced by a "settings" button in the bottom-right. 
     
    Go to the monitoring tab and set "hardware polling period" to 100ms. Then, in the active graphs, click the little checkmark in front of:
    - GPU Usage
    - Core Clock
    - Power
    - Memory Clock
    - Memory Usage
    - Framerate
    - Frametime
    - CPU Temperature (not CPU1, CPU2, etc.)
    - CPU1 through CPUn Usage (meaning every thread number you have; don't select the plain "CPU Usage" without a number)
    - CPU Clock Speed
    - RAM Usage
    - Pagefile Usage
    De-select anything besides those listed, and press "OK" to leave the settings.
     
    For AMD Graphics card users:
    In the "Genaral" Tab, in the bottom select the feature "Enable Unified GPU Usage Monitoring"!!
     

     
    Now you can also monitor these in real time in-game by pressing "show in OSD", but I'm writing this topic for the layperson, and so that I can check the data myself instead of relying on a third party to relay the information to me.
     
    Storing the information and posting it.
     
    Now that you have the proper datapoints graphing at a sufficient polling rate, you need to collect the data on the game you're having issues with. Make sure you have already established that v-sync isn't the issue and that any form of v-sync is disabled, either via the game, the control panel, or both. 
     
    You can log the data by right-clicking on the MSI Afterburner graph like so:

     
    Then, before you start the game, hit "clear history" and then click "log history to file". It is now logging to a file. Start the game, play for a few minutes (preferably making sure the issue occurs a few times) and close or alt-tab out of the game. Right-click the graph again and deselect "log history to file" by clicking on it again.
     
    Now go to the MSI Afterburner root folder where you installed it (by default: C:\Program Files (x86)\MSI Afterburner); it should contain a file called "HardwareMonitoring.hml". Just like with the DXDIAG, attach this file to your post. People can then view your issue in detail by opening up this file, which should dramatically speed up the process of finding the issue. If you open it yourself it should look something like this (make sure to press the datapit at the top). 
     
    Thanks for reading. Additional information may be added over time.
     
  18. Agree
    Majestic got a reaction from ARikozuM in Why Ubisoft Games Are Badly Optimized?   
    High quality assets and drawcall ceilings designed for 30fps. Drop the object quality and it'll be ok.
  19. Agree
    Majestic got a reaction from ARikozuM in Why Ubisoft Games Are Badly Optimized?   
    I don't think it's Denuvo; from what I've seen the geometry just goes bananas when you switch it to ultra. And yeah, you'd be hard-pressed to notice the difference between high and ultra when in motion.
  20. Like
    Majestic got a reaction from Paul Rudd in Why Ubisoft Games Are Badly Optimized?   
    High quality assets and drawcall ceilings designed for 30fps. Drop the object quality and it'll be ok.
  21. Agree
    Majestic got a reaction from Paddi01 in Why Ubisoft Games Are Badly Optimized?   
    Yes the levels are quite small and the geometry is very simple. It does most of the work with large textures.
  22. Like
    Majestic got a reaction from Mira Yurizaki in Why Ubisoft Games Are Badly Optimized?   
    High quality assets and drawcall ceilings designed for 30fps. Drop the object quality and it'll be ok.
  23. Agree
    Majestic got a reaction from Domifi in Why is my dark rock 3 cpu cooler so loud?   
    Check your manual my dude, don't let people do it for you. Put it in the CPU header, and check the manual for CPU fan speed settings. 
  24. Agree
    Majestic got a reaction from handymanshandle in Will a Ryzen 7 1700 Bottleneck a GTX 1080 TI?   
    A 1700 will get respectable framerates, as will a 1080 Ti. There is always a bottleneck in a system.
    Linus put out a good video on it, and it's funny that no one watches it. This has to be the #1 most asked question.
  25. Informative
    Majestic got a reaction from kingmustard in Are the VRMs decent on the Gigabyte Z370XP SLI?   
    Don't go only for "expensive". A lot of them use the same VRM design for the entire stack. Just look at some Buildzoid videos (Actually Hardcore Overclocking). 