Search the Community
Showing results for tags 'performance'.
-
"Optimizations for windowed games improves gaming on your PC by using a new presentation model for DirectX 10 and DirectX 11 games that appear in a window or in a borderless window." So for DirectX 12 games, does this setting make no difference?
-
3060 Ti + i5 10400F build, performance problems.
**Status:** Unresolved
**Computer Type:** Custom-built PC
**GPU:** Galax Nvidia 3060 Ti (1-Click OC)
**CPU:** Intel i5 10400F
**Motherboard:** Asus H510M-K
**RAM:** 2x8 GB Crucial 3200MHz, capped to 2666MHz because the CPU doesn't support 3200MHz (16 GB total)
**PSU:** Gigabyte P650B 650W
**Main drive (OS drive):** 250 GB Lexar SSD
**Secondary (games drive):** 500 GB WD HDD
**Operating System & Version:** Windows 10 Pro Version 23H2
**GPU Drivers:** GRD 551
**Description of problems:** Low FPS in almost all games! Stutters making games unplayable, extreme FPS drops, the computer is laggy even when opening files, sometimes crashing ("explorer.exe has stopped responding"), slow in literally everything; GPU usage drops below CPU usage while in games, causing low FPS. Affected games: Doom Eternal, Resident Evil 4 Remake, Fortnite, Elden Ring, GTA Trilogy Remaster. No clear reason: temps are fine (nothing at 80 degrees or higher), no sign of mining running, and I haven't run anything sketchy. There are some other minor problems: I noticed that when moving the mouse the PC makes coil-whine sounds (ONLY when moving the mouse), and weirdly only on the desktop; lowering the mouse polling rate didn't help. GPU clock speeds and usage also vary under load: GPU usage drops below CPU usage, causing massive stutters, and clock speeds fluctuate under load. My speakers and headphones also make electrical noises, a weird high-pitched noise whenever they're plugged in, even when browsing (probably unrelated, ground loop?).
**Tried:** DDU to clean-install drivers, reverting Nvidia drivers to version 537.58 (as recommended by others), resetting Nvidia Control Panel settings, turning off Game Mode, hardware acceleration, GPU scheduling, and Xbox Game Bar, and moving games to the SSD. None worked.
**Haven't tried:** a clean install of Windows. I don't have any spare parts to swap in case a part is broken.
Note that I ran Stable Diffusion (AI art) on this PC, although I kept temps low and used power limits. Could it have damaged the GPU or any other part?
7 replies · Tagged with: performance, gpu (and 3 more)
-
My primary display is starting to die. It works fine, then randomly flickers, goes blank, etc. Before it dies completely I want to get a new monitor. I'm currently running a dual-monitor setup, but I'm looking for some input from those who have used both ultrawide and dual monitors.
1: Can you run an ultrawide like a dual-monitor setup? For example: I've got YouTube, Floatplane, Netflix, or w/e else on the second screen while gaming on the main one. Can I split an ultrawide down the middle (or into thirds) so I can still do this?
2: Performance hit or bonus? I'm running a 4080 Super, but I live in California, where electricity costs are insane. Has anyone noticed a difference in power draw and/or FPS for dual monitors vs an ultrawide while using it in situations like Question 1?
3: If you've gone from dual monitors to an ultrawide, what has your personal experience been? Positives and negatives?
4: If you run an ultrawide setup and have used multiple ultrawide monitors, what is something you've learned to watch out for? Any surprises?
Thanks to anyone who reads and responds!
10 replies · Tagged with: ultrawide, dual monitor (and 3 more)
-
Let's take a look at one method we can use to measure the efficiency of a graphics card (GPU) at various Power-Limits. With electricity costs soaring globally and the need to reduce heat emissions, running Folding@Home can be a delicate balancing act between contributing to a worthwhile cause and keeping your electricity bill low and the temperature in your home at a reasonable level. Modern GPUs, like CPUs, have a non-linear power-efficiency curve: at the upper end of the curve you get diminishing returns in Yield. So our goal is to find the most efficient Power-Level to run a GPU at. We can define Efficiency as the Yield (PPD) at a specific Power-Level (W). For convenience we will use kPPD/W as the unit of Efficiency.
What you will need:
- Folding@Home Advanced Control
- Harlam's Folding Monitor (HFM.NET) (Windows only, or via Wine on Linux)
- nvidia-smi (bundled with the Nvidia drivers on Windows and Linux)
- Excel or Google Sheets
- An hour or two per GPU
The best way of measuring efficiency in Folding@Home, given the variable yields of differing Work Units (WUs), is to run a GPU at a target Power-Level over a period of several days, recording the aggregate Yield of the GPU and dividing it by the Power-Level to obtain the Efficiency at that Power-Level, then adjusting the Power-Level and repeating the measurements. However, a quick indication of a GPU's efficiency can be obtained by observing the changes in Yield (PPD) during a single WU as the Power-Limit is adjusted. Frame Time (TPF) is the time required to complete 1/100th of a WU. In this example we will look at an EVGA RTX 2070 Super XC Hybrid (08G-P4-3178-KR) running project 18202 as the WU. First we need to configure HFM.NET to calculate its estimate of Yield (PPD) using the last 3 Frames as the sampling window. A larger sampling window might provide more accuracy but will take more time to measure.
Select Preferences in the Edit menu in HFM, choose "Last 3 Frames" for "Calculate PPD based on", and click OK. Note that TPF appears to be calculated across all Frames, so PPD will be the better measurement. Select a GPU to profile, taking note of which Slot on which Host it is running. First we need to determine the Minimum and Maximum Power-Limits supported by the GPU. Open a Command Prompt (Windows) or a Terminal window (Linux) and enter nvidia-smi -q to query the capabilities of the GPUs installed in the system:

Power Readings
    Power Management      : Supported
    Power Draw            : 126.81 W
    Power Limit           : 125.00 W
    Default Power Limit   : 215.00 W
    Enforced Power Limit  : 125.00 W
    Min Power Limit       : 125.00 W
    Max Power Limit       : 240.00 W

where:
- Power Limit: the value the Power-Limit is currently set to
- Power Draw: the Power currently being consumed by the GPU
- Default Power Limit: the factory-default Power-Limit
- Min Power Limit: the lowest Power-Limit the GPU supports
- Max Power Limit: the highest Power-Limit the GPU supports

Here we see this GPU has a Minimum Power-Limit of 125W and a Maximum of 240W, so we will want to measure the Yields between these two Limits. We will use 25W as the step size and record Yields at 125, 150, 175, 200, 225 and 240 Watts. Next, open the Folding@Home Advanced Control application from the Task Bar. Select the system with the GPU under test, click on the "Log" tab to view the log, check the "Filter" option and select the appropriate "Slot" from the drop-down list. Here we can see that this WU Checkpoints every two Frames. We want a consistent sampling window with the same number of Checkpoints, as the Checkpoint process adds a slight delay, reducing the Yield. In this case we choose to record the Yield after an odd percentage has completed, every 6th percentage, as we want a sampling interval (6 Frames) wider than that used for the Yield estimate (3 Frames) but with a consistent number of Checkpoints (3). It is important that we measure the actual Power Draw rather than the set Power-Limit, as at the lower and upper bounds the GPU may have trouble enforcing the Power-Limit.
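If you would rather pull those limits out programmatically than read them off the screen, here is a minimal sketch that parses the plain-text layout shown above (the exact wording of nvidia-smi's output can vary between driver versions, so treat the patterns as an assumption):

```python
import re

# Sample `nvidia-smi -q` output, as shown above.
SAMPLE = """\
    Power Readings
        Power Management      : Supported
        Power Draw            : 126.81 W
        Power Limit           : 125.00 W
        Default Power Limit   : 215.00 W
        Enforced Power Limit  : 125.00 W
        Min Power Limit       : 125.00 W
        Max Power Limit       : 240.00 W
"""

def parse_power_limits(text: str) -> dict:
    """Extract the power fields (in watts) from `nvidia-smi -q` output."""
    limits = {}
    for key in ("Power Draw", "Power Limit", "Default Power Limit",
                "Min Power Limit", "Max Power Limit"):
        # Anchor at line start so "Power Limit" doesn't match
        # "Default Power Limit" or "Enforced Power Limit".
        m = re.search(rf"^\s*{key}\s*:\s*([\d.]+) W", text, re.MULTILINE)
        if m:
            limits[key] = float(m.group(1))
    return limits

limits = parse_power_limits(SAMPLE)
lo, hi = int(limits["Min Power Limit"]), int(limits["Max Power Limit"])
set_points = list(range(lo, hi, 25)) + [hi]  # 25W steps, plus the maximum
print(set_points)
```

With the sample output above this yields the same set-points used in the article: 125, 150, 175, 200, 225 and 240 Watts.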
Wait until the WU is 5-10% complete before starting measurements. In our Command Prompt (Windows) or Terminal (Linux) enter:

nvidia-smi -i 0 -l 1 --format=csv,noheader --query-gpu=temperature.gpu,power.draw,clocks.gr,fan.speed

which will query GPU 0 (-i 0) on this system and display the GPU Temperature, Power Draw, Graphics Clock Speed and Fan Speed once a second. While the sampling window for the currently set Power-Limit is in progress we will use this to estimate the Power Draw during the sampling window. In the above example with a 125W Power-Limit we see that the GPU appears to be averaging around the set value of 125W. Next we create a spreadsheet to record our values. The first column is our "Set" Power-Limit; the second our observed Power Draw; the third the percentage measurement point; the fourth the TPF in seconds from HFM; the fifth the Yield from HFM; and the sixth the calculated Efficiency in kPPD/W (=E/B/1000, i.e. the Yield in column E divided by the observed Power Draw in column B, divided by 1000). In a second Administrator Command Prompt (Windows) or Terminal (Linux), set the GPU to the lowest Power-Limit at the end of a Frame:

nvidia-smi -i <GPU#> -pl <Min. Power>

In this instance I used:

nvidia-smi -i 0 -pl 125

Watch the nvidia-smi window during the sampling interval and record the estimate of the Power Draw. Populate the Command Prompt or Terminal with the next set-point in preparation for when the current sampling window ends. As soon as the current sampling period finishes (watch the Log in Advanced Control), change to the next set-point (nvidia-smi -i <X> -pl <Y>) and record the TPF and PPD estimates from HFM for the previous sampling window. It helps to record the TPF and PPD values a couple of times later in the sampling interval, as they should be fairly stable after 3-5 Frames have completed, and it will give you a good estimate of the final values.
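Rather than eyeballing the once-a-second readout, you can average the Power Draw over the sampling window yourself. A small sketch that parses the CSV lines nvidia-smi emits with the query above (the field order matches the --query-gpu list; the sample lines are made-up illustrative readings, not the article's measurements):

```python
def mean_power_draw(csv_lines: list[str]) -> float:
    """Average the power.draw field (2nd column) from lines produced by
    `nvidia-smi -l 1 --format=csv,noheader
     --query-gpu=temperature.gpu,power.draw,clocks.gr,fan.speed`."""
    draws = []
    for line in csv_lines:
        fields = [f.strip() for f in line.split(",")]
        draws.append(float(fields[1].split()[0]))  # "124.93 W" -> 124.93
    return sum(draws) / len(draws)

# Illustrative placeholder readings (temp, power, clock, fan).
sample = [
    "62, 124.93 W, 1605 MHz, 48 %",
    "63, 125.10 W, 1590 MHz, 48 %",
    "63, 124.97 W, 1605 MHz, 49 %",
]
print(f"{mean_power_draw(sample):.2f} W")
```

Pipe the nvidia-smi output to a file for the duration of the sampling window and feed its lines to this function to get the observed Power Draw for the spreadsheet's second column.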
As HFM calculates the Yield (PPD) based on the last 3 Frames and our sampling window is 6 Frames, you do not have to be super accurate about how soon after the Frame completion you change to the next set-point. Here are the final values. The values seemed inconsistent after the 175W set-point (completed 15:02), so I took measurements adjusting the Power-Limit down from the Maximum for comparison. Perhaps the calculations performed on the WU around this point became more complicated? Here is the smoothed (5-minute average values for PPD and Power) efficiency for this GPU over the initial test run from my Zabbix server for comparison. I then calculated the average Efficiency over the two measurements for each of the set-points. We can then create a scatter graph of the data, including a trend line, and display the confidence or "fit" of the trend line (R^2 value). For this WU on this GPU we see that Efficiency is highest at the lowest Power-Limit and gets progressively worse as the Power-Limit is increased. To put it another way, dropping from 225W, which is close to the 215W Default, to the Minimum 125W Limit, we see only a 7.53% decrease in PPD for a 44.4% decrease in Power.
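The Efficiency arithmetic described above is easy to script once the samples are recorded. A minimal sketch that turns (set-point, observed Power Draw, PPD) rows into kPPD/W and picks the most efficient set-point (the PPD figures are made-up placeholders, not the article's measured values):

```python
# (set_point_W, observed_draw_W, ppd) -- illustrative placeholder samples.
samples = [
    (125, 124.8, 1_850_000),
    (150, 149.9, 1_910_000),
    (175, 174.7, 1_950_000),
    (200, 199.5, 1_980_000),
    (225, 224.6, 2_000_000),
]

def efficiency_kppd_per_watt(draw_w: float, ppd: float) -> float:
    """Yield divided by observed power draw, scaled to kPPD/W."""
    return ppd / draw_w / 1000.0

rows = [(sp, efficiency_kppd_per_watt(draw, ppd)) for sp, draw, ppd in samples]
best = max(rows, key=lambda r: r[1])  # highest kPPD/W wins
for sp, eff in rows:
    print(f"{sp:>4} W : {eff:6.2f} kPPD/W")
print("Most efficient set-point:", best[0], "W")
```

With placeholder numbers shaped like the article's curve (small PPD gains for large power increases), the lowest set-point comes out most efficient, matching the conclusion above.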
16 replies · Tagged with: folding, performance (and 4 more)
-
I just built my first PC about a month ago. The building process went perfectly, to my delight (thank you, Linus PC Building POV); however, I've had some performance issues since. The only things brought over from my previous build are my EVGA 750 GQ 80+ Gold and my 3060 Ti. Everything else is completely brand new. Things were working great, except I would get some small stutter in Tony Hawk Pro Skater 1 + 2, where it'd stutter for a second and drop 1 fps to 59. I wanted to fix that and tried a YouTube video's suggestion of increasing the page file to the same size as my RAM and using msconfig to set my CPU cores. Doing this made everything worse. Tony Hawk was constantly stuttering to unplayable amounts. So I set those settings back to defaults, but everything was still messed up. Tried System Restore; it was (initially) successful, but Chrome wouldn't open, so I restarted the PC and it immediately started boot looping to a Critical Process Died BSOD. Couldn't repair or load into Safe Mode, so I reinstalled Windows. I was back up and running, but now all of my games and benchmarks intermittently stutter, BADLY. Yesterday everything worked perfectly (HZD benchmark and playtest: perfect. Shadow of the Tomb Raider benchmark: perfect. Tony Hawk: a few small stutters, but a smooth 60 fps otherwise). Today, every single thing stutters. Even Superposition has hitching. I've tried everything I can think of. My CPU temps never rise above 40C while gaming, and my GPU usually sits around 64-74% max usage. Vids of hitching linked below. Please, any help is appreciated. (Crash Bandicoot is the smoothest, but you can see the FPS still wavers randomly.) I'm also on the latest Nvidia drivers.
System:
CPU: i5 12600K
Motherboard: Asus Prime Z690-A
RAM: 32GB DDR5 4800 MHz
GPU: Zotac 3060 Ti LHR Twin Edge 8GB
Case: NZXT H5 Flow RGB
Storage: Western Digital WDS500G2B0A 500GB SATA SSD, 4TB WD Blue HDD
PSU: EVGA 750 GQ 80+ Gold
Display(s): 22" 75Hz Sceptre, 20" 75Hz Sceptre
Cooling: NZXT T120 CPU cooler, 2x 140mm NZXT RGB front case fans, 120mm NZXT GPU case fan, 120mm NZXT exhaust fan
Keyboard: Coolermaster SK650
Mouse: Razer DeathAdder Essential
Sound: Cyber Acoustics 2.1
Operating System: Windows 10 Pro
-
I have a pretty old laptop with integrated graphics. Which would be the best Linux distro focused on RAW PERFORMANCE? I don't care about looks, simplicity, ease of use, desktop environment, etc. Only pure performance. I know I sound like a newb, but I have used Debian-based distros before and started questioning whether they were the right choice. Please recommend which distro and version would give me maximum performance on my hardware. I just want it to get the job done, no compromises on FPS. I really don't mind spending a bit more time if it means it's more stable and even a bit better. I'm not able to choose a single distro myself. I'd prefer not to change desktop environments after installation due to conflicting packages, so please recommend a specific distro. Thanks in advance.
-
Budget (including currency): 600-700€
Country: Portugal
Games, programs or workloads that it will be used for: League of Legends, Metin2, FIFA, Call of Duty Warzone, CS:GO, NBA 2K
Other details (existing parts lists, whether any peripherals are needed, what you're upgrading from, when you're going to buy, what resolution and refresh rate you want to play at, etc.):
Hey guys, I'm looking to build a high-performance budget desktop, so what I'm considering right now for parts is this:
Processor: Intel Core i5 12400F
GPU: Nvidia RTX 3060 or AMD 6650 XT (still uncertain here, would appreciate your opinions)
RAM: Kingston Fury Beast 16 GB (1 x 16GB) DDR4-3200 CL16 (to later expand to 32GB)
Storage: Kingston A400 960 GB 2.5'' SSD
Motherboard: Still uncertain, would appreciate tips here; I'm looking to save as much as possible on the board without compromising performance.
Power supply: Also nothing expensive, just reliable and enough to power the machine.
Case: Also don't know yet, but not looking to spend more than around 50€; just some ATX mid tower that is stylish, has good airflow, and is well built for the price.
Would really appreciate your opinions where I have doubts, but also feel free to comment if there is anything you think I could improve. Thank you!
-
Everyone hates bottlenecks. It’s a common question on our forums to ask whether a part will be a bottleneck… But what does a bottleneck even look like, and how can you avoid one?
Buy a Core i7 8700K: On Amazon: http://geni.us/JlcIN / On Newegg: http://geni.us/RLfD
Buy a Ryzen 3 2200G: On Amazon: http://geni.us/jaZX9RZ / On Newegg: http://geni.us/1vxxfGB
Buy a GeForce GT 1030: On Amazon: http://geni.us/RUD3m / On Newegg: http://geni.us/mSAqGt
Buy a GeForce GTX 1060 6GB: On Amazon: http://geni.us/9iVpPWq / On Newegg: http://geni.us/R10t
288 replies · Tagged with: bottleneck, performance (and 4 more)
-
Hello, I've been a theoretical PC enthusiast for a while, but I recently finally decided to build my own PC.
Specs:
CPU: Intel Core i7-13700K 3.4 GHz 16-Core Processor (with a Thermalright contact frame)
CPU Cooler: Thermalright Phantom Spirit 120 SE ARGB 66.17 CFM CPU Cooler
Motherboard: ASRock Z690 PG Velocita ATX LGA1700 Motherboard
Memory: G.Skill Ripjaws S5 64 GB (2 x 32 GB) DDR5-6000 CL30 Memory
Storage: Western Digital Black SN850X 1 TB M.2-2280 PCIe 4.0 X4 NVMe Solid State Drive
Storage: Western Digital Black SN850X 2 TB M.2-2280 PCIe 4.0 X4 NVMe Solid State Drive
Video Card: PNY RTX A-Series RTX A4500 20 GB Video Card
Case: Antec DP503 ATX Mid Tower Case
Power Supply: Thermaltake Toughpower GF3 TT Premium 850 W 80+ Gold Certified Fully Modular ATX Power Supply
Case Fan: ARCTIC P12 PST 56.3 CFM 120 mm Fans 5-Pack
OS: Windows 11 Pro
Thermals are stable. After some testing I had to set the RAM to 5600 CL28, which solved my BSOD problems and passed MemTest86 and Windows Memory Diagnostics.
The PROBLEM: One of the main purposes of the build is to work in SolidWorks. Unfortunately, after SolidWorks loads up, simple activities on a blank file are laggy (especially smart dimensions). This is very strange, since the problem occurs even with a simple rectangle. I have a ThinkPad P53s which is far inferior on paper, but it is fast enough for small to medium projects. The VAR and the SolidWorks forums have not been very helpful so far. Any SolidWorks users out there who may have faced something similar? Please help.
4 replies · Tagged with: solidworks, performance (and 2 more)
-
Hello, I have a rather weird problem where my PC loses performance after waking from sleep. It all started after a BIOS update.
Components:
Gigabyte B650 Gaming X AX (rev. 1.2)
Ryzen 5 7600X
RTX 4070
Corsair RM750e
G.SKILL 32GB kit DDR5 6000MHz CL30 Trident Z5 NEO, AMD EXPO (CL30-38-38-96, 1.35V, EXPO1)
ARCTIC Freezer 34 eSports DUO White
Samsung 970 EVO Plus 250 GB - boot drive (occupying slot M.2A)
Kingston NV2 2 TB (occupying slot M.2B)
NZXT H5 Flow with RGB fans
Windows 11 Pro (10.0.22621 Build 22621)
The BIOS was updated from version F9d to F20 using the Gigabyte Control Center, then manually flashed to F21, which did not resolve the issue. https://www.gigabyte.com/Motherboard/B650-GAMING-X-AX-rev-10-11-12/support#support-dl-bios is the site I used to download the BIOS; maybe it's related to the new AGESA version? Does anybody have an idea what the problem is? Thank you in advance for any helpful information.
-
I have a 13900K, a Zotac RTX 4090 Trinity OC, 128GB DDR5. My system is about to turn 1, and never had any issues with it before. Two days ago, after updating the Nvidia driver to the latest version, I noticed my framerate tanked in Forza Horizon 5. I'm talking going from a never wavering 60fps down to 42fps or lower. Restarting the game or Windows didn't do anything to fix it. I reinstalled the game -- didn't work. Installed the Microsoft Store version and it did work for a while, until I fast travelled and the game started having the same issue. Reinstalled it there -- didn't work. Tried other games I knew I never had any issues like that before; same thing. Looking for possible culprits, I started monitoring temps but didn't see anything out of the ordinary. GPU didn't go above 60C, and most of the time it spent around the 40-45C range. Nothing weird on the CPU front, but I did notice it running 5.8Ghz locked at all times, even when the PC was just sitting idle. Went into the BIOS and noticed that I had the Asus optimizations enabled, so I turned it off and set it to enforce all Intel limits. But it did nothing to fix the issue. So I figured I'd format my SSD and do a clean Windows 11 install, but the issue still persisted. I proceeded to test different driver versions within the 546 branch, and all of them seemed busted. So I downgraded to a different branch altogether and it did fix the issue. Played all evening last night with no issues at all. It's Sunday morning now and I just turned on my computer, and the issue is back and it's exactly the same. Weird and sustained frame drops with frame pacing all over the place. I'm running the 1.2 VBIOS version for my GPU (it came with this version out of the box), and as far as I know there isn't a newer version available. But I did notice my CPU is still running clocks higher than it needs to. I only have the browser open right as I type this message, and it's going at 5.7Ghz for no reason. 
I don't have any manual overclocking (unless it's a stock overclock I'm unaware of). I'm ripping my hair out trying to figure out what the hell is going on, so if anyone had anything similar happening to them or have any clue where to look, that'd be greatly appreciated.
-
It's time we had a discussion... So many people are buying extremely high-end components that are simply unnecessary and overkill. I figured I would create a guide to help people decide what hardware is best for their use case. How many frames per second do I need? What settings should I run? What resolution should I play at? Let's talk about all of these things today...
FPS: Let's start by talking about refresh rate and frames per second (FPS). Linus showed a while back, in a video called "Does High FPS Make You a Better Gamer", that for the VAST MAJORITY of gamers, frame rates beyond 144 FPS/Hz simply make no noticeable difference. At 144Hz, the smoothness and responsiveness are so good that your keyboard and mouse themselves account for the vast majority of the input lag, and the human body simply cannot detect the lag contributed by FPS at these extremely high frame rates. Even professional e-sports gamers stated in the video that they feel there is basically NO DIFFERENCE WHATSOEVER between 144 and 240 FPS. These professional gamers tend to go for extremely high frame rates anyway to reduce latency in the game's processing, but this guide is aimed at the average gamer, not a professional. These response-time differences are on the level of a few thousandths of a second, and the average gamer will NEVER detect them. In the Linus video, they explored the differences between 60, 144, and 240 FPS/Hz. What they did NOT talk about is at what point we hit diminishing returns. What number before 144 FPS is that magical point that feels incredibly smooth and responsive? Allow me to explain... 60 FPS may seem smooth to watch, and indeed it is, but 60 FPS isn't quite fast enough for fast-paced gaming. This is because of how much time your machine needs to produce the next frame.
The amount of time needed to produce each frame at 60 FPS is long enough that if you move your mouse back and forth quickly, you will notice slight lag; it just won't feel all that responsive. This is why the ideal gaming frame rate is well above 60 FPS. So how far do we need to go? Well, it is somewhat subjective, but there are some general rules of thumb that can help you decide. Even just a bit above 60 FPS, at 75 FPS, things will begin to feel fairly responsive, but if you want the best performance before diminishing returns, there are still gains to be had. At 90 FPS, things will look and feel fantastic, with incredible response. At 90 FPS you will probably be quite satisfied with your experience, but we should reach a bit higher, and here is why... When your machine experiences a huge load in a demanding area of a game, the FPS will drop for a single instant here and there; this is what we know in benchmarking as the 1% lows. The problem with 90 FPS gaming is that these 1% lows can still drop you into the 60s. And when that happens, you will certainly feel it. This is why 90 FPS gaming isn't quite enough to be considered a pretty-much-perfect gaming experience. What you should aim for is 100+ average FPS. Once you cross the triple-digit threshold, your gaming experience will be extremely satisfying at all times, because your 1% lows will stay at or above 75-80 FPS. These 1% lows will be so smooth that you probably won't even notice they happened, so 100+ FPS is the ideal target for best-value gaming. However, as newer games come out and your hardware ages, performance will drop a bit over time. To counter this, note that most PC gaming reviews focus on performance results at Ultra settings. So as your PC ages, you should be able to drop your settings from Ultra to High to keep about the same frame rates in upcoming games for a few years. Keep this in mind for later...
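As a side note, "1% lows" are typically derived from recorded per-frame data: sort the samples, take the slowest 1%, and average that slice. A rough sketch of one common definition (benchmarking tools differ in exactly how they compute this, so treat it as illustrative):

```python
def one_percent_low(fps_samples: list[float]) -> float:
    """Average of the slowest 1% of FPS samples (one common definition)."""
    ordered = sorted(fps_samples)     # slowest first
    n = max(1, len(ordered) // 100)   # slowest 1%, at least one sample
    return sum(ordered[:n]) / n

# 100 samples: ninety-nine at a steady 90 FPS and one dip to 62 FPS.
samples = [90.0] * 99 + [62.0]
print(one_percent_low(samples))  # -> 62.0, the single worst sample
```

This is exactly the scenario described above: a run that averages close to 90 FPS can still have 1% lows down in the 60s, which is what you actually feel as a hitch.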
GAME SETTINGS: So what about settings and resolution? Let's get settings out of the way first - aim for High settings. High settings are a very satisfying experience and the visuals will be stunning. Very High/Ultra settings demand a lot more power for not a whole lot more beauty, so in reality Ultra settings are unnecessary. But if you aim too low and end up at Low or Medium settings, you will definitely notice the difference between those settings and High.
RESOLUTION: Finally - resolution. Resolution is completely subjective. Some people enjoy 1080p, others enjoy 4K. But there are diminishing returns with graphical fidelity, so let's outline those now:
1080p is great for very small monitors. It looks about as sharp as it can get on 20" and smaller screens, and still looks pretty good up to 30" gaming monitors, but most people can see the difference between 1080p and 1440p, so that choice is yours.
1440p is actually the ideal computer monitor resolution in most cases. It looks insanely sharp on pretty much any standard-sized monitor, and most people cannot tell the difference between 1440p and 4K. I personally play at 1440p ultrawide (3440x1440) for that extra side-to-side vision, and it's an incredible experience.
4K is actually pretty useless in more situations than you might think. Studies have suggested that the human eye, even with perfect 20/20 vision, cannot see the difference between 1440p and 4K on any screen up to 50" with your face just 18" away from the screen. Linus even did a video a while back called "4K Gaming is Dumb" that goes over why it's pretty much completely unnecessary.
Choosing hardware based on this information: PC hardware reviews pretty much always focus on performance delivered at Ultra settings. Since Ultra settings are unnecessary, you can use the performance gap between High and Ultra as your performance overhead for upcoming games in the future. And so...
If you want a gaming experience so good that you wouldn't be able to tell if you upgraded further, and you want to use a standard-sized monitor like most PC gamers do, then you want to purchase hardware that is shown to deliver an average of at LEAST 100 FPS at Ultra settings for your chosen resolution, and 1440p is the recommended resolution for standard-sized PC monitors. The hardware you choose will depend on whether you want standard 2560x1440, ultrawide 3440x1440, or whatever other form of 1440p you might want, but if you do your research and keep these rules of thumb in mind, you will end up with an incredible gaming machine without over-spending. Of course everyone will have their opinions; I'm sure plenty of people will say I am wrong. But think about it before commenting. When was the last time you played a game at 90-100 FPS and thought to yourself, "This is a terrible experience, I need to upgrade my PC because this is just unplayable"?
17 replies
-
Hi guys, here for some help and recommendations from you.
Current situation: my PC is a mini-ITX SFF build with these components:
i7-13700K
32GB (2x16) DDR5 6600MHz
Asus ROG Strix B760-I Gaming WiFi
1st NVMe SSD: 500GB 970 EVO (Gen3, 3.5GB/s)
2nd NVMe SSD: 1TB Sabrent Rocket (Gen3, 3.5GB/s)
RTX 3070
These components were chosen because the main use for this PC is working with Adobe programs: Photoshop, Illustrator, Premiere Pro, and After Effects. In Premiere Pro I'll use raw 4K 60fps H.264 and 1080p 60fps H.264 footage, then always export everything as 1080p 60fps (nothing very long; these are just social media reels, stories, etc.). In AE I'll use both 4K and Full HD compositions. I'm planning to add a new 2TB NVMe Gen4 SSD (7GB/s) as a projects/files drive, but...
Questions:
1) As this motherboard has only 2 M.2 slots, is it optimal to keep the 970 as the main OS + programs drive, replace the Sabrent with the new 2TB Gen4 drive, and create a 500GB partition on it as a cache drive? The remaining 1.5TB would be my projects/files drive. (The Sabrent will be sold or just forgotten in a drawer, lol.)
* I've read that a cache drive might occasionally fail. In that case, would just the 500GB partition fail, or the entire 2TB SSD?
2) Using 3 NVMe SSDs: the main OS + programs drive would be the Sabrent, the new 2TB Gen4 drive the projects/files drive, and the 970 the cache drive, BUT in an external SSD enclosure plugged via USB Type-C directly into the mobo on a 10Gb/s port (~1000MB/s). Is this speed enough for a cache drive in my case, for editing 4K and 1080p footage?
3) Is there any actual performance difference between an NVMe Gen3 at 3.5GB/s, an NVMe Gen4 at 7GB/s, and an NVMe running at around 1000MB/s as mentioned above?
Thanks in advance.
-
TLDR: Successfully applied Arctic Silver 5 and Arctic thermal pads to eliminate thermal throttling on an MSI GS65 8RE gaming laptop. I have an MSI GS65 8RE Stealth Thin gaming laptop (i7-8750H, GTX 1060) which has suffered from thermal throttling ever since it was purchased in 2018. It's an old model and there are plenty of reviews, so I won't get into any further details or comments about build quality, difficulty of upgrade and so on. Of course, if anyone reading this has any specific questions, please feel free to ask and I'll be happy to answer. To solve the thermal throttling problem, I first tried the following "non-invasive" measures with varying degrees of success, but the problem was not eliminated; nor was I satisfied having paid for a moderately top-tier hardware package only to limit its capabilities in order to use it for its intended purpose.
- Turning off Hyper-Threading
- Turning off Turbo Boost
- Undervolting
- Turbo Boost frequency limiting
- Package power limiting
During heavy gaming ("heavy" relative to the performance capacity of this particular hardware), such as Jedi: Fallen Order on "Epic" 1080p graphics settings, the CPU would hit 93C with the fans sounding like a 747 during takeoff. HWiNFO showed sustained thermal throttling with the CPU frequency hovering around 2.7GHz, whereas the i7-8750H is rated up to 4.1GHz boost (2.2GHz base). Cinebench R20 benchmark scores were hovering around 2250 (average of 3 consecutive runs). Obviously, I did not expect this 17.9mm-thick laptop to be equipped with a cooling solution that can handle the heat dissipation requirements of an i7-8750H at its specified 90W, 4.1GHz boost power limit and frequency. I would settle for a sustained boost anywhere above 3.0GHz at its Intel-designated TDP-down power limit of 35W, without thermal throttling. I think that's a fair expectation for hardware at this price point.
Having opened the laptop before for SSD and RAM upgrades, I was familiar with its internals. I had also read a lot of users complaining about thermal throttling and claiming they were able to fix it with a re-paste of the heat sink fan rig. So I took the plunge and re-pasted the HSF with Arctic MX-4 thermal compound. Ensuring proper contact upon replacement of the cooling hardware on the motherboard and taking care to not bend or twist any part of it, I closed up the laptop and booted. No improvement. I was disappointed, to say the least, and I began to regret my purchase. That was last year. Fast-forward to the present and I came across a post where a user had not only re-pasted the CPU and GPU but also replaced the thermal pads on the VRAM and other surrounding components. I decided to give it one last shot. I opened it up (this is an achievement by itself, just look up a teardown video), cleaned up the old paste, applied new paste and replaced the thermal pads. This time I used Arctic Silver 5 and Arctic Thermal Pads (1.5mm thick). I also did something that may be frowned upon: I replaced all thermal pads, regardless of stock pad thickness (which varied from 0.5mm to 1mm and 2mm in one area) with 1.5mm and a 1.5+1.5mm thermal pad sandwich for the 2mm. To make sure that the copper parts of the HSF rig were properly making contact with the CPU and GPU, I carefully placed the HSF rig back over the motherboard and held it in place without the screws. Then I meticulously squished each of the thermal pads where they were too thick, causing a gap between the other contact points. Now the HSF was sitting flat on the CPU and GPU. Next I slowly removed the HSF rig ensuring not to let the thermal pads peel off or shift. After applying the thermal compound on the CPU and GPU, I again placed the HSF rig back on and screwed it in place. 
I had extra thermal pad left over so I placed one on top of what looked like the PCH (maybe?), as well as beneath and above the two NVMe SSDs and on both RAM sticks. After closing up carefully ensuring all the cables and screws were securely in place, I booted up and ran Cinebench R20. To my absolute and complete delight, the core temp value reported by HWiNFO did not exceed 84C. By the way, I had a stable undervolt of -155mV already set, which obviously helped. The machine sustained a boost frequency around 3.3Ghz at 35W power limit and resulted in an average Cinebench score of 2460 over three consecutive runs. Other than undervolt and power limit, there are no other performance/heat limiting settings enabled. For good measure, I also ran Prime95 stress test and Unigine Heaven graphics benchmark to check system stability. All checked out. Temperatures did not exceed 84C. I want to clarify that this was in a non-AC room where the ambient was around 24C which is on the warm side of comfortable. Needless to say, thermal performance would be even better in a cooler environment. The Nvidia GTX 1060 hits 90C running Heaven benchmark on Extreme preset but the CPU caps out at 71C. The fans are loud, which is to be expected, but they run noticeably quieter than before. For reference, the max fan speed set in MSI Dragon Center is 65% for CPU and 75% for GPU. I hope this helps anyone else facing similar problems. And as I said earlier, if anyone has any questions, please feel free to ask!
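For anyone wanting to sanity-check their own before/after logs the same way, the numbers above boil down to a quick throttling check. This is a minimal sketch under my own assumptions (the `(temp_C, clock_GHz)` sample format, the function name, and the thresholds are mine, not HWiNFO's export format):

```python
# Hypothetical helper: given (temp_C, clock_GHz) samples you've logged
# yourself, report what fraction of the time the CPU sat at the thermal
# limit while clocking below a target boost frequency.

def throttling_ratio(samples, temp_limit_c=93.0, target_ghz=3.0):
    """Fraction of samples that look thermally throttled: at/above the
    temperature limit while below the target boost clock."""
    throttled = sum(
        1 for temp_c, clock_ghz in samples
        if temp_c >= temp_limit_c and clock_ghz < target_ghz
    )
    return throttled / len(samples)

# Before the re-paste: pinned at 93C around 2.7GHz -> ratio 1.0
print(throttling_ratio([(93, 2.70), (93, 2.68), (93, 2.75)]))
# After: ~84C at ~3.3GHz -> ratio 0.0
print(throttling_ratio([(84, 3.30), (83, 3.31), (84, 3.28)]))
```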
-
Hello, I was just wondering why my PC seems to have such bad performance for my specs. I've done research on where I should be with my exact specs and I'm nowhere close.

My specs:
- Ryzen 5500
- RX 6600 8GB
- 16GB RAM
- 480GB SSD

I'll list some of the games I've tested:
- Battlefield V: around 80 fps on ultra, 80-100 on high
- Rust: 55-90
- Hell Let Loose: 50-70 with a lot of stutters
- CS2: 90 fps on high

To me these fps numbers seem very low compared to the estimates I've seen and been given before. This is also with massive stutter problems in nearly every game I test. Would love some tips on what could be wrong, and if anyone has the same specs, please share some fps numbers!

Regards, Harley
-
Can someone explain why Corsair recommends disabling Fast Boot to make your SSDs run faster? This sounds insane, and they give absolutely no reason: https://help.corsair.com/hc/en-us/articles/360052822291-Ensuring-peak-performance-for-your-MP600-M-2-SSD Corsair is the same company that wrote a whitepaper 15 years ago claiming some DDR2 or DDR3 RAM speeds were 40,000% faster, when the correct figure was 400%. So I don't have much trust in what they're advocating.
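For what it's worth, the 40,000%-vs-400% mix-up looks like the classic factor-versus-percent error: a 5x speedup is a 400% increase, and converting the ratio to a percentage a second time inflates it a hundredfold. A quick sketch with illustrative numbers (these are not Corsair's actual figures):

```python
# Illustrative numbers only; this isn't the actual whitepaper math.

def percent_increase(old, new):
    """Percentage increase from old to new."""
    return (new - old) / old * 100

print(percent_increase(100, 500))        # 400.0 -- a 5x speedup is +400%
print(percent_increase(100, 500) * 100)  # 40000.0 -- the same increase
                                         # converted to percent a second time
```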
-
My friend's build had a 1050 Ti with a Ryzen 5 3400G before this; now he's upgraded to a 4070. The BIOS had to be updated before the install, so he tells me (he went to a local PC shop to get it installed). We're incredibly disappointed by the performance and were wondering if any of you could help. We suspect the RAM is the issue due to its low speed, but can't be sure if the GPU really is this bottlenecked by the CPU.

Things we've tried:
- Clean driver installs
- Enabling Hardware-accelerated GPU scheduling in Windows
- Updating Windows
- Reinstalling all the games and Nvidia programs

At 1080p (both Cyberpunk runs used a mix of mostly medium and high settings with DLSS Performance mode on):
- Cyberpunk 2077, Frame Gen on, RT off: in-game benchmark averages 84 fps, minimum 25, max 141 (though in actual gameplay we saw closer to 30-40 in the open world and around 70-90 indoors)
- Cyberpunk 2077, Frame Gen off, RT off: in-game benchmark averages 51 fps, minimum 13, max 85 (again, actual gameplay felt significantly lower than the benchmark)
- Riders Republic, ultra settings with AA at medium: in-game benchmark averages 43 fps, minimum 7, max 60
- Fortnite at its Epic preset was very stuttery, jumping between 30 and 60 fps, and took ages to load into a match
- Apex Legends at max settings, triple-buffered vsync, Reflex+Boost enabled, TSAA enabled: basically locked at 60
- Spider-Man: Miles Morales was a complete mess, barely climbing out of the 30s at all-high settings with RT off

In all of those examples the CPU was sitting around 100% at all times while the GPU was chilling around 30-50% utilisation. The screenshot was taken before we enabled XMP. The PSU is a Corsair CV650.

So we're looking for your help to explain this odd behaviour, or to tell us whether it's to be expected with these specs.
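That utilisation pattern (CPU pinned near 100% while the GPU idles at 30-50%) is the textbook CPU-bottleneck signature. As a rough sketch of the heuristic, assuming per-second utilisation samples you've captured yourself (the function name and thresholds are made up for illustration, not from any monitoring tool):

```python
# Rough heuristic only, not a diagnostic tool.

def likely_bottleneck(cpu_util, gpu_util, cpu_pegged=95, gpu_starved=60):
    """Label a capture CPU-bound if, in most samples, the CPU is pegged
    while the GPU sits well below full utilisation."""
    cpu_bound = sum(
        c >= cpu_pegged and g <= gpu_starved
        for c, g in zip(cpu_util, gpu_util)
    )
    return "cpu-bound" if cpu_bound / len(cpu_util) > 0.5 else "unclear"

# CPU ~100% with the GPU at 30-50%, as described above:
print(likely_bottleneck([100, 99, 100, 98], [35, 50, 42, 30]))  # cpu-bound
```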
-
As it says in the title. I got a new 4TB NVMe drive on Black Friday (this one, specifically: Predator GM7000 M.2 up to 2TB storage and 7400 MB/s read speed (predatorstorage.com)), and the first benchmark was in line with my 970 EVO. After getting my OS and files migrated and using it for about a week, performance dropped off a cliff. Here's the before and after in Samsung Magician: Is there anything I should know or try before returning this drive and getting a replacement? Am I misunderstanding SSDs, or is this just a sucky one? I thought I did my research, and I made sure to get a drive with a DRAM cache (this one has 4 GB). I know it's TLC, but I thought that impacted lifespan way more than immediate performance, and I got a really, really good deal, so I wasn't worried if it only lasted several years instead of 5+.
-
Budget (including currency): ~2000-2300€ Country: Germany Games, programs or workloads that it will be used for: Cyberpunk, Wildheart, Lies of P, Starfield etc., and I've been trying out streaming on Twitch the past months Other details: I currently have a Ryzen 7 2700 and a 2060 Super with 6GB VRAM, and a 1440p monitor is present. Hi guys, I've been using a pre-built for years now, and since it's time for an upgrade I wanted to take it on myself this time. Can anyone tell me if this is a good setup, what you would recommend swapping, or what might even be overkill? Link: https://pcpartpicker.com/list/jc2hKX (I didn't add any fans since the case should come included with 4, as stated on their site?) Thanks in advance!
-
Budget (including currency): 1200 USD Country: USA Games, programs or workloads that it will be used for: Fortnite performance mode Other details (existing parts lists, whether any peripherals are needed, what you're upgrading from, when you're going to buy, what resolution and refresh rate you want to play at, etc.): I need a monitor as well, 240Hz minimum, which will be included in the budget. I made this part list; please tell me how good it is and whether I should stick with it: https://pcpartpicker.com/list/XRZbwg
-
So I moved across the country and my old PC is back home. It's all well and good, but I let my dad take it for work, so I'm building a new one, and the only questions I need answered are these:

1. Does stickerbombing my PC case cause any issues with heat or performance, or anything else long term? I will not cover any vent holes, just the flat case panels.
2. Can someone help me find a good case with at least a window to see my stuff (I don't need a whole wall of glass) and general space for maybe 200+ stickers?

Long post, sorry, but thanks!
-
Would a Gen 4x4 NVMe M.2 drive work faster than a native Gen 3x4 NVMe M.2 drive in a Gen 3x4 socket? (Taking into account that they are designed to transfer data at speeds above the 3.94GB/s of Gen 3x4.)
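Whichever drive you use, sequential throughput is capped by the slot itself: the 3.94GB/s figure comes straight from the PCIe 3.0 link math (4 lanes × 8 GT/s with 128b/130b encoding). A back-of-the-envelope sketch, ignoring packet/protocol overhead, so real drives land slightly lower:

```python
# Raw usable bandwidth of a PCIe link in GB/s (decimal), ignoring
# packet/protocol overhead. PCIe 3.0+ uses 128b/130b line encoding.

def pcie_bandwidth_gbs(lanes, gt_per_s, encoding=128 / 130):
    return lanes * gt_per_s * encoding / 8

print(round(pcie_bandwidth_gbs(4, 8), 2))   # Gen 3 x4 -> 3.94
print(round(pcie_bandwidth_gbs(4, 16), 2))  # Gen 4 x4 -> 7.88
```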