Quad Damage

Member
  • Posts

    8
  • Joined

  • Last visited

Quad Damage's Achievements

  1. There doesn't seem to be a dedicated thread for this clip, but there was talk of turning it into a full video. I've been using an LG OLED as my main monitor for a couple of months now, too - and have not seen most of the issues that Linus describes. There's no hint of burn-in on my panel after more than 500 hours. That's not to say his issues with it aren't real, but a few things did stand out to me:

     1. Linus describes issues with full-screen images being dimmer than small windows - which get overly bright. That indicates he has its brightness set above ~150 nits, because what he describes is the "automatic brightness limiter" (ABL) kicking in. Now he says that he doesn't mind a dimmer display, but the intended brightness for SDR content is 100 nits - which is around 30–40/100 OLED Pixel Brightness. If you're using the display in HDR mode rather than SDR mode, Windows' HDR/SDR brightness balance should be set to 5/100. I know that sounds very low, but it starts at 80 nits at 0 and increases by 4 nits for each tick (see the quick sketch at the end of this post). I don't believe there is any way to disable the ABL. The only thing you can do to minimize it is disable the energy saving mode in the system menu. Beyond that, it's kicking in to protect the panel.

     2. These displays have multiple protections for static images, which may also dim the image over time - even if it is set below 150 nits (though it doesn't sound like these are what Linus was describing). There is the "Adjust Logo Brightness" setting, which tries to dim static on-screen elements like channel logos or game HUDs rather than the entire screen - and can be disabled via the system menu. There is also the "automatic static brightness limiter" (ASBL), which dims the entire display if it detects the image as being "static" - which can happen when using the screen as a monitor. Below 150 nits in SDR, I have found the ASBL's effect to be mild, except when writing text in a full-screen white window. It is quite aggressive in HDR, regardless of the content brightness - which would be distracting if you run the desktop in HDR mode (even at 5/100). The ASBL can be disabled via the service menu by disabling the hidden TPC and GSR options (which I have done on my TV).

     3. Since Linus is using this as a work display, I wonder if it's allowed to run its automated burn-in compensation while on standby - or if the power is being cut at the end of the day. The automatic compensation runs for a few minutes when the TV is switched off after every 4 hours of total use. You can tell when it's running because the relay takes 5–10 minutes to click off, rather than ~3 seconds. Every 2000 hours there is a major compensation cycle, which can take up to an hour. This major cycle can also be run from the user menu, but that is not advised unless you are experiencing problems.

     Of course, none of this changes the fact that Linus seems to have experienced burn-in in a very short amount of time on his displays - which could be a reason to rescind his recommendation of buying an OLED to use as a monitor. But I'm surprised that his experience is so different from mine in a similar time-frame. I have no regrets about making the change so far - especially when it replaced a dying ASUS ultra-wide monitor that I bought five years ago at the same price as the OLED, and when HDR monitors with 'dense' local dimming backlight arrays cost more than twice as much as the OLED (including the burn-in protection warranty I bought with it).
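     Since a setting of 5/100 sounds arbitrarily low, here's a quick Python sketch of the mapping as I understand it - the 80-nit starting point and 4-nits-per-tick step are the behaviour described above, and the function name is just for illustration:

       def sdr_brightness_nits(slider: int) -> int:
           """Approximate SDR white level for a given HDR/SDR balance slider position (0-100)."""
           return 80 + 4 * slider

       for setting in (0, 5, 10, 30):
           print(f"Slider at {setting:2d} -> ~{sdr_brightness_nits(setting)} nits")
       # Slider at 5 -> ~100 nits, i.e. the reference level for SDR content.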
  2. Thanks for killing this for the rest of us. I tested most of the popular applications recently, and TreeSize Free was the fastest by a significant margin when working on 50TB of data (500K files). The only one faster was TreeSize Professional, which is about 50% faster than the free version - and that's about 10x faster than WinDirStat. WizTree was disqualified from the test by requiring Admin permissions to run. That's never going to happen.
  3. It's easy to test this out for yourself. Just leave a CPU monitoring program running in the background and toggle Game Mode. For this test I disabled hardware acceleration in Firefox and played a pair of 1440p60 videos, which together produced about 20-30% CPU usage. Alt-tab to Crysis with Game Mode on, and all other processes are moved to the first 4 threads, while Crysis gets 3/4 of the CPU to itself... not that Crysis needs that much of the CPU to be free, since it's practically a single-threaded game. Alt-tab away from Crysis and the restriction is removed, as Game Mode is only in effect while the game is the foreground application.

     I would recommend against completely loading up the first 1/4 of the CPU like this, though. While the game itself kept running smoothly, mouse aiming in the game became very laggy - I guess the interrupts always run on CPU0, or something like that. That's not usually an issue, so long as you are not intentionally setting up a lot of background CPU usage for a demonstration like this, but it might be on dual-cores or slower quad-cores without hyperthreading.
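     If you don't have a monitoring program handy, a few lines of Python with the third-party psutil package will do - this is just a rough sketch of the kind of per-core readout I was watching, not anything official:

       import psutil  # pip install psutil

       try:
           while True:
               # Blocks for one second, then reports utilisation per logical CPU.
               per_cpu = psutil.cpu_percent(interval=1, percpu=True)
               print(" ".join(f"CPU{i}:{pct:5.1f}%" for i, pct in enumerate(per_cpu)))
       except KeyboardInterrupt:
           pass

     With Game Mode active and the game in the foreground, the background load should pile up on the first quarter of the logical CPUs.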
  4. Game Mode reserves 3/4 of your CPU for the game when it's the foreground application, and moves all background processes to the remaining 1/4. If you have nothing else running on the PC, like your test setup, you're basically throwing away 25% of your CPU performance, which is why you would see a reduction in performance. If, however, you normally have media players, VoIP clients, web browsers and other applications open in the background when you're gaming, restricting them to at most 1/4 of the CPU while the game is the foreground application can help improve performance. Even if those background tasks do not total 1/4 of your CPU, removing any possibility of thread contention can help with consistency, even if the average may drop a little.

     Game Mode is especially good for games which do not fully utilize your CPU, though. If you were to play a game like Crysis - which you showed in your video but apparently did not test - that would show improvements on the quad-core CPUs when using Game Mode, because the game only uses two main threads, and giving them exclusive access to a CPU core each ensures that no background processes can eat into your precious single-core performance. It's not only old games which are affected by this. Forza Horizon 3 is one of the games Microsoft uses as an example because it's so poorly optimized and the majority of the game's workload runs on a single thread. Enabling Game Mode guarantees that main thread has exclusive access to a CPU core all to itself, instead of potentially being shared with background processes.

     If you're trying to play a game like Deus Ex: Mankind Divided - a game which requires more CPU power than a quad-core can provide - enabling Game Mode effectively removes a core from an already CPU-starved game. Unless, of course, your background processes total more than 1/4 of your CPU's capabilities, in which case Game Mode will limit their impact on the game's performance.

     Since many games are only built to utilize 4 cores or up to 8 threads, I've found Game Mode to be quite useful on higher core-count CPUs like AMD's Ryzen, because it will completely reserve 9 threads for the game on an R5 or 12 on an R7 (see the quick calculation at the end of this post). Even if you don't have much else open, there's still some CPU activity from system processes, and Game Mode ensures it has no effect on the game with these CPUs.
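     A quick back-of-the-envelope check of those "3/4 for the game" figures - the Ryzen thread counts assume a 6c/12t R5 and an 8c/16t R7:

       cpus = {
           "4c/4t quad-core": 4,
           "4c/8t quad-core": 8,
           "Ryzen 5 (6c/12t)": 12,
           "Ryzen 7 (8c/16t)": 16,
       }
       for name, threads in cpus.items():
           reserved = threads * 3 // 4  # logical CPUs left exclusively to the game
           print(f"{name}: {reserved} for the game, {threads - reserved} for background tasks")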
  5. Please learn how to do proper game performance testing. If you're testing a CPU, you need to remove any possibility of a GPU bottleneck. That means running at the lowest resolution available, not the highest - or at least something low like 720p.

     Your testing methodology is completely flawed if your conclusion from testing Deus Ex: Mankind Divided was that it isn't a useful game to test with. That game's performance is hugely affected by the CPU in many locations (though possibly not in the built-in benchmark). It will put 100% load on all available CPU cores, and you can swap out the GPU without seeing any performance change at all - you just see the GPU load drop when you use a faster GPU - which makes it a pure CPU test. And it's demanding enough that even the fastest CPUs will all land well below 100 FPS in the demanding areas of the game, so it's not like testing older games where one CPU might run the game at 500 FPS and another at 550 FPS, which doesn't really tell us anything useful.
  6. Disappointing to see that the LTT channel has been full of clickbaity, content-lite videos recently. Reasons to buy a high wattage power supply:

     • Efficiency under load. High wattage 80 PLUS Titanium PSUs are generally still as efficient as - if not more efficient than - lower wattage PSUs under gaming loads.

     • Inefficiency at idle doesn't matter. Even if your high wattage power supply is 10-20% less efficient than an "optimally" sized PSU for your build based on its efficiency curve, that only works out to single-digit differences in power consumption now that hardware is so efficient. It's at high loads where efficiency really matters for a desktop system (see the rough numbers at the end of this post).

     • Higher wattage power supplies are built to stricter standards and use better components for maximum efficiency and lifespan. If we stick with Corsair, the high-end (high wattage) ones are warrantied for 10 years.

     • High wattage PSUs can remain fanless under higher loads. An AX1500i is fanless up to ~525W; an RM650i is only fanless up to 260W. Keep in mind that these are ratings for 25℃ - the fan will kick in sooner if things heat up, or after prolonged load.

     Do you need more than a 650W power supply for a typical modern single-GPU build today? No, not at all. Are higher wattage power supplies stupid? Definitely not. My AX850 is nearing the end of its 7 year warranty, at which point I'll be replacing it - and likely doing a complete new build. If anything, I'm thinking that I will be going up in wattage, not down - if only to keep the PSU running fanless most if not all of the time.

     Personally, I'd never recommend a passively cooled power supply. I'd much rather the PSU included a fan and was able to regulate its temperature if necessary (e.g. prolonged gaming loads on a hot summer's day) than have it running hot and inefficient because it can't do anything about it. Better to keep the components at a stable temperature for longevity.
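     To put the idle-efficiency point in perspective, here's a rough calculation - the load and efficiency figures are illustrative assumptions, not measurements:

       def wall_power(dc_load_w, efficiency):
           """AC power drawn from the wall for a given DC load and PSU efficiency."""
           return dc_load_w / efficiency

       # Idle-ish load: an "optimally" sized unit at 88% vs an oversized unit at 80%.
       print(f"60 W load:  {wall_power(60, 0.88):.0f} W vs {wall_power(60, 0.80):.0f} W at the wall")
       # Gaming load: the oversized Titanium unit is at least as efficient (94% vs 90%).
       print(f"400 W load: {wall_power(400, 0.94):.0f} W vs {wall_power(400, 0.90):.0f} W at the wall")

     At idle the gap is about 7W; under a gaming load, the more efficient unit saves closer to 20W.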
  7. Your math is wrong for the latency calculation. Latency is 13 frames on the server and 24 frames on the client with a 240 FPS camera.

     Each frame takes: 1000 ÷ 240 = 4.167ms
     Server latency: 13 × 4.167 ≈ 54ms
     Client latency: 24 × 4.167 ≈ 100ms

     Instead of calculating the latency, you divided the number of frames by the duration of a frame:

     13 ÷ 4.167 = 3.12
     24 ÷ 4.167 = 5.76
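     In code form, with the same numbers, just to make the arithmetic explicit:

       camera_fps = 240
       frame_ms = 1000 / camera_fps            # ≈ 4.167 ms per camera frame

       print(f"Server latency: {13 * frame_ms:.0f} ms")   # ≈ 54 ms
       print(f"Client latency: {24 * frame_ms:.0f} ms")   # ≈ 100 ms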