
Hans Power


1 Follower

About Hans Power

Profile Information



  • CPU
    AMD R5 5600x
  • Motherboard
    MSI MAG B550m Mortar Wifi
  • RAM
    32GB Crucial Ballistix 3200 Rev E
  • GPU
    MSI RTX 2070 Armor
  • Case
    Xigmatek Aquila
  • Storage
    SSD: Crucial BX300 120GB | Samsung 840 EVO 120GB | Crucial m500 120GB | HDDs: 2x Seagate Barracuda 4TB
  • PSU
    Corsair RM650i
  • Display(s)
    MSI Optix MAG274R, iiyama XB2783HSU-B1DP
  • Cooling
    CPU Cooler: Scythe Fuma 2 | Casefans: Bitfenix Spectre LED red 200mm (Intake), Bequiet Pure Wings 2 140mm (Exhaust)
  • Keyboard
    Cherry MX-Board 3.0
  • Mouse
    Logitech G9x
  • Sound
    Realtek® ALC1200/ALC1220P Codec
  • Operating System
    Windows 10 Pro 64 Bit
  • Phone
    Motorola Moto X4

Recent Profile Visitors

1,807 profile views
  1. Yeah, that does it, thanks - also, changing "Monitor Technology" to fixed refresh rate in the app-specific driver settings works as well.
  2. So, I just noticed - when I have Freesync (Premium) active, some apps (for example my 3D printing tools Cura and Simplify3D) tank the refresh rate in Windows 10 from 144Hz to 50Hz when the respective window is focused. I can actually see in my monitor's OSD that the refresh rate decreases to 50Hz and recovers to 144Hz. When I deactivate Freesync, those apps - and Windows 10 along with them - run as expected at 144Hz. Games run without issue, with every advantage you'd expect from Freesync Premium, btw. Is there any way to solve this other than deactivating Freesync when those tools are active?
  3. Took me quite a bit of tinkering, but this doesn't look half bad in my opinion. First, I apparently needed the latest GPU drivers to get HDR colors properly displayed on my MSI MAG274R. Then I used the MSI Dragon Center to crank up contrast and color tonality/curve (not sure what it's labeled in English) to the max. And lastly, in-game I reduced brightness from the standard 50% to 30% and increased the HDR luminance to 4000 (although past 2000 you only see marginal differences - setting it to the 300 or 400 the monitor is supposed to work at looks like garbage for some reason, though). After first at
  4. I think if anything it has more to do with the PBO limits you disable during the process, 'cause the clocks were dropping right from the start, before reaching any significantly high temperatures. Eventually I need to check how it behaves with the PBO limits disabled but without undervolting.
  5. I undervolted my R5 5600x almost instantly after buying it, but I just did a BIOS update, which always resets my settings, so I thought I'd take the chance to let the system run at stock voltage and see how it performs with my relatively new Scythe Fuma 2. So, first: albeit increased, temps are still fine, although the cooler certainly spun up a bit more often. However, boost clocks are so much more stable after undervolting the CPU. It makes about a ~200 MHz difference while boosting under heavy load. Undervolted, even under full load tested with OCCT, the CPU is almost always boosting all cores
  6. So, up until now I had a normal 60Hz monitor which I used with V-Sync on, which meant that my GPU rarely ran under maximum load. Now I upgraded to an MSI MAG274R Freesync Premium monitor and I gotta say - Freesync is flat-out amazing! Everything is soooo smooth - I honestly had no idea what I was missing out on till I had it. However, with that I run pretty much every game at an uncapped framerate, and that means 100% GPU load for most of the games I play. The GPU (MSI RTX 2070 Armor) is also overclocked (+190MHz GPU, +600MHz memory, 114% power limit). Temps get as high as 78°C (at least during
  7. So, I'm wondering - how is Freesync ideally to be used on Nvidia GPUs? Examples:
      • Deactivate V-Sync ingame/in the control panel
      • Activate V-Sync ingame/in the control panel
      • Activate Fast Sync in the control panel
      • Use the FPS limiter to limit games to the monitor's maximum refresh rate (144Hz in my case)
    So, which would be the ideal standard?
  8. The first image is with HDR disabled (which is how it should look) and the second with HDR enabled - as you can see, colors are just undersaturated, washed out and too bright, without much contrast.
  9. Yeah, but I doubt that it's supposed to be this unusable. It's just super overbright with washed out colors. As if I turned the Gamma slider in the Nvidia control panel all the way up.
  10. So, the new monitor just came in, but unfortunately when I activate HDR, the colors look completely pale and washed out - in Windows 10 as well as in games; I tried Cyberpunk 2077 and Assassin's Creed Odyssey. I already tried it with the supplied HDMI cable as well as a DisplayPort cable. I also tried to install and enforce the MSI color profile, but that also doesn't make a difference. Windows 10 has all the latest updates and is on version 20H2. The SDR/HDR brightness setting doesn't fix it either, nor does using full dynamic range. So, what else can I try?
  11. Alright, the second attempt at getting a USB front panel for the internal USB-C header on the mainboard actually worked, and I could finally get rid of that DVD burner (I still have a laptop drive which I can use as an external drive via adapter, though, if some medieval knight happens to show up with his DVD collection). The first panel looked nice and had all the things, but unfortunately most of these things didn't work. This one is slightly less fancy, but it'll serve its purpose. It also fits the aesthetics of the case perfectly and was quite a bit cheaper than the first thingamajig. So, I guess, now
  12. Well, the whole thing might also be just a driver or engine issue. I know with Control there is a registry fix which prevents driver crashes when overclocked, which actually works, so I'm not quite sure if I put it correctly when I call it "stability" issues. But whatever causes the driver crashes - they are related to overclocks and they gotta go. I think some games are extremely sensitive when it comes to overclocks while others don't care. I mean, with my original overclocks I can stress test for hours without any error found, after all. Right now I'm trying to find an overclock whi
  13. Right now it sits at +500. I decreased it to +300 but it still crashed the driver, so I don't think it has an impact on stability. I think I even had the memory clocks at +600 initially and it seemed to work fine. No need to push it, though - 100MHz more or less on the memory doesn't seem to have a huge performance impact anyway. But that raises the question of whether increasing memory clocks eats into the power limit, and whether it should be decreased so that the GPU can clock higher instead (if the silicon is able to, that is).
  14. Yeah, that would make the most sense. I'm gonna try +190MHz next. Can't be that much, since it runs like 95% stable already, but those 5% start to get on my nerves.
  15. Yeah, but won't letting it boost higher due to a higher power limit decrease stability, simply by allowing higher maximum clock speeds? Or would the higher power limit actually increase stability, because the card IS able to draw more power? It seems to run longer till I get a driver crash in MW5 with stock power limits - however, that might also indicate that my GPU clock is just too high, so when I let the card "stretch its legs" it starts to stumble with the increased power limit. I'm just not quite sure how things interact with GPU Boost 2.0.
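The FPS-limiter option from post 7 can be sketched with some quick numbers. This is a hypothetical helper based on common community guidance (cap a few FPS below the panel's maximum refresh so frames stay inside the variable-refresh range), not an official Nvidia recommendation; the function names and the 3 FPS margin are assumptions for illustration.

```python
# Hypothetical helpers for picking a VRR-friendly FPS cap.
# Guidance assumed: capping slightly below max refresh keeps the game
# inside the Freesync range instead of hitting the V-Sync/tear boundary.

def vrr_frame_cap(max_refresh_hz: int, margin: int = 3) -> int:
    """Return an FPS limit slightly below the monitor's max refresh rate."""
    return max_refresh_hz - margin

def frametime_ms(fps: float) -> float:
    """Frame time in milliseconds for a given frame rate."""
    return 1000.0 / fps

print(vrr_frame_cap(144))              # cap for a 144Hz panel -> 141
print(round(frametime_ms(60), 1))      # 60Hz frame time  -> 16.7 ms
print(round(frametime_ms(144), 1))     # 144Hz frame time -> 6.9 ms
```

The frame-time numbers also show why the 144Hz upgrade in post 6 feels so smooth: each frame is on screen for less than half as long as at 60Hz.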