
Mortis Angelus



About Mortis Angelus

Profile Information

  • Gender
    Not Telling


  • CPU
    Intel i7-8700K (currently at stock settings)
  • Motherboard
    Asus Prime Z370-A
  • RAM
    Kingston HyperX Predator 4x4GB DDR4 3000 MHz (running 3200 MHz) CL15 (HX430C15PB3K4/16)
  • GPU
    Gigabyte Aorus GTX 1080Ti Xtreme Edition 11GB (currently running factory OC)
  • Case
    Fractal Design Define R6 Tempered Glass
  • Storage
    SSD 1: Crucial MX500 500 GB | SSD 2: Samsung 850 EVO 1TB
  • PSU
    Corsair RM750x v2 750W
  • Display(s)
    AOC Agon AG271QG and Dell U2711
  • Cooling
    Cooler Master Nepton 240M AIO
  • Keyboard
    Cooler Master CM Storm Trigger Z (Cherry MX Brown)
  • Mouse
    Logitech G502 Proteus Core
  • Sound
    Creative GigaWorks T40 Series II | Sennheiser RS 165 Wireless Headphones + Zoom H4n used as desktop microphone | Creative Sound Blaster Tactic3D Sigma Headset
  • Operating System
    Windows 10 Home

Recent Profile Visitors

1,847 profile views
  1. (I wasn't sure which sub-category to post this in, so feel free to move it.) I know Windows HDR is not that great, and frankly I don't care much for it. But for games it would be dope to have it enabled. So can I keep Windows in non-HDR mode and then, in a specific game (e.g. Shadow of the Tomb Raider), choose HDR just for that game? Or will that not work?
  2. Just wanted to say, I just ordered the C9. Thanks everyone for your input and thoughts! @Stahlmann, if I may, can we stay in contact via DM regarding that calibration further down the line?
  3. Apparently. When I googled it, it just said it wouldn't work. If it did work, I could actually do it myself.
  4. Thanks for the insight. As a former photographer and content creator I already own an X-Rite ColorMunki and have my PC monitors calibrated. I didn't know they used that kind of hardware for TVs too; I thought they had some other, fancier stuff. Will the ColorMunki work with Calman Pro too, do you think? EDIT: Just checked: of course the ColorMunki DOES NOT work with Calman Home... Is AutoCal now available for the CX as well? I thought that was unavailable for the CX according to Quantum TV? Or has that been added in a firmware update?
  5. So BFI is really no use then? Movies I will watch at their original frame rate anyway (usually 24 fps), and most other content will be 60 fps on YouTube etc. Gaming will be 1440p 120 Hz, as I don't have an HDMI 2.1 GPU yet. And there I guess game mode won't allow BFI anyway, to keep response times as low as possible? Or am I wrong here? Good to hear about the upscaling; I have some vague memory that the AI upscaling on the CX is quite hit and miss. I currently don't have any DTS speaker system, but I'd like to have the option in case I want to upgrade down the line...
  6. Gaming is a secondary function, as I have my high-refresh monitors for that. The main use is movie watching; a home-cinema experience. But I will be connecting a PC as well for casual gaming, and might be getting a PS5 down the line.
  7. I've been looking to buy an OLED TV for quite some time now. My plan was to wait for the C9 to drop significantly in price, but it is too popular, so now both the C9 and the CX are priced the same where I live. Which one to choose? I've been nerding out on reviews from HDTV Test, Rtings, Quantum TV and Digital Trends, but I still cannot make a decision. The C9 has the full-bandwidth 48 Gbps HDMI 2.1, DTS support, higher peak brightness and less aggressive ABL; the CX has 120 Hz BFI, better color accuracy, no yellow tint, theoretically better 4K up
  8. On September 23, I got a major update with the Win10 2004 update. The first thing I noticed was the usual: all my microphone settings for all games and software were reverted to default, and all my sound devices were reverted to default, so I had to set up my sound card again along with all the gain settings etc. This is pretty normal, and has happened every time I've received a core update. However, what is more annoying is that in games I've now noticed that I miss some audio. Like there is LEFT SPACE and RIGHT SPACE, but no middle. In XCOM Chimera Squad, I could not hear any cut-
  9. Thanks for the advice. Although I think you misunderstood the concept; I would be using one PSU for both systems. That is what those Phanteks components are designed for (and Phanteks even has a similar board for combining two PSUs as well). The whole reason I ask about this is that I don't want to have a million different switches. Back when pumps and fans weren't PWM-controlled and were controlled manually, how did that work?
  10. Please elaborate. I am a complete beginner when it comes to electrical systems and connecting electrical equipment outside of its intended use.
  11. Let's say I have two separate PC systems, but I only want one loop. If I used a Phanteks dual PSU or their splitter board, either system should be able to power up the PSU (also, how does booting the second system not shut down the PSU for the first system?), and if I have manual pump and fan control hooked up directly to the PSU, it should work, right, since the fans and pumps are not dependent on the temps of either system?
  12. Is there any truth to these claims found on the NVIDIA devblog site? https://devblogs.nvidia.com/nvidia-turing-architecture-in-depth/ "Turing introduces a new processor architecture, the Turing SM, that delivers a dramatic boost in shading efficiency, achieving 50% improvement in delivered performance per CUDA Core compared to the Pascal generation." If this is true, the CUDA performance of the 2080 should be equivalent to roughly 4,400 Pascal CUDA cores (the 1080 Ti "only" has 3,584 CUDA cores). I know it usually doesn't scale 1:1, but it might be safe to say the 2080 should be better at
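A quick sanity check on the arithmetic in that post, as a Python sketch. The core counts are the public spec-sheet figures; the 50% per-core uplift is NVIDIA's marketing claim, not a measured real-world speedup:

```python
# Rough "Pascal-equivalent CUDA cores" estimate for the RTX 2080,
# based on NVIDIA's claimed ~50% per-core improvement for Turing.
PASCAL_CORES_1080TI = 3584   # GTX 1080 Ti CUDA core count
TURING_CORES_2080 = 2944     # RTX 2080 CUDA core count

per_core_uplift = 1.5        # NVIDIA's "50% improvement" claim
pascal_equivalent = TURING_CORES_2080 * per_core_uplift

print(pascal_equivalent)     # 4416.0, i.e. the "roughly 4,400" in the post
print(pascal_equivalent > PASCAL_CORES_1080TI)  # True, on paper
```

So the post's number checks out, with the big caveat that per-core claims like this rarely translate 1:1 into application performance.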
  13. How so? Are CUDA cores also affected by the clock speed? I really have no clue how those calculation cores work in cooperation with the normal GPU core.
  14. Hello, for work we'd need to use CUDA cores for computation, and the software manufacturer has recommended the RTX 2080 for this task. But afaik it only uses CUDA cores. Wouldn't the 1080 Ti be a better option then, as it has more CUDA cores, higher memory bandwidth and more VRAM? On the other hand, this will probably be used via an external enclosure and a Thunderbolt 3 interface, so it might not make any difference at all. But I'd imagine we could save a buck as well by going with the 1080 Ti.
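Regarding the clock-speed question above: yes, CUDA throughput scales with clock as well as core count. Peak FP32 throughput is conventionally estimated as cores × clock × 2 (a fused multiply-add counts as two FLOPs per cycle). A sketch comparing the two cards using their reference boost clocks; real workloads rarely reach these theoretical peaks, and partner cards boost higher:

```python
def peak_fp32_tflops(cuda_cores: int, boost_clock_ghz: float) -> float:
    """Theoretical peak FP32 throughput: cores * clock * 2 FLOPs/cycle (FMA)."""
    return cuda_cores * boost_clock_ghz * 2 / 1000.0

# Reference boost clocks from the public spec sheets.
gtx_1080_ti = peak_fp32_tflops(3584, 1.582)  # ~11.3 TFLOPS
rtx_2080 = peak_fp32_tflops(2944, 1.710)     # ~10.1 TFLOPS

print(round(gtx_1080_ti, 1), round(rtx_2080, 1))
```

By this raw-FLOPS measure the 1080 Ti is slightly ahead despite the 2080's higher clock, which is why the question hinges on whether the per-core architectural gains (and the Thunderbolt 3 bottleneck) matter for the specific workload.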