About Kalm_Traveler


Profile Information

  • Location
    Utah, USA
  • Occupation
    IT Security Analyst


  • CPU
    i9 10980XE
  • Motherboard
    Asus Rampage VI Extreme Omega
  • RAM
    64GB (8x8GB) G.Skill Trident Z DDR4-3600 CL14
  • GPU
    2x Nvidia Titan RTX in NVLink SLI
  • Case
    Thermaltake Tower 900 black
  • Storage
    2TB Samsung 960 Pro and 4TB Samsung 850 EVO
  • PSU
    Corsair AX1600i
  • Display(s)
    LG 38GL950G-B
  • Cooling
    2x 560mm radiators for separate CPU and graphics cards loops
  • Keyboard
    Massdrop CTRL with Kailh White switches, mix of several PBT keycap sets
  • Mouse
    Speedlink Decus Respec
  • Sound
    onboard Realtek
  • Operating System
    Windows 10 Pro
  • Laptop
    Razer Blade Pro 2019 refresh


  1. That doesn't mean that temperature reductions can't be had by delidding and replacing the solder with liquid metal, or by going direct-die cooling. "Worth it" is subjective - if a person values a 4 C temp reduction, it's worth it to them. I've delidded a 9900K and a 9900KS for direct-die, which gives quite a good temp drop, and I'd like to delid my 10980XE, but the fear of shearing off the SMDs near the IHS contact points, plus the work of having to resolder them, has made delidding that soldered CPU "not worth it" to me so far. Might change my mind in the future. *EDIT* The 9900K doesn't have any SMDs near the IHS contact points, so no risk there, though some have suggested you might crack the silicon die itself when breaking the solder with a delid tool. I've only delidded one 9900K and one 9900KS, so not a lot of experience delidding soldered CPUs, but neither appeared to be at much risk of damaging the silicon.
  2. Yeah, this was the first time I'd come across it. I tried 3 mice, all less than 2 years old, and they all did the same thing - they'd flicker on and off. All 3 work fine through the old Asus monitor, and all 3 don't work through this brand-new LG. I asked around a few places, and someone mentioned that some USB hubs (in general, not just the ones built into monitors) don't handle polling devices well - which I guess a mouse is.
  3. It should work in any USB-C or Thunderbolt 3 port. We use those types of multi-port adapters on Thunderbolt 3 Macs at work all the time. It isn't a Thunderbolt dongle, just normal USB-C, so it should work in any USB-C port (laptop, desktop, phone, etc.).
  4. You may want to test things - the USB hub built into my older Asus PG348Q monitor works fine for keyboard and mouse, but the one built into this new LG monitor doesn't seem to work correctly with mice. The keyboard worked fine in either port, but the mouse would never connect, so I ended up saying screw it and plugged them both directly into motherboard USB ports.
  5. I think you're confused because those types of PCI-E riser cards (mainly used for mining) use physical USB 3.x connectors and a USB cable to link their two pieces. Nothing on those is electrically USB at all - they only use the connector and cable to carry an x1 PCI-E signal between the small piece that goes on the motherboard and the larger piece that the graphics card connects to. As far as an external graphics card over USB 3.1: it's physically possible, but the results would be much worse than Thunderbolt 2 or Thunderbolt 3 external docks due to limited bandwidth and the way USB handles devices. Someone correct me if I'm recalling this wrong, but USB devices have inherent latency because the host polls them in turn. A Thunderbolt-connected device essentially gets direct PCI-E lanes (at least to the TB controller), so it can communicate with the CPU at any time - kind of like how PS/2 keyboards/mice can interrupt the CPU to process their input at any time but a USB keyboard/mouse can't.
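To put rough numbers on the bandwidth gap, here's a back-of-the-envelope sketch - the usable-fraction factors are my own assumptions to account for encoding and protocol overhead, not measured values:

```python
# Rough usable-bandwidth comparison for an external GPU link.
# Efficiency factors are assumptions (encoding + protocol overhead), not measurements.

def usable_gbps(raw_gbps, efficiency):
    """Approximate usable throughput after encoding/protocol overhead."""
    return raw_gbps * efficiency

links = {
    # raw line rate (Gbps), assumed usable fraction
    "USB 3.1 Gen 2":        usable_gbps(10, 0.8),        # 10 Gbps line rate
    "Thunderbolt 3 (PCIe)": usable_gbps(40, 0.8),        # ~32 Gbps of PCIe data in practice
    "PCIe 3.0 x16 slot":    usable_gbps(8 * 16, 0.985),  # 8 GT/s per lane, 128b/130b coding
}

for name, gbps in links.items():
    print(f"{name:22s} ~{gbps:5.1f} Gbps usable")
```

Even ignoring the polling latency, USB 3.1 gives the card roughly a quarter of what a Thunderbolt 3 dock can, and a small fraction of a full x16 slot.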
  6. Let us know if you find anything out about this. I have both versions of the Xbox One Elite controller and was initially sad that the original requires an Xbox One wireless adapter on PC (I never saw anything about getting a normal Bluetooth dongle to work with it). Granted, I don't use Bluetooth on the new version now since I got used to using the original cabled, but I'm still curious.
  7. Ahh, good idea and thank you - that should work for sure. Howdy - yes, indeed they do sorta track sleep via movement. I have been wearing a Samsung Galaxy Fit 2 smartwatch since they launched (last October, I think), which also tracks it via heart rate, and for just over two weeks I've worn an oximeter (the things they put on your finger to test how much oxygen is in your blood), which tracks oxygen, heart rate and movement. I'm definitely moving a lot, and I get random weird heart-rate spikes from 40-50 suddenly up to 100-120, but I want a camera to see what the heck I'm doing during all this time.
  8. Kind of a weird question, I know - but I'm looking for a camera I can set up to watch/listen to my sleep, because I've been diagnosed with severe insomnia and sleep apnea, but my health insurance says they won't pay for a sleep study because, in their words, "your records show that you do not have insomnia or daytime sleepiness, so this is not medically necessary". Anybody know of a good cheap camera with decent-enough IR night vision that I could have streaming to my PC in another room (and software to do the recording)? I guess a baby monitor camera would work, but I have no experience with anything like this, so I'm not sure what to look for exactly. Thanks in advance!
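One hedged sketch for the recording side, assuming whatever camera gets picked exposes an RTSP stream (many cheap IP and baby-monitor cameras do - the URL and IP below are placeholders, check the camera's manual for its real stream path): let ffmpeg on the PC copy the stream to disk in hourly chunks.

```python
# Sketch: record an IP camera's RTSP feed overnight with ffmpeg.
# The camera URL is a placeholder; stream copy ("-c copy") means no
# re-encoding, so CPU load on the recording PC is negligible.

def ffmpeg_record_cmd(rtsp_url, out_pattern="sleep_%03d.mp4", segment_secs=3600):
    """Build an ffmpeg command that splits the recording into hour-long
    segments, so a crash doesn't corrupt one giant all-night file."""
    return [
        "ffmpeg",
        "-rtsp_transport", "tcp",   # TCP is more reliable than UDP over Wi-Fi
        "-i", rtsp_url,
        "-c", "copy",               # copy the stream as-is, no re-encode
        "-f", "segment",            # segment muxer: split output into chunks
        "-segment_time", str(segment_secs),
        "-reset_timestamps", "1",   # each segment starts at t=0
        out_pattern,
    ]

cmd = ffmpeg_record_cmd("rtsp://192.168.1.50:554/stream1")
print(" ".join(cmd))
```

Run the printed command in a terminal (with ffmpeg installed) and stop it with q in the morning; the hourly files can then be skimmed for movement.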
  9. Thank you! So a 2070 Super will be able to play current games at 4K resolution with max graphics settings and maintain 60+ fps? I don't think the 3700X stock cooler would fit in the Node 202 case, so I'll be keeping the Scythe Big Shuriken 2 unless it's also too tall on top of AM4. You may be on to something about just upgrading the graphics card for now, though. I know there are already some games that want 8 physical cores (RDR2), which is why I was thinking the 3700X might be a good choice - double the cores and threads, plus higher boost clocks and better IPC in general. I don't have any consoles, so that's not a worry; I just want to make sure the TV setup can game as well as everything else in the man cave for when I have a few friends over.
  10. Posted this on r/htpc, but I want you guys' input too, if that's all right: Hi folks - I currently have a decent HTPC hooked up, but after upgrading to a 4K HDR TV I'm noticing that it's struggling to keep up with modern PC games, and I kind of want to pull the trigger on new guts for it. I'd like to keep it in the existing Node 202 and retain the SFX 600W PSU if it's enough to power the new parts. Speaking of, current parts are as follows:
    • Asus ROG Strix Z270-I Gaming motherboard
    • Intel Core i7 7700 (non-K, delidded + liquid metal)
    • Scythe Big Shuriken 2 rev. B CPU cooler with thin Noctua 120mm fan
    • 16GB (2x8GB) 2666MHz CL15 Corsair Vengeance LPX DDR4
    • MSI GeForce GTX 1070 Gaming graphics card
    • Samsung 850 Pro 512GB SSD
    Use is maybe 50/40/10 couch gaming / emulators / media watching, so obviously a better graphics card will help with the 4K resolution. I figure these parts are all fairly old now, though, and I'd like to upgrade the full platform so it stays adequate for a good 4-5 years before I get the bug to upgrade it again. Parts I'm thinking of plunking down for:
    • Asus ROG Strix X570-I Gaming motherboard
    • AMD Ryzen 7 3700X CPU (because it's much beefier than the 7700 and still 65W TDP)
    • 16GB (2x8GB) 3600MHz CL16 DDR4
    • best air-cooled 2080 Ti that will fit the Node 202
    • decent 2TB NVMe SSD - maybe Samsung 970 Evo Plus, but maybe something PCI-E Gen 4, because why not
    So my questions for you guys: Does this parts list seem like a good, logical upgrade that should last me at least 4 years of 60+ fps gaming? Any reason other than price to drop to a 2080 or 2080 Super instead of the 2080 Ti? Will the existing Corsair 600W SFX PSU be enough for the new parts? Will the Scythe Big Shuriken 2 rev. B cooler still fit both on the AM4 motherboard (I'm pretty sure it included AM4 brackets) and within the Node 202 case? I don't know if there's a mounting-height difference, but as is I had to remove the internal filter to fit it on top of the 7700.
Please let me know what you guys think, and thanks in advance!
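For the PSU question, a quick back-of-the-envelope sum of board-power numbers can be sketched like this - the wattages are ballpark spec-sheet figures and the 1.4x headroom factor is my own rule of thumb, not a spec:

```python
# Rough PSU sizing check for the proposed Node 202 build.
# Wattages are ballpark spec-sheet values; the 1.4x headroom factor
# (for transient spikes and PSU efficiency sweet spot) is a rule of thumb.

parts_watts = {
    "Ryzen 7 3700X (65W TDP, ~90W at boost)": 90,
    "RTX 2080 Ti (~260W board power)":        260,
    "Motherboard + RAM + NVMe SSD":           50,
    "Fans / USB / misc":                      25,
}

total = sum(parts_watts.values())
recommended = total * 1.4  # headroom for power spikes

print(f"Estimated steady load: {total} W")
print(f"With ~40% headroom:    {recommended:.0f} W")
print("600W SFX PSU:", "should be fine" if recommended <= 600 else "cutting it close")
```

Under these assumptions the 600W SFX unit lands right around the recommended figure, so it should work, but with a 2080 Ti there isn't much margin left for overclocking.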
  11. Anecdotally, I've seen more memory issues on Turing cards with Micron VRAM, and historically Samsung memory chips seem to be better for overclocking, but I wouldn't make a blanket statement that Micron is bad. They're one of the largest memory manufacturers in the world because their chips are generally good - I think they just had a big batch of worse-than-expected chips that ended up in the first few rounds of RTX graphics cards. That being said, I'm paranoid and have only bought RTX cards with Samsung memory as a result of those early issues.
  12. We need to make a movie about you - a rural programmer in the mid-2010s getting by on a 20-year-old rig and a 6GB monthly data cap... I get chills just thinking about it. I've lived in Utah my entire life and always thought we had it bad out here due to low population density, but I remember growing up we had 256kbps DSL in 1997, and when I first moved out with friends in 2003 we had cable that was, I think, 50mbps.
  13. I love the progress pics and how cleanly you were able to do this... but after reading the OP I have just one question: how in the world were you running anything on 512MB of RAM in 2010, before that upgrade?? I had 2GB of RAM in my final Pentium 4 rig, and 4GB in my last Windows XP rig with the Core 2 Duo E8500, in I think 2008?
  14. I would check with HWiNFO to see what those temp sensors actually are - "Temp #11" doesn't give you any clue what it is, though I'd be a little concerned myself about anything sitting at 83 C.