eats_vegetables

Member
  • Posts

    7
  • Joined

  • Last visited


eats_vegetables's Achievements

  1. Awesome, thanks everyone! This gave me just the right amount of stuff to google. I think the peer-to-peer approach will be perfect for now, even without the switch, since at most I'll have two computers feeding off the server. And realistically, only the workstation will benefit from 10GbE; the render node can take its sweet time.
  2. Hey folks, Last time I studied networking, gigabit was still pretty new, so quick question: I have just built an Unraid-based NAS and I'd like to use it as the actual work storage for my workstation (VFX stuff, fairly large sequential reads and writes). 10GbE would be fantastic, but I don't have the budget to rebuild my entire home network. I have a gigabit router upstairs connected to several devices, then a cat5e cable running down to the basement into a switch, with two workstations and the NAS connected to that switch. Is it possible to replace the switch and all the downstream devices with 10GbE parts but leave the router alone? I realize this falls squarely in the stupid question bin, but... please, I'm old! (For checking what the current link actually delivers, see the throughput sketch after this list.)
  3. Yeaaaah, it can be nice to have a fresh start, but... man, I carried the same Windows install through the last 2 years of multiple system upgrades, including Intel-to-AMD swaps, multiple GPUs, etc. I finally did a fresh install this weekend while trying to diagnose an issue that turned out to not be related at all. Still, Windows installs so quickly and easily these days that if you can spare the little bit of time to back up and reinstall the bits you need, it's a pretty nice feeling to have a clean slate.
  4. Also, the usual: make sure the RAM is seated properly, etc. Corsair is pretty decent RAM, but you might have gotten unlucky. For reference, I ran memtest86 the other night for a little over 12 hours with no errors on 64GB of Corsair Vengeance LPX 3200.
  5. Aaaand replying back for posterity: if you're suffering from inexplicable crashes in Houdini (and Houdini only...) and you've got a high-core-count AMD CPU, try disabling Core Performance Boost in your BIOS. Ugh. Or be like me and waste 5 days trying almost everything else...
  6. Well, it appears this is caused by Houdini not playing nice with Core Performance Boost and potentially SMT. Ugh... slightly disappointing, but I'll be incredibly relieved if this actually works.
  7. Hey everyone, I'm currently experiencing some perplexing crashes in Houdini 18 (a heavily multi-threaded 3D software application). These are hard shutdowns: no bluescreens, no crashes to desktop, just a full-on *poof*, black screen and a locked-up PC. When this happens, not even holding down the power button works; I have to flip the switch on the power supply to shut it down and turn it back on. This ONLY happens in Houdini! I've tried running every stress test I can find: memtest86 for 12 hours straight with no errors, and Intel Burn Test on "very high" WHILE running FurMark to stress both the CPU and GPU to their max. I've played games while letting Mantra use half the cores. The system is perfectly stable in all conditions, except... on random but frequent occasions, clicking a node in Houdini will cause the blackouts. It could be opening the parameters for a node, adding a new tool from the shelf, or clicking the display flag on a node. I've gone two days without crashes only to then have the system lock up every 5 minutes for 2 hours.

     It appears to have started after a system upgrade. Sorry for the long history, but maybe it'll give someone some clues. I was running Houdini 18 (latest build, although this has happened with fresh installs of both 18 and 17.5) perfectly fine on an AMD 3600X build. This was my gaming PC with the following specs:
     - AMD Ryzen 5 3600X
     - Corsair H115i Pro 280mm AIO cooler
     - MSI B450 Pro Carbon AC
     - 16GB Corsair Vengeance LPX 3000 RAM
     - Corsair 550W PSU
     - Gigabyte GTX 1080
     - 1TB NVMe, 250GB SSD, 1TB HDD

     Houdini is a pretty demanding software package, so I quickly realized I needed better hardware. I initially swapped the CPU and RAM for a 3950X and 64GB of RAM. Right away the crashes started happening. It seemed to point towards the PSU being insufficient given the added power consumption of the 3950X, so I upgraded to a Corsair RM750x PSU. This appeared to fix the issue; for two days I was happily chipping away, rendering, simulating stuff, having a ton of fun. Until yesterday, when it came back with a vengeance. I did a fresh Windows 10 install yesterday, which didn't help. Again, I ran every stress test I could think of, with no errors or issues.

     Things on my list to try:
     - Delete the Houdini 18 preferences folder, although a fresh install of 17.5 with its own prefs folder didn't help
     - Move the preferences, project and temp folders from the NVMe to the SSD or HDD. The NVMe drive is about 6 months old with no signs of problems, but... who knows?!
     - Swap out the GPU for an older GTX 1070
     - Try it on Linux

     For completeness' sake, here's the full system as it currently stands:
     - AMD Ryzen 9 3950X
     - Corsair H115i Pro 280mm AIO cooler
     - MSI B450 Pro Carbon AC
     - 64GB Corsair Vengeance LPX 3200 RAM (2x32, slots 1/3)
     - Corsair RM750x PSU
     - Gigabyte GTX 1080
     - 1TB NVMe, 250GB SSD, 1TB HDD
     - Fresh Windows 10 install with the latest NVIDIA Studio drivers and fresh Houdini 18/17.5 installs

     Anything else anyone can think of? This really seems like a hardware issue, but I'm stumped at not being able to replicate the crashes on ANYTHING other than Houdini. The CPU would be the most likely culprit, but I'm not sure I can find a better way to test it, since it seemingly works fine with everything else (see the all-core stress sketch after this list for the kind of load I mean). Nothing is overclocked; the RAM is running on its XMP profile. Someone suggested disabling Core Performance Boost. Are there any other BIOS tweaks I can try? Everything is stock in the BIOS, other than the boot order and disabling the wifi/BT adapters. When the crashes happen, the MSI "EZ Debug" CPU LED is lit up. Kinda wish that debugging feature was a bit more descriptive...
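A minimal sketch of the kind of all-core load described in post 7, assuming Python 3 and only the standard library; the worker count, duration and inner-loop math are illustrative choices, not taken from the original post. It simply keeps every logical core (SMT threads included) busy with floating-point work for a fixed time, which is one way to try to reproduce a Mantra-like sustained load outside Houdini.

    import math
    import multiprocessing as mp
    import os
    import time

    def burn(seconds):
        # Spin on floating-point math for `seconds`; return the iteration count.
        deadline = time.monotonic() + seconds
        x = 1.0001
        iterations = 0
        while time.monotonic() < deadline:
            # A small sqrt-heavy inner loop keeps the FPU busy between clock checks.
            for _ in range(10_000):
                x = math.sqrt(x * x + 1.0)
            iterations += 10_000
        return iterations

    if __name__ == "__main__":
        workers = os.cpu_count() or 1   # one worker per logical core, SMT threads included
        duration = 300.0                # five minutes of sustained load; adjust as needed
        with mp.Pool(processes=workers) as pool:
            results = pool.map(burn, [duration] * workers)
        print(f"{workers} workers finished, {sum(results):,} iterations total")

If the machine survives a run like this but still blacks out the moment a node is clicked in Houdini, that would be consistent with the Core Performance Boost finding in posts 5 and 6: a bursty thread ramp-up rather than steady load seems to be the trigger.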
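And for the 10GbE question in post 2: a rough sequential-throughput check, again assuming Python 3; the UNC path below is a hypothetical placeholder for wherever the Unraid share is mounted, and the 4 GiB total and 8 MiB chunk size are arbitrary. Gigabit tops out around 110-118 MB/s for large sequential transfers, so a result pinned at that ceiling while the array itself is faster points at the network link as the bottleneck.

    import os
    import time

    TEST_PATH = r"\\tower\work\throughput_test.bin"  # hypothetical path to the NAS share
    CHUNK = 8 * 1024 * 1024                          # 8 MiB writes, roughly like big frame sequences
    TOTAL = 4 * 1024 * 1024 * 1024                   # write 4 GiB in total

    buf = os.urandom(CHUNK)                          # incompressible data so caching/compression doesn't flatter the result

    start = time.monotonic()
    written = 0
    with open(TEST_PATH, "wb") as f:
        while written < TOTAL:
            f.write(buf)
            written += CHUNK
        f.flush()
        os.fsync(f.fileno())                         # make sure the data actually left the local write cache
    elapsed = time.monotonic() - start

    print(f"wrote {written / 2**20:.0f} MiB in {elapsed:.1f} s = {written / 2**20 / elapsed:.0f} MiB/s")
    os.remove(TEST_PATH)

Since only the router stays at gigabit, traffic between the workstations and the NAS through a 10GbE switch never touches it; the router only limits internet and upstairs traffic, which is why a switch-and-NICs-only upgrade (or the direct peer-to-peer link mentioned in post 1) is enough.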