
Ibanez343

Member
  • Content Count

    23
  • Joined

  • Last visited

About Ibanez343

  • Title
    Newbie

  1. Noticed a weird issue after folding for a while on my GTX 1080: I'm getting distortion lines across my primary 144 Hz monitor whenever I finish a work unit. It seems to stop after a few minutes of inactivity. I wish I could take a picture, but I'd never be fast enough to catch it. It doesn't seem to affect my other two monitors. Just wondering if anyone else has seen something like this; it really worried me when I first saw it, but once I noticed the pattern I'm not as worried for my monitor. The card never goes above 70C and I have no other issues besides this. Maybe it's something to do with changing power states?
  2. I started folding way back in 2004; I wish I had some pictures of the stuff I strung together back in the day. A member of another forum helped me get a diskless farm going: basically a PXE Linux boot server that let other PCs boot off the network and fold full time. These were the days when hitting 100k total points was a big deal. With all the recent focus on F@H it's brought me great nostalgia for the "good old days" and has had me reading forum posts from a decade and a half ago. I'm up to 3 GPUs folding as many WUs as I'm assigned, and it feels good to be back in the fold. Unfortunately the ASRock B450 Pro4 motherboard in my living room VR rig will not detect anything connected to the second x16 PCIe slot, so I had to get creative. It's janky but it works. I was running that GPU in my computer room on my secondary Unraid server, but it was just too much heat being generated in one room. Anyone else have any jank setups they care to share? Whether you are new to folding or have been at it for many years, I thank you for your contribution! Also, a quick note: put some thought into your folding efforts. My power usage has gone up quite a bit judging by the daily usage graph from my electric company, so I'm strictly folding on GPUs to maximize contribution versus watts used / heat generated. If you have any "borgs" (PCs that you do not own or pay for power on), only fold on them with permission, no matter how "Robin Hood" the practice feels.
  3. I only know of one overclocker-friendly dual-socket board out there, the EVGA SR-2. I've seen one as low as $200, but it sometimes only detected 36GB of RAM with 40GB installed (which in my experience is fairly common on X58 boards). Unless you set an eBay watched search and play a long waiting game, be ready to spend $400. I was watching a recent Gamers Nexus video with Buildzoid, and he said most server boards do not have the overbuilt VRMs that most consumer boards do, so even if you find a way to overclock a non-overclockable server board, I doubt you'd get far. Honestly, with all of the IPC improvements over the past 10 years, your best bet is probably newer hardware; even if you end up with fewer threads you may end up with the same performance. I'll link an example SR-2 listing below https://www.ebay.com/itm/EVGA-270-WS-W555-A2-Classified-SR-2-Dual-LGA1366-Xeon-Intel/303216067451?hash=item469916037b:g:7KYAAOSwQtxdIite
  4. Are you by chance using Task Manager? If so, have you right-clicked the graph and changed it to show logical processors? If that isn't your issue, maybe include some more details or screenshots so we can help figure it out.
  5. I always check eBay when comparing prices of used components (Buy It Now, or look at auctions that have already ended). I recently picked up a 1080 for $330; that's a far cry from $899.
  6. Short answer: yes. Long answer: you'll need a motherboard that can pass through 3 GPUs without IOMMU conflicts. My motherboard only puts two x16 PCIe slots in their own IOMMU groups; I use one for a GPU and the other for a USB 3 controller card. All of the 1x slots and the third x16 slot share groups with other devices. Even if you have three x16 slots available for GPU passthrough, you will need additional slots for USB 3 cards if you want USB hot-plug for the dedicated VMs. If hot-plug is not a necessity, you can manually assign individual keyboards and mice (different brands, to avoid conflicts) to each VM. Just noticed you posted a link to the config: some X370 boards (as well as many other chipsets) lose GPU passthrough compatibility with the more recent AGESA BIOS updates. I had to downgrade my BIOS to get it to work, and it's an old enough version that it doesn't support the Ryzen 3000 series. Luckily I have a Ryzen 2700 in my Unraid server.
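For anyone who wants to check their own board before committing, the kernel exposes the grouping under /sys/kernel/iommu_groups on any Linux host. A quick sketch (standard sysfs layout, nothing Unraid-specific; run it on the host, not inside a VM):

```python
#!/usr/bin/env python3
# Print each IOMMU group and the PCI addresses inside it.
# Everything in one group must be passed through to the same VM together.
from pathlib import Path

def iommu_groups(root="/sys/kernel/iommu_groups"):
    """Map group number -> sorted PCI addresses; empty if IOMMU is off/missing."""
    groups = {}
    for g in Path(root).glob("[0-9]*"):
        groups[g.name] = sorted(d.name for d in (g / "devices").iterdir())
    return groups

if __name__ == "__main__":
    groups = iommu_groups()
    if not groups:
        print("No IOMMU groups found - enable VT-d/AMD-Vi in the BIOS first")
    for num in sorted(groups, key=int):
        print(f"group {num}: {' '.join(groups[num])}")
```

If a GPU shares a group with other devices, that slot isn't usable for passthrough without ACS workarounds.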
  7. The most important thing is to make sure it will actually work before investing a lot of money and time into it. I only have experience with X370 (AMD) and X58 (Intel) boards so far. I didn't get GPU passthrough working at all on my X370 board until I found a solution: downgrading my BIOS to an older AGESA version. Many boards are affected by this, not just older ones like mine. Even after that I was only able to get two x16 ports passed through; I haven't been able to isolate the 1x slots and the third x16 slot into their own IOMMU groups. I've tried a few methods to separate them, but no luck so far. This makes it difficult to have USB hot-plug on two VMs at the same time. Right now I have an Nvidia NVS 300 in the top 1x PCIe slot for Unraid, a GPU for my VM in the top x16 slot, and a PCIe USB 3 controller in the other x16 slot. My options basically end there, since I am using the onboard USB 3 controller for Unraid (it is at least in its own IOMMU group). TLDR: the specs aren't as critical as finding the right motherboard / CPU / BIOS combination for your use case. Maybe find someone online who built a dual gaming setup and mimic it, and read about their experience and the pitfalls they encountered.
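One more trick for anyone going down this road: the classic way to reserve a guest GPU on Unraid (I believe newer releases can also bind devices from the web UI) is stubbing it to vfio-pci on the kernel command line in syslinux.cfg. A sketch; the 10de:... vendor:device IDs below are examples for a GTX 1080 and its HDMI audio function, so substitute the output of lspci -nn for your own card:

```
label Unraid OS
  menu default
  kernel /bzimage
  append vfio-pci.ids=10de:1b80,10de:10f0 iommu=pt initrd=/bzroot
```

Note that binding by ID affects every card with that same ID, so it won't work if you have two identical GPUs and want to keep one for the host.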
  8. I imagine he means "shucking" the drives: a lot of external drives are actually white-label NAS drives and are often cheaper or on deep discount, so people buy the external drives and disassemble them to pull the drive out. Some drives disable themselves if they sense 3.3V on a certain power pin, but you can put tape over the pins or use an adapter that doesn't supply 3.3V. As for an HBA to use with Unraid, you'll want one flashed to IT mode so the drives pass straight through to the OS. If you want something configured and tested already, you can search for "HBA unraid" or "HBA IT mode" on eBay. "The Art of Server" not only sells these cards on eBay but has a YouTube channel explaining his testing methodology and comparing features of various HBA controllers.
  9. I figured it out late last night: I was passing through the audio device for the GPU as well, like one would do in Unraid. Working fine now. I always figured there was more to that explanation. I guess everyone should say "Nvidia wants you to buy a vGPU-supported Quadro".
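In case anyone hits the same wall with plain libvirt/QEMU: the GPU and its HDMI audio are separate PCI functions, and each needs its own hostdev entry in the domain XML. A sketch assuming the card sits at 01:00.0 with audio at 01:00.1; your bus addresses will differ, so check lspci:

```xml
<!-- video function of the GPU -->
<hostdev mode='subsystem' type='pci' managed='yes'>
  <source>
    <address domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
  </source>
</hostdev>
<!-- HDMI audio function on the same card -->
<hostdev mode='subsystem' type='pci' managed='yes'>
  <source>
    <address domain='0x0000' bus='0x01' slot='0x00' function='0x1'/>
  </source>
</hostdev>
```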
  10. He is running a Quadro according to the screenshot. I purchased several older, lower-end Quadros for playing with Unraid, and the only time I had an issue was when I used two of the same type: I tried to use two Quadro NVS 290s (technically one was a 1x interface while the other was 16x), one for Unraid and one for a VM, and got a Code 43 error. If I use an NVS 295 or Quadro 410 for the GPU passthrough I don't have this issue. Not sure if this is applicable at all, but it's something I encountered. I'm just now starting to play around with Proxmox, since I don't want to buy a second Unraid license just for testing, but I can't even get my VM to start. Best of luck!
  11. Hmmm, I also saw someone saying their BIOS never showed the NVMe drive but the OS did. I'm assuming your OS doesn't see it either?
  12. Did some digging to see if I could find anything. So far I see someone who said theirs only worked after updating the BIOS, which was exactly what I was going to recommend anyway (note that it looks like it will operate in 2x mode rather than 4x). There can be some "gotchas" with BIOS updates: for instance, on my ASRock Taichi X370 I had to use a guide to downgrade the BIOS so that I could use GPU passthrough with Unraid. The latest BIOS I could use had AGESA 1.0.0.6; once it is updated to 0.0.7.2 (not a typo, it's a newer version despite the numbering) GPU passthrough no longer works. Linking the forum post that said a BIOS update fixed it.
  13. What software are you using? Every hypervisor I've used passes the entire GPU through to a single VM; it can't be shared across multiple VMs. So you could have as many monitors as the card and the VM's OS can support, but only for one VM.
  14. Unraid also has Docker support. A container is sort of like a VM, but it only loads the resources it needs instead of running an entire operating system. There are Docker containers for Plex as well as a Minecraft server, but I don't have any experience with those yet.
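As a taste of what those look like outside Unraid's templates, here's a minimal compose-file sketch for the two containers mentioned above. The image names (plexinc/pms-docker, itzg/minecraft-server) are the popular vendor/community images, and the /mnt/user paths are standard Unraid shares; adjust both to your setup:

```yaml
services:
  plex:
    image: plexinc/pms-docker
    network_mode: host          # Plex's local discovery works best on the host network
    volumes:
      - /mnt/user/appdata/plex:/config
      - /mnt/user/media:/data   # your media share
  minecraft:
    image: itzg/minecraft-server
    ports:
      - "25565:25565"           # default Minecraft port
    environment:
      EULA: "TRUE"              # this image refuses to start until you accept the EULA
    volumes:
      - /mnt/user/appdata/minecraft:/data
```

On Unraid itself you'd normally install these through Community Applications instead of compose, but the knobs are the same.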
  15. I know Unraid catches some flack for being paid software, but for me it was WELL WORTH the money. It's a one-time fee based on the number of storage devices, with three tiers that can be upgraded later. Give the free trial a go on any PC (preferably one that supports virtualization, with it enabled in the BIOS) and see what you think. You'll spend more time working on your VMs and less time troubleshooting the host operating system. It automatically sets up a VNC connection that you can use through the web interface, and you don't have to wait for the VM to fully start before you can connect. It has many plugins and tons of GOOD videos online showing how to set them up. I use the Unassigned Devices plugin to dedicate SSDs to VMs and a Western Digital surveillance drive to my NVR VM, rather than having all of that written to the parity-protected array. Do your research when it comes to hardware if you plan on passing through GPUs to your VMs: I had to downgrade the BIOS on my X370 Taichi board to make it work, and it's too old a version to support 3000-series Ryzen. Server-class equipment "shouldn't" have this issue, but I always stick to consumer-grade hardware personally; I don't have the space for server equipment, not to mention the noise. SpaceInvader One on YouTube has tons of videos on Unraid and its plugins and features; he even helped me downgrade my BIOS with a video tutorial, since the manufacturer doesn't support it. I can post more info on my setup later when I get home from work. It's been a lot of fun and has been rock solid for me.