
Multiple gaming machines on one system

Based on a recent LinusTechTips video, I'd like to look into building a single system for my other LAN gaming machines.

 

Pros/Cons

Things that are attractive about this setup:

  • One physical machine, case, etc.
  • One place to upgrade and maintain all hardware.
  • Easier to swap around hardware between machines.
  • CPU and memory will most likely not be a bottleneck.
  • One place to turn on all machines.
  • One place to maintain the OSes.
  • Simple re-imaging if necessary.
  • Nearly unlimited drive space by pooling drives into clusters rather than using smaller individual drives per machine.
  • Potential to save on drive space on games using Unraid and a shared network Steam drive (just guessing).
  • Don't have to pay for separate motherboards, CPUs, RAM sticks, PSUs, and heatsinks for each machine.

Possible problems:

  • Complete loss of system portability if I want to bring this machine somewhere.
  • If there's an issue with this one rig, all other machines are outta commission.
  • Audio has to go through the graphics card's HDMI or DisplayPort (not a big deal since that's how I have it set up already).
  • Non-standard setup having a single machine running 5-7 others (non-standard in 2019 at least).
  • Difficult to find a case with room for this many 2-slot graphics cards.
  • Might need to buy a new network switch with 10-gigabit Ethernet support, or an extra multi-port NIC.
  • Limited motherboard selection.
  • Potential for expensive upgrades since I won't be able to do them in smaller increments when consumer hardware goes on sale, but also a potential for super-cheap upgrades if server hardware goes on sale.
  • I just upgraded my main rig a few months ago and wouldn't want to waste that money.

My Setup

I already have 7 gaming machines at home, but 5 of them are on aging hardware, to the degree that the CPU and motherboard are starting to limit gaming performance. Upgrading the CPU either means grabbing the highest-end used hardware of the 2013-2014 Intel and AMD era, or upgrading the mobo, CPU, and RAM if I wanna move to Ryzen.


Of the 7 machines, one is my main rig sporting a Ryzen 7 3800X. Most likely, I wouldn't need to virtualize this one, as a Threadripper 3000 won't match it in gaming performance. Another machine is on an Intel i7 4770K and acts as my Big Picture machine. This would be the only machine not physically in the same room, but I might be able to work around that with an active USB 3.0 extender since the home theater is right next door.

 

The other machines are on pretty lackluster CPUs mainly because of cost and also because it's cheaper to cool them if I don't have to buy beefy heatsinks for each machine.

 

Reasons for a Single System

The main reason I want to move to a single-system design is that I want less maintenance trying to keep these systems up-to-date and in a working state. Little things can go wrong in one system that aren't happening in the others, and I'm never sure what's wrong.

 

One machine, for instance, worked fine for a while and then, all of a sudden, started completely powering off whenever I play a specific game. If all these machines are running on the exact same virtualized modern hardware and the same SSD clusters, there's less of a chance of problems, and if one does come up, all machines are affected, not just machine X with a specific set of hardware that the other machines don't share.

 

Hardware

I was thinking of simply upgrading all these machines to Ryzen, but the cost to do so would be about the same as building a single system. Linus went with Epyc, but the core clocks are a lot lower, with fewer threads, compared to Threadripper. Since I'm going to be using these other machines for gaming, Threadripper seems like the right option.

 

PCI Express 4.0 lanes

While Epyc has 128 PCIe Gen-4 lanes, Threadripper has 72. All my graphics cards are PCIe Gen-3 (1080 Ti or lower). From what I've read, only the 2080 Ti will start using a bit more than x4 lanes' worth of Gen-4 bandwidth. That means even with just 72 Gen-4 lanes available, I have the bandwidth of nine full Gen-3 x16 slots. Plenty for 7 gaming rigs, if I even put that many on this one machine.
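
To sanity-check that lane math, here's a quick bandwidth-budget sketch (the per-lane throughput figures are rounded assumptions, not exact spec numbers):

```python
# Rough PCIe bandwidth budget for the single-box build.
# Assumption: ~1 GB/s usable per Gen-3 lane and ~2 GB/s per Gen-4 lane
# (rounded; ignores encoding/protocol overhead).

GEN3_GBPS_PER_LANE = 1.0
GEN4_GBPS_PER_LANE = 2.0

threadripper_lanes = 72                                    # Gen-4 lanes
total_bandwidth = threadripper_lanes * GEN4_GBPS_PER_LANE  # ~144 GB/s

# Every GPU here is Gen-3, so a full x16 link tops out at:
per_gpu_gen3_x16 = 16 * GEN3_GBPS_PER_LANE                 # ~16 GB/s

print(total_bandwidth / per_gpu_gen3_x16)  # -> 9.0 "Gen-3 x16 equivalents"
# Caveat: this is bandwidth-only math; physical slot and lane routing
# on the motherboard still limit how many cards actually fit.
```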

 

Motherboard Limitations

I dunno where Linus found a motherboard with 7 PCI Express slots for Socket sTRX4, but even if I had only 4, I was thinking there'd be some riser out there that'd let me split each of them into 2. Since each Gen-4 x16 slot technically has the bandwidth of two Gen-3 x16 slots, a splitter should work fine, right?
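
One caveat I should flag on my own reasoning: as far as I understand it, a bifurcation riser splits the slot electrically into two x8 links, and a PCIe link negotiates down to the slower side's generation. So a Gen-3 card in a bifurcated Gen-4 slot should run at Gen-3 x8, not a full Gen-3 x16. A quick back-of-envelope using the same assumed per-lane numbers as above:

```python
# Negotiated bandwidth after bifurcating a Gen-4 x16 slot into x8 + x8.
# A link runs at min(slot generation, device generation), so a Gen-3 GPU
# in a Gen-4 x8 electrical slot negotiates Gen-3 x8.

GBPS_PER_LANE = {3: 1.0, 4: 2.0}  # rounded, assumed usable GB/s per lane

def link_bandwidth(slot_gen: int, device_gen: int, lanes: int) -> float:
    return GBPS_PER_LANE[min(slot_gen, device_gen)] * lanes

print(link_bandwidth(4, 3, 8))   # bifurcated riser: ~8 GB/s per GPU
print(link_bandwidth(3, 3, 16))  # full Gen-3 x16:  ~16 GB/s per GPU
```

Half the bandwidth per card, in other words, though Gen-3 x8 is probably still fine for 1080 Ti-class GPUs.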

 

System Specs

Compared to Epyc, Threadripper seems a lot faster. The fact that it's also a consumer processor makes it easier to purchase and potentially more likely to be available aftermarket when I upgrade in the future.

 

If I went with 6 machines (excluding my main rig), I'd only need 4 physical cores per VM for gaming (pretty sure that's all most games use). The 24-core Threadripper 3960X looks like the right processor, although I think you'd need some cores left over for the host machine too, which means the 32-core Threadripper 3970X is probably the lowest I could go; that is, unless I leave the Big Picture machine on its own hardware. That would save me $500 on the CPU alone.
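
To make that core budget concrete (the 4-cores-per-VM figure and the host reserve are my guesses, not measured requirements):

```python
# Core-count budget for the single-box build.
# Assumptions: 4 physical cores per gaming VM, plus a small reserve
# for the Unraid host (I/O threads, emulator overhead, etc.).

CORES_PER_VM = 4
HOST_RESERVE = 2  # a guess; some builds reserve a whole CCX instead

def cores_needed(num_vms: int) -> int:
    return num_vms * CORES_PER_VM + HOST_RESERVE

print(cores_needed(6))  # all six machines: 26 cores -> 32-core 3970X
print(cores_needed(5))  # Big Picture stays separate: 22 -> 24-core 3960X
```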

 

Moving my main rig

I was also thinking of moving my main rig (AMD Ryzen 7 3800X with 32GB of RAM) to this system, but that seems like quite a downgrade, just because of the need to divide resources among the other PCs on the same box.

 

Thing is, I'm pretty sure I could dynamically add and remove hardware as necessary; Windows doesn't throw a fit like it used to. If Unraid supports multiple profiles, that'd be even better, because I could use all cores and RAM when I'm working or gaming alone and move to a split-resources profile when I wanna load up the other gaming machines. Since the Big Picture machine is most likely not going to be used in that configuration, this setup might actually work.
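
Unraid's VMs are libvirt/QEMU under the hood, so even if there's no built-in "profiles" feature, I'm guessing something like this could approximate one (the VM names are placeholders, and live vCPU unplug needs guest cooperation and is reportedly flaky on Windows, so treat it as a sketch, not a proven recipe):

```python
# Sketch: shrink the main rig's vCPU count, then boot the LAN VMs.
# Uses the libvirt Python bindings; VM names here are made up.
# Note: removing vCPUs from a live guest requires hotplug support in
# the guest OS and can misbehave on Windows (untested assumption).

import libvirt

def set_live_vcpus(conn: libvirt.virConnect, name: str, count: int) -> None:
    """Change a running domain's active vCPU count (within its maximum)."""
    dom = conn.lookupByName(name)
    dom.setVcpusFlags(count, libvirt.VIR_DOMAIN_AFFECT_LIVE)

conn = libvirt.open("qemu:///system")

set_live_vcpus(conn, "MainRig", 8)       # "LAN party" profile: free up cores
for vm_name in ("LANBox1", "LANBox2"):
    conn.lookupByName(vm_name).create()  # boot the guest if it's defined

conn.close()
```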

 

Why I'm Posting

I wanna know if this is going to work. Linus has "proven" it in his videos, but that's not really enough for me since they never seem like real scenarios except the one in his house.

 

If you've done something like this or know someone who has, I'd love to know about your experience.

 

If you know of any motherboards that'd fit my scenario, I'd love to know about those too.

 

If you know of anything else I'm missing, that'd be a great help. It's possible I'm not considering everything necessary to virtualize multiple gaming rigs.

 

Lastly, if you know whether Unraid supports virtualization profiles (machine X gets 24 cores, but only 16 when machine Y starts up), that'd be really helpful.

 

I've been trying to decide if this is even worth it, and I also wanna know how much it'd cost if I were to price it out. I already have multiple 1kW PSUs and the necessary GPUs, so I'd need the mobo, CPU, heatsink and fan, RAM, and a case.

 

Also, I'd like to know whether I should move my main rig into this system as well or leave the system to the 6 other gaming rigs, since I thought a 3800X was faster than Threadripper in gaming performance.

 

 


Linus proved that it would work, after months of pulling his hair out and with the assistance of some of the finest staff in all of Canada.

 

One system is certainly a lot smaller and more compact, but multiple smaller systems are the way to go in your situation, IMHO.

I do have experience trying to get cables routed from my PC to the other room so that I can just connect a couple of cables and do couch gaming over there, and the cables will be the worst part of this setup. Almost everything breaks down after 15 feet. Just check Linus's latest video on his home server: a couple grand on just a fibre USB extension.

I wish you the best of luck, and I do not envy you.

CPU: Intel Core i7-8086K Case: CORSAIR Crystal 570X RGB CPU Cooler: Corsair Hydro Series H150i PRO RGB Storage: Samsung 980 Pro - 2TB NVMe SSD PSU: EVGA 1000 GQ, 80+ GOLD 1000W, Semi Modular GPU: MSI Radeon RX 580 GAMING X 8G RAM: Corsair Dominator Platinum 64GB (4 x 16GB) DDR4 3200MHz Motherboard: Asus ROG STRIX Z370-E Gaming


https://www.pccasegear.com/products/48680?gclid=CjwKCAiA9JbwBRAAEiwAnWa4Q3zY_o8OEJpOkjJt_g0ZZRsOr4kAEANoRssy4-uXs3XrO9mebn0lwBoCtq8QAvD_BwE

This might solve some of your problems on the motherboard front.

A few things, though: are you using DAS- or NAS-based software? If so, I would recommend Gluster to pool all your space into a single mountable volume that all of your machines can access through the network. Or not, whatever works.

Are you using individual sound cards? I have heard multiple rumours that sound cards are very important for VMs because they decrease background CPU usage and reduce the momentary idle-clock and CPU-usage spikes you might encounter trying to run too many audio channels from one CPU and its corresponding chipset. Not 100% sure, but it does make sense.

Also, unRAID has full support for nearly every virtualization platform I know of, so it should register each individual machine when it starts up (when machine Y starts up, for example). Problems with core allocation, and potentially having to reallocate cores to each VM profile more frequently than one should, are more likely on dual-socket machines, like the 1 CPU 8 Editors setup, where NUMA comes into play.
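
If NUMA does come into play, the usual mitigation (as I understand it) is pinning each VM's vCPUs to cores on a single node so a guest never straddles sockets. A rough sketch with the libvirt Python bindings, with the VM name and core numbering assumed for illustration:

```python
# Sketch: pin a VM's vCPUs to host cores on one NUMA node so the guest's
# memory accesses stay node-local. Core numbering is assumed; check
# `lscpu` or `numactl --hardware` on the host for the real topology.

import libvirt

HOST_CPUS = 64               # total logical CPUs on the host (assumed)
NODE0_CORES = range(0, 8)    # logical CPUs on NUMA node 0 (assumed)

conn = libvirt.open("qemu:///system")
dom = conn.lookupByName("LANBox1")   # placeholder VM name

for vcpu, host_cpu in enumerate(NODE0_CORES):
    # cpumap has one boolean per host logical CPU
    cpumap = tuple(i == host_cpu for i in range(HOST_CPUS))
    dom.pinVcpu(vcpu, cpumap)

conn.close()
```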

 

https://www.corsair.com/ww/en/Categories/Products/Cases/Obsidian-Series™-900D-Super-Tower-Case/p/CC-9011022-WW#tab-overview

This may also be an appropriate case, as it has dual-PSU support, which you will clearly need. It should support the motherboard as well.

 



On the topic of USB extensions, all these computers would be in the same room as the server, so it wouldn't be a big deal. For the one PC in another room, I could always keep that on separate hardware, but I'm already using a 15' active USB extension and fiber DisplayPort cables for VR on my main rig, and those are working fine.

 

In terms of the case, that one does have more slots for graphics cards, but it looks like only enough for five 2-slot cards. I'd also have to use a riser for the others.

 

That motherboard, like the others I've seen, only has four PCIe x16 slots. I'm looking for one with 6-7, or some kind of riser that takes one x16 slot and splits it into 2; even at Gen-3 x8 per card after splitting, that should be enough bandwidth for these GPUs. Linus somehow had a mobo with the same socket that had at least seven x16 slots. Dunno how or where he found that.

