
Threadripper Hypervisor build - pfSense, FreeNAS, and Windows Gaming VM. Remote gaming anyone?

2bitmarksman

So I’m at a point where I need to build a new computer, and I know I'll soon be moving to an area where space, power, and especially noise are at a premium. So instead of having three separate boxes for my pfSense router, a FreeNAS box for HTPC duty and my Steam library, and a gaming PC, I’d like to build a single machine running ESXi with all of them as VMs. Below is the parts list. Note that I already have a lot of the parts (the SSDs, RAM, PSU, some of the NAS drives, etc.); they're listed to show what it will look like when done:

 

https://pcpartpicker.com/user/2bitmarksman/saved/hx49Jx

 

ESXi would live on one of the two 250GB SSDs. I know this is massive overkill, but I have about four 250GB drives I don’t know what to do with at the moment, so there's no sense buying a USB drive and taking up a valuable USB slot. The other 250GB SSD would be a VMFS datastore (I’ll probably do a RAID 1 of the 250GB SSDs, honestly) for pfSense, FreeNAS, and possibly an Ubuntu/Docker VM later on. The 1TB NVMe drive I already have as well, and it would be passed through via RDM to the Windows VM.
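For anyone following along, creating a physical-compatibility RDM for the NVMe drive from the ESXi shell should look roughly like this; treat it as a sketch, and note the device name and datastore path below are placeholders for whatever yours actually are:

ls /vmfs/devices/disks/
# note the NVMe device name, something like t10.NVMe____Samsung_SSD_...
vmkfstools -z /vmfs/devices/disks/t10.NVMe____YOUR_DEVICE_ID /vmfs/volumes/datastore1/win10/nvme-rdm.vmdk

The resulting nvme-rdm.vmdk then gets attached to the Windows VM as an existing disk. (Using -r instead of -z would make it a virtual-compatibility RDM instead, if that ends up playing nicer.)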

 

pfSense would get 4 threads, 4GB of RAM, and a 32GB VMDK, with the 1000 GT/VT NIC passed through to it; those would be the WAN/LAN ports. I’ll most likely also have a third, host-internal port group for the VMs to communicate over, as the Windows VM will be using virtual networking (see the sketch below).
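A minimal sketch of that internal port group from the ESXi shell would be something like the following (the vSwitch and port group names are just placeholders I made up):

esxcli network vswitch standard add --vswitch-name=vSwitch-Internal
esxcli network vswitch standard portgroup add --portgroup-name=VM-LAN --vswitch-name=vSwitch-Internal

With no physical uplink attached to that vSwitch, anything on VM-LAN (pfSense's virtual LAN NIC, the Windows VM, FreeNAS, etc.) talks purely in host memory and never touches a physical port.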

 

FreeNAS would get 8 threads, 32GB of RAM, and a 32GB VMDK. A PERC H200 flashed to IT mode would be passed through, along with the 10G NIC port, to allow unfettered access to the 8 NAS drives. This would store all my Plex data and serve as mass storage for important data and Steam games. I may do some testing and benchmarking to see how well games run from it compared to local storage; since VM-to-VM traffic never actually hits the wire, the network overhead should be minimal. The 10G NIC is to let it connect to another homelab server and act as a datastore for it in the future; not 100% sure about that yet. Plex and all its companion apps (Jackett, Sonarr, Radarr, Ombi) would be loaded as well, though I may have a separate Linux VM handle those. Unsure how well FreeBSD handles them until I research them further.
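For reference, getting the H200 ready for passthrough is mostly point-and-click, but finding it from the ESXi shell looks roughly like this (the PCI address in the comment is made up):

esxcli hardware pci list | grep -i -B 12 "LSI"
# note the card's address from the record header, e.g. 0000:41:00.0

Then it gets toggled for passthrough under Host > Manage > Hardware > PCI Devices in the host client, the host rebooted, and the device added to the FreeNAS VM as a PCI device. The H200 in IT mode should show up as an LSI SAS2008 controller, if memory serves.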

 

The Windows gaming VM would get 12 threads and 16GB of RAM. The 1TB NVMe drive and the 1080 Ti would be passed through, along with 4 of the USB ports on the back and ideally at least the front-panel USB ports (unsure how those show up for passthrough), to allow for USB hotplug. Even if I end up needing a USB card, I have a 1x PCIe slot and a 16x PCIe slot free just in case, so no biggie if things don't go 100% as planned on that front. Note that hypervisor.cpuid.v0=FALSE needs to be set in the VM's advanced options for the Nvidia card to work, as GeForce cards throw a Code 43 error if the driver detects it is running in a virtualized environment (gotta have that product separation from the Quadros).
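The relevant .vmx additions would look roughly like this. The first line is the Code 43 workaround mentioned above; the two MMIO lines come from VMware's GPU passthrough guidance and may or may not be strictly required for an 11GB card, so treat them as a starting point rather than gospel:

hypervisor.cpuid.v0 = "FALSE"
pciPassthru.use64bitMMIO = "TRUE"
pciPassthru.64bitMMIOSizeGB = "64"

These can be added either through the web UI (VM > Edit Settings > VM Options > Advanced > Configuration Parameters) or by editing the .vmx file directly while the VM is powered off.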

 

This also means that any other computer on the network that can run VMware Workstation Pro can connect to the ESXi host and game on the Windows gaming VM with next to no added latency (0.1-0.2ms max on most LANs)! That would let you build this in a rackmount server chassis, such as the Rosewill RSV-R4000 or RSV-R4412, stick it in a closet/basement/rack, and run a minimalist desktop on something extremely light, such as an Intel NUC. You can then optimize the rackmounted PC for cooling and performance and keep the office free of fan noise from a power-hungry PC. Note, however, that this requires purchasing VMware Workstation Pro, which is ~$220-250 USD on VMware's website, or about $35 USD or less from other sources (however, you WILL NOT be able to contact VMware about any problems if you use those cheaper 3rd-party options).

 

EDIT: Unsure how this will work with a physical GPU passed through, i.e. whether the remote session will actually show its video feed. It should still be possible to make it work, though, either by using Looking Glass to copy what the Nvidia GPU is rendering onto the virtual GPU you're remoted into, or by using NoMachine to remote into it over the LAN with minimal overhead.


Will this be a build log or are you looking for advice on your build? 

 

I for one am highly interested and think the idea of accessing the same VM from all kinds of machines is very nice. After reading this I'm actually considering something like this myself, but just the VM over the network and not the NAS part, as I feel like you'd be wasting quite a lot of energy powering those 12 cores even when they're not in use. Threadripper really eats a lot of power.

Gaming HTPC:

R5 5600X - Cryorig C7 - Asus ROG B350-i - EVGA RTX2060KO - 16gb G.Skill Ripjaws V 3333mhz - Corsair SF450 - 500gb 960 EVO - LianLi TU100B


Desktop PC:
R9 3900X - Peerless Assassin 120 SE - Asus Prime X570 Pro - Powercolor 7900XT - 32gb LPX 3200mhz - Corsair SF750 Platinum - 1TB WD SN850X - CoolerMaster NR200 White - Gigabyte M27Q-SA - Corsair K70 Rapidfire - Logitech MX518 Legendary - HyperXCloud Alpha wireless


Boss-NAS [Build Log]:
R5 2400G - Noctua NH-D14 - Asus Prime X370-Pro - 16gb G.Skill Aegis 3000mhz - Seasonic Focus Platinum 550W - Fractal Design R5 - 
250gb 970 Evo (OS) - 2x500gb 860 Evo (Raid0) - 6x4TB WD Red (RaidZ2)

Synology-NAS:
DS920+
2x4TB Ironwolf - 1x18TB Seagate Exos X20

 

Audio Gear:

Hifiman HE-400i - Kennerton Magister - Beyerdynamic DT880 250Ohm - AKG K7XX - Fostex TH-X00 - O2 Amp/DAC Combo - 
Klipsch RP280F - Klipsch RP160M - Klipsch RP440C - Yamaha RX-V479

 

Reviews and Stuff:

GTX 780 DCU2 // 8600GTS // Hifiman HE-400i // Kennerton Magister
Folding all the Proteins! // Boincerino

Useful Links:
Do you need an AMP/DAC? // Recommended Audio Gear // PSU Tier List 


@FloRolf it will only eat power while it's under heavy load, so I'm not really bothered about how it idles. I will be building this when I get back to the US, so in about a week and a half, assuming deliveries are made on time.

Will be sure to post back build pics and results :)


  • 3 weeks later...
On 10/28/2018 at 9:06 PM, 2bitmarksman said:

@FloRolf it will only eat power while it's under heavy load, so I'm not really bothered about how it idles. I will be building this when I get back to the US, so in about a week and a half, assuming deliveries are made on time.

Will be sure to post back build pics and results :)

Do you have any news yet? :)


