OuterSpaceCitizen

Video / VM Server PCI Lanes


Posted · Original Poster

Hey guys!

 

I'm looking to build an unRAID server at work to host multiple VMs:

 

VM 1

  • USB 3.0 card passthrough for keyboard / mouse (1 lane?)
  • Nvidia GTX 1080 (16 lanes)
  • DeckLink 8K Pro (8 lanes)

This one will actually be very graphics-intensive, running the GTX 1080 with real-time video input / transform / output software.

The DeckLink will handle output from said software.

 

VM 2

  • USB 3.0 card passthrough for keyboard / mouse (1 lane?)
  • Nvidia GTX 1060, or another card capable of 3 hardware outputs (16 lanes)

 

VM 3

  • USB 3.0 card passthrough for keyboard / mouse (1 lane?)
  • GT 710 for video out (8 lanes)
  • Sound card for output (1 lane?)

 

VM 4

  • USB 3.0 card passthrough for keyboard / mouse (1 lane?)
  • GT 710 for video out (8 lanes)

 

 

I'm afraid I'll run out of CPU PCIe lanes.

The motherboard would be a Supermicro X11SPi-TF with a Xeon Gold 5118 CPU.

This should provide 48 PCIe lanes.

 

My tally says this hardware would need about 61 lanes.
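
Here's a quick tally to sanity-check that number (just a sketch; the per-card lane counts are my own guesses from the list above, and cards may negotiate fewer lanes in practice):

```python
# Rough PCIe lane tally; per-card lane counts are assumptions from the list above.
vms = {
    "VM 1": {"USB3 card": 1, "GTX 1080": 16, "DeckLink 8K Pro": 8},
    "VM 2": {"USB3 card": 1, "GTX 1060": 16},
    "VM 3": {"USB3 card": 1, "GT 710": 8, "Sound card": 1},
    "VM 4": {"USB3 card": 1, "GT 710": 8},
}
CPU_LANES = 48  # Xeon Gold 5118 exposes 48 PCIe 3.0 lanes

total = 0
for vm, cards in vms.items():
    lanes = sum(cards.values())
    total += lanes
    print(f"{vm}: {lanes} lanes")
print(f"Total: {total} lanes needed vs. {CPU_LANES} available")
# VM 1: 25, VM 2: 17, VM 3: 10, VM 4: 9 -> Total: 61 vs. 48
```

So if every card gets its full electrical width, I'm short 61 - 48 = 13 lanes.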

Would I need to go for a dual-socket board with two CPUs to handle this hardware?

Am I missing something here? Is there any other way to do this and keep it under 48 lanes?

 

Thanks for reading this far, and thanks for the help! :)

 

Supermicro motherboard link: https://www.supermicro.com/products/motherboard/Xeon/C620/X11SPi-TF.cfm

 

Posted · Original Poster

Wondering if I should go with the X11DPH-T motherboard - https://www.supermicro.com/products/motherboard/Xeon/C620/X11DPH-T.cfm - and dual Intel Xeon Silver 4114 processors...
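
If I'm reading Intel's specs right, each Skylake-SP Xeon provides 48 PCIe 3.0 lanes of its own, so a dual-socket board like the X11DPH-T should give 2 × 48 = 96 CPU lanes in total, which would comfortably cover the ~61 tallied above (assuming the board's slot layout actually wires enough of them out, and keeping in mind that each slot hangs off a specific socket, which matters when pinning VMs to CPUs).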


If this is for work, I'd suggest scrapping the entire idea and building three separate workstations. Having that big of a single point of failure could be catastrophic. Furthermore, it's going to complicate the unRAID setup a lot.

 

It might not be "cool" to just build 3 workstations, but it'll be a lot more reliable - and might even end up being cheaper, depending on the specific needs of each user.



13 hours ago, OuterSpaceCitizen said:

Thanks for the input.

We've decided to split it all into separate rackmount units in order to decrease complexity.

Why are you planning to move your workstations to a rack?

Posted · Original Poster
On 3/8/2019 at 9:29 PM, Acedia said:

Why are you planning to move your workstations to a rack?

These are for a studio control room, so we'll place the computers in the rack room just next door in order to reduce noise and heat.

1 hour ago, OuterSpaceCitizen said:

These are for a studio control room, so we'll place the computers in the rack room just next door in order to reduce noise and heat.

You should consider going with a terminal server/Citrix solution and a thin client, or a silent Zotac client.

 

IIRC that's also what Linus did in his last multi-gaming build.

