SyscomAU

Member
  • Posts

    2

SyscomAU's Achievements

  1. 192 Gamers, 1 CPU

     Thought this might make for an interesting video to follow up to your 8 Gamers 1 CPU.

     Premise: AMD MxGPU enables a single GPU to be partitioned into several virtual ones. GPUs that support this include the S7150, S7150x2 and S7100x. It relies on the "SR-IOV" and "Above 4G" options in the BIOS (no licensing needed). Using a supported hypervisor (ESXi, KVM) you can use the AMD MxGPU driver to partition the GPU for passthrough to many VMs.

     Option 1: Use an S7150 or S7150x2. This has been seen on L1Techs, nothing too crazy.

     Option 2: Holy Sh%t Balls. Use 3 x AMD S7100x GPUs per HP PCIe add-in card from a Gen8-series blade server. The S7100x is an MXM 3.0b GPU (i.e. laptop GPU form factor) designed for blade clusters and high density. HP makes an interesting PCIe add-in card that lets 3 x MXM 3.0b cards share a single x16 slot via an onboard PCIe switch (HP 716553-001 MXM PCIe Gen8). This means you could have up to 3 x S7100x GPUs per add-in card, and up to 8 add-in cards on a motherboard like the Asus Z10PE-D8-WS (onboard ASPEED VGA for the hypervisor). That gives a maximum of 192 partitioned GPUs, each with 1GB of vRAM. Given that each VM will need a few CPU cores for gaming, I would probably partition each S7100x into 2 GPUs, each with 4GB of vRAM. Using dual E5-2697 v4s you can assign 3 vCPUs (2 physical and 1 HT) to each VM. This gives you 48 VMs, each with 3 cores and 4GB of vRAM.

     Budget: Each S7100x is ~$150 USD, the HP card is ~$300 USD. Total is ~$6,000 USD.

     Tips: Use the power cables from the old HP Gen8 blades that have 3 x Tesla M2070Q. The Tesla GPUs are useless nowadays, but the cables will adapt from EPS 12V to the HP add-in card, as well as from an 8-pin PCIe to EPS 12V for the HP add-in card. Otherwise you will need to make your own power cables. I've been using a PCIe ribbon cable as my case doesn't have enough room for the HP add-in card to fit; it hasn't caused any issues yet.

     I've had a play with it on my 2 x AMD S7100x GPUs using the HP add-in card and the 16 VMs work pretty well. (A rough sketch of the SR-IOV setup and a quick check of the maths follow below.)
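If anyone wants to script the VF setup on a KVM host, below is a rough Python sketch. The PCI address is a placeholder, and it assumes the GPU's host driver exposes the standard Linux sriov_totalvfs / sriov_numvfs sysfs files; AMD's GIM module may manage the VF count itself, so treat this as a generic SR-IOV outline rather than the exact MxGPU procedure. It needs root.

```python
#!/usr/bin/env python3
"""Rough sketch: enumerate an SR-IOV capable GPU on a Linux/KVM host,
request a number of virtual functions, and list the resulting VFs so
they can be passed through to VMs (e.g. via vfio-pci).

Assumptions (not from the post above): the PCI address is a placeholder,
and the host driver exposes the standard sriov_* sysfs files."""
from pathlib import Path

PF_ADDR = "0000:03:00.0"     # placeholder physical-function address
WANTED_VFS = 2               # e.g. 2 partitions of 4GB each on an S7100x

pf = Path("/sys/bus/pci/devices") / PF_ADDR

# How many VFs the device advertises (the post implies 8 x 1GB per S7100x).
total = int((pf / "sriov_totalvfs").read_text())
print(f"{PF_ADDR}: up to {total} virtual functions")

# Ask the kernel to create the VFs (reset to 0 before setting a new count).
(pf / "sriov_numvfs").write_text("0")
(pf / "sriov_numvfs").write_text(str(min(WANTED_VFS, total)))

# Each VF shows up as a virtfnN symlink pointing at its own PCI device.
for link in sorted(pf.glob("virtfn*")):
    print(f"  VF ready for passthrough: {link.resolve().name}")
```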
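And for anyone wanting to check the maths, here's the same arithmetic in a few lines of Python. The 18-core count for the E5-2697 v4 and the 8GB per S7100x come from the spec sheets rather than the post itself:

```python
"""Quick sanity check of the numbers in the post above.
Nothing here is measured; it just re-derives the counts and the budget."""

cards         = 8            # HP 716553-001 add-in cards on the Z10PE-D8-WS
gpus_per_card = 3            # S7100x MXM 3.0b modules per card
gpus          = cards * gpus_per_card      # 24 GPUs in total

max_partitions = gpus * 8                  # 8 x 1GB partitions per GPU -> 192
vms            = gpus * 2                  # 2 x 4GB partitions per GPU -> 48 VMs

threads      = 2 * 18 * 2                  # dual E5-2697 v4: 36 cores / 72 threads
vcpus_needed = vms * 3                     # 3 vCPUs per VM
overcommit   = vcpus_needed / threads      # 144 vCPUs on 72 threads = 2:1

budget = gpus * 150 + cards * 300          # ~$6,000 USD

print(f"{max_partitions} max partitions, {vms} VMs, "
      f"{overcommit:.0f}:1 vCPU overcommit, ~${budget:,} total")
```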