
Help structuring a compute server


Hello again (I think I post too much?),

I am thinking about building a compute server (along with a NAS) from two Xeon E5-2670s and 64GB of RAM.

Now I want to use this for rendering my projects and for running around 2-4 gaming VMs, like Linus did, for when friends come over (and LAN parties), and as a server for Minecraft, BF1, etc. (not a necessity, merely messing around).

1. How do I do it? Can I dual boot a Windows/Linux server OS and unRAID for when I'm doing VMs? I'll have the render server up 70% of the time and VMs 20%, so I don't mind rebooting every so often.

 

2. What are the best drives for a compute server like this, or should I just run the client storage off my NAS?

 

Many thanks


For unRAID you will need separate dedicated resources for each of the bare-metal VMs you wish to run.

 

You need 4x graphics cards if you are planning on running four at the same time; each VM needs its own dedicated PCIe GPU, as a card cannot be shared. Considering the CPUs you have, you would need a pretty high-end Supermicro/Dell/Tyan/HP board to handle 4x GPUs with that, as very few server boards come with massive PCIe expansion (even with risers).
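If you're vetting boards for this, it's also worth checking how the IOMMU groups come out, since passthrough wants each card in its own group. A minimal check from the unRAID/Linux console (standard sysfs/lspci, nothing unRAID-specific):

```
# List every IOMMU group and the devices in it. Each GPU you want to
# pass through should sit in its own group; a shared group means it
# can't be handed to a VM cleanly without an ACS override.
for g in /sys/kernel/iommu_groups/*; do
  echo "IOMMU group ${g##*/}:"
  for d in "$g"/devices/*; do
    echo "  $(lspci -nns "${d##*/}")"
  done
done
```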

 

I would just use unRAID to virtualise everything (if you have sufficient resources to dedicate).

Please quote or tag me if you need a reply


2 hours ago, Falconevo said:

You need 4x graphics cards if you are planning on running four at the same time; each VM needs its own dedicated PCIe GPU, as a card cannot be shared.

Plus one more for the base system GPU, since these Xeons don't have iGPUs.


7 hours ago, leadeater said:

Plus one more for the base system GPU, since these Xeons don't have iGPUs.

The board I'm looking at has four x16 slots, and yes, I'll be putting two cards in right now for two VMs, as I don't have the budget for four and can expand later. My question is: can I use unRAID for two VMs and also dual boot into Windows for things like rendering Premiere projects?


2 hours ago, Napalm360 said:

The board I'm looking at has four x16 slots, and yes, I'll be putting two cards in right now for two VMs, as I don't have the budget for four and can expand later. My question is: can I use unRAID for two VMs and also dual boot into Windows for things like rendering Premiere projects?

Why would you dual boot, though? You can create a VM that has all the resources of the computer provisioned. You don't even have to power down existing VMs; core sharing is a thing.

 

The answer is yes, you can, but why would you when there is no need to?
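For what it's worth, "core sharing" here just means two VMs can be pinned to the same host cores and KVM time-slices between them. A rough sketch with virsh (the VM names and core numbers are made up; unRAID's VM template editor does the same thing graphically):

```
# Pin vCPUs of two VMs onto the same physical cores; an idle VM
# costs almost nothing, so they coexist fine until both get busy.
virsh vcpupin premiere 0 8    # vCPU 0 of "premiere" -> host core 8
virsh vcpupin premiere 1 9
virsh vcpupin gaming1 0 8     # "gaming1" shares those same cores
virsh vcpupin gaming1 1 9
```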


Yes, you can dual boot, but as @leadeater said, you don't need to! Hopefully the motherboard has onboard VGA; most have an ASPEED VGA adapter, which can be used for the VGA output on unRAID.

 

Just provision a virtual machine with dedicated resources that encompass all your requirements for Premiere and any other software you need.

Then, when you want to fire up the other VMs, power down the VM you created for Premiere and share the resources out.
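As a rough idea of what that provisioning looks like under the hood (unRAID's VM tab builds the equivalent libvirt config for you; the name, sizes, and GPU PCI address below are examples only):

```
# Hypothetical sketch: one big Windows VM for Premiere with most of the
# box's resources, and the GPU at PCI address 01:00.0 passed through.
virt-install \
  --name premiere \
  --memory 49152 \
  --vcpus 28 \
  --cpu host-passthrough \
  --disk path=/mnt/cache/domains/premiere/vdisk1.img,size=80 \
  --cdrom /mnt/user/isos/Win10.iso \
  --os-variant win10 \
  --hostdev 01:00.0
```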

Please quote or tag me if you need a reply


I am in the same process of putting together a gaming server as a LAN party in a box. I have a 22-core Xeon (Broadwell) and was thinking of using a Gigabyte X99-Designare EX motherboard with multiple GTX 1070s to run a multi-headed virtual reality gaming system that can accommodate up to 4 players.

 

With today's software, I think the best I could realize is 4 gaming VMs that see each other as network neighbors. Basically, you could play Hover Junkers and a few others in networked multiplayer mode. But what I would like to build towards is a multiplayer shared virtual environment. Maybe the MS hologram platform would be a good solution for shared reality.

 

The really cool architecture would be to have a server process running the world on a few cores, while client processes each render a player's perspective in said world. Kind of a highly integrated MMO model with low latency for real-time shared VR.

 

It would be fun to create an early version of the OASIS from Ready Player One.

 

I regularly use Windows and Ubuntu VMs at work, but I have only ever used RDP and networked USB to communicate. A gaming VM needs more immediate access to hardware, especially for virtual reality.


7 hours ago, Falconevo said:

Yes, you can dual boot, but as @leadeater said, you don't need to! Hopefully the motherboard has onboard VGA; most have an ASPEED VGA adapter, which can be used for the VGA output on unRAID.

 

Just provision a virtual machine with dedicated resources that encompass all your requirements for Premiere and any other software you need.

Then, when you want to fire up the other VMs, power down the VM you created for Premiere and share the resources out.

 

That sounds good, sorry I couldn't quite get my head around it the first time lol. Turns out the ASRock board I'm eyeing has ASPEED VGA, so all is good. What do you recommend for storage, though? That's my 2nd question in the OP.

4 hours ago, Harperhendee said:

I am in the same process of putting together a gaming server as a LAN party in a box. I have a 22-core Xeon (Broadwell) and was thinking of using a Gigabyte X99-Designare EX motherboard with multiple GTX 1070s to run a multi-headed virtual reality gaming system that can accommodate up to 4 players.

 

With today's software, I think the best I could realize is 4 gaming VMs that see each other as network neighbors. Basically, you could play Hover Junkers and a few others in networked multiplayer mode. But what I would like to build towards is a multiplayer shared virtual environment. Maybe the MS hologram platform would be a good solution for shared reality.

 

The really cool architecture would be to have a server process running the world on a few cores, while client processes each render a player's perspective in said world. Kind of a highly integrated MMO model with low latency for real-time shared VR.

 

It would be fun to create an early version of the OASIS from Ready Player One.

 

I regularly use Windows and Ubuntu VMs at work, but I have only ever used RDP and networked USB to communicate. A gaming VM needs more immediate access to hardware, especially for virtual reality.

You could probably do that and have the VMs connected via LAN maybe? Sounds awesome. I'm just doing this to play Overwatch and stuff with friends lol.


2 hours ago, Napalm360 said:

That sounds good, sorry I couldn't quite get my head around it the first time lol. Turns out the ASRock board I'm eyeing has ASPEED VGA, so all is good. What do you recommend for storage, though? That's my 2nd question in the OP.

Use an SSD to host the VMs and their system disks; HDDs are very poor at running multiple VMs. You can use HDDs to host secondary virtual disks for each VM to store games on.
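In practice that's just two virtual disks per VM, e.g. (the paths are examples following unRAID's usual share layout):

```
# OS disk on the SSD cache, bulk game disk on the HDD array.
qemu-img create -f raw /mnt/cache/domains/vm1/vdisk1.img 80G
qemu-img create -f raw /mnt/user/domains/vm1/vdisk2.img 500G
```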


2 hours ago, leadeater said:

Use an SSD to host the VMs and their system disks; HDDs are very poor at running multiple VMs. You can use HDDs to host secondary virtual disks for each VM to store games on.

Never mind that, I won't plan on reading games off a NAS because of latency and stuff. Would a couple of WD Reds be sufficient? I'm thinking 4 WD Red 3TB drives in RAID 10, then sharing the 6TB across the VMs. Can I share a partition across VMs, or would I be better off assigning a single drive to each VM instead?


5 hours ago, Napalm360 said:

Never mind that, I won't plan on reading games off a NAS because of latency and stuff. Would a couple of WD Reds be sufficient? I'm thinking 4 WD Red 3TB drives in RAID 10, then sharing the 6TB across the VMs. Can I share a partition across VMs, or would I be better off assigning a single drive to each VM instead?

Create the 4-disk array, RAID 10 or RAID 5, then create virtual disks within the array for each VM. You'll get better performance than a single disk per VM, and better resiliency. I'd still use an SSD for the C drive of each VM (I just like the speed of SSDs now), and the array of WD Reds for a second virtual disk per VM to store actual data on. You only need 80GB per VM/OS.
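For reference, the usable capacity works out like this:

```
# Usable space with 4 x 3TB drives:
#   RAID 10: (4 / 2) * 3TB = 6TB  (survives one failure per mirror pair)
#   RAID 5:  (4 - 1) * 3TB = 9TB  (survives any single disk failure)
# Either leaves plenty of room after carving out ~80GB OS disks per VM.
```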


6 hours ago, leadeater said:

Create the 4-disk array, RAID 10 or RAID 5, then create virtual disks within the array for each VM. You'll get better performance than a single disk per VM, and better resiliency. I'd still use an SSD for the C drive of each VM (I just like the speed of SSDs now), and the array of WD Reds for a second virtual disk per VM to store actual data on. You only need 80GB per VM/OS.

Thanks for the info on that. As I'm only going to be installing a couple of games on each VM, I think I'll just get a couple of 1TB WD Blues instead of Reds, and an Intel 600p 128GB drive through a PCIe adapter.

Because of budget, I'll start with 2 VMs at first, so I only need to buy 2 GTX 670s and 2 WD Blues.

If I got 2 WD Blues in RAID 0 and created the 2 virtual disks on those for the VMs, and later on got another 2 Blues...

Can I convert the RAID to a RAID 10 across the 4 drives whilst keeping the data?

 

Many thanks


11 hours ago, Napalm360 said:

Thanks for the info on that. As I'm only going to be installing a couple of games on each VM, I think I'll just get a couple of 1TB WD Blues instead of Reds, and an Intel 600p 128GB drive through a PCIe adapter.

Because of budget, I'll start with 2 VMs at first, so I only need to buy 2 GTX 670s and 2 WD Blues.

If I got 2 WD Blues in RAID 0 and created the 2 virtual disks on those for the VMs, and later on got another 2 Blues...

Can I convert the RAID to a RAID 10 across the 4 drives whilst keeping the data?

 

Many thanks

I don't know enough about unRAID to say if you can. With most other solutions you cannot expand or convert RAID 0 or RAID 10 configurations. If the data size is small enough, you can just back up to a cheap USB drive to do the expansion.
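Something like this would do for the shuffle (the mount points are hypothetical):

```
# Stage the data on a cheap USB disk, rebuild the array, copy back.
rsync -a --progress /mnt/array/ /mnt/usb-backup/
# ...destroy the 2-disk RAID 0, create the 4-disk RAID 10, then:
rsync -a --progress /mnt/usb-backup/ /mnt/array/
```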


14 hours ago, leadeater said:

I don't know enough about unRAID to say if you can. With most other solutions you cannot expand or convert RAID 0 or RAID 10 configurations. If the data size is small enough, you can just back up to a cheap USB drive to do the expansion.

OK, brilliant. What I'll do is have 2 WD Blues in RAID 0 holding the 2 virtual drives, and later on I'll add another 2 WD Blues and do the same. However, would putting 2 virtual disks on a 2-drive RAID 0 be faster than assigning a drive to each VM?

Thanks


45 minutes ago, Napalm360 said:

OK, brilliant. What I'll do is have 2 WD Blues in RAID 0 holding the 2 virtual drives, and later on I'll add another 2 WD Blues and do the same. However, would putting 2 virtual disks on a 2-drive RAID 0 be faster than assigning a drive to each VM?

Thanks

Yes, it would be faster. A VM won't be using its disk all the time, so when it needs to, being able to utilize more than one disk will be faster.
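If you ever want to verify that on your own hardware, fio can approximate the mixed load a couple of VMs generate (a throwaway job, not a tuned benchmark; the file path is an example):

```
# 70/30 random read/write mix in 64k blocks for 60s against the array.
fio --name=vmtest --filename=/mnt/array/fio-test.img --size=4G \
    --rw=randrw --rwmixread=70 --bs=64k --iodepth=8 \
    --ioengine=libaio --direct=1 --runtime=60 --time_based
```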


3 hours ago, leadeater said:

Yes, it would be faster. A VM won't be using its disk all the time, so when it needs to, being able to utilize more than one disk will be faster.

Thanks :)

 

 


Now I need to save £1200 lol

 

 

