Virtualization! I don't know where to start.

Go to solution: Solved by leadeater

In short, I need to build one PC, workstation, or server for five people to use while they're each working from home.

 

So, some kind of Xeon or Threadripper (or server CPU?) with a virtualization setup?

 

I actually don't know where to start with that.

I can build a PC if I know what I'm doing. Apparently, I don't.

 

Each of these people needs a minimum of ...

CPU : 8 Cores 16 Threads (4.0GHz or higher if possible)

GPU : Equivalent of RTX 2070 Super (2080 Ti each if possible)

RAM : 32GB DDR4 (2666MHz or higher if possible)

SSD : 1TB PCIe NVMe SSD (or just some normal SSD)

HDD : 1TB of Raw HDD

 

Any recommendations for the hardware parts?
And what virtualization setup should I use?

 

◆Budget◆

I can spend a maximum of $18,000 for a complete setup.

I already have a bunch of monitors, keyboards, mice, and the tools to build.

I don't need any hardware for their home setups.

 

 

 



Any particular reason why this needs to be a virtual environment? With these kind of specs it's probably cheaper and easier to build a workstation for each person.

 

The question you should ask yourself: Do they need these specs at the same time? The advantage of a VM might be that you can get away with less hardware, if the hardware isn't hogged by all of those people at once (e.g. you get one 8 core for two people, because they don't each need 8 cores at the same time).
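To make the sharing argument concrete, here's a toy back-of-envelope check. All the numbers and the 2:1 ratio are illustrative assumptions, not sizing advice:

```python
# Hypothetical sketch: how far can you oversubscribe physical cores if
# users' peak loads don't overlap? Ratios are workload-dependent.

def fits_on_host(users, vcpus_per_user, physical_cores, oversub_ratio):
    """True if `users` VMs fit on `physical_cores` at the given
    oversubscription ratio (vCPUs per physical core)."""
    total_vcpus = users * vcpus_per_user
    cores_required = total_vcpus / oversub_ratio
    return cores_required <= physical_cores

# 5 users x 16 vCPUs = 80 vCPUs. With no oversubscription you need 80
# cores; at a 2:1 ratio, 40 cores suffice.
print(fits_on_host(5, 16, 40, 2.0))  # True
print(fits_on_host(5, 16, 40, 1.0))  # False
```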

Remember to quote or @mention others, so they are notified of your reply

13 minutes ago, Eigenvektor said:

Any particular reason why this needs to be a virtual environment? With these kind of specs it's probably cheaper and easier to build a workstation for each person.

 

The question you should ask yourself: Do they need these specs at the same time? The advantage of a VM might be that you can get away with less hardware, if the hardware isn't hogged by all of those people at once (e.g. you get one 8 core for two people, because they don't each need 8 cores at the same time).

@Eigenvektor Thanks for the comment. 


We (as a company) are currently limited to a remote working environment, and some of the staff need to work from home.

All of those people will be using this 24/7 and they "need" at least 8C/16T bla bla bla ... 

 

They can't come in to the office and I can't go to their houses.

And I don't want to take back 5 desktops when this is over.

 

I know it's cheaper if we can get one PC for each person

but I'm changing the direction of normal office setups to more remote / mobile / cloud ones.

 

And a little bit of truth is that when this remote thing is over,

I can have that big boy as my own "test" hardware for work. :D 

 

Everybody wins. 

7 minutes ago, ZWELINHTET said:

We (as a company) are currently limited to a remote working environment, and some of the staff need to work from home.

All of those people will be using this 24/7 and they "need" at least 8C/16T bla bla bla ... 

 

They can't come in to the office and I can't go to their houses.

And I don't want to take back 5 desktops when this is over.

Just some general considerations; I hope someone with more experience can expand on them. I haven't really used VMs "professionally" (other than for testing) for some time. Most of our stuff is in the cloud and containerized in Docker by now.

 

This is a simple "addition" of your specs for five people, rounded to typical numbers.

- 48 core CPU
- 5x RTX 2070S
- 192 GB RAM
- 5 TB NVMe SSD
- 5 TB HDD
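That addition can be sanity-checked in a couple of lines. The per-user figures come from the original post; the totals are raw sums before rounding up to parts you can actually buy:

```python
# Sum the per-user requirements for five users.
per_user = {"cores": 8, "ram_gb": 32, "nvme_tb": 1, "hdd_tb": 1}
users = 5

totals = {spec: amount * users for spec, amount in per_user.items()}
print(totals)  # {'cores': 40, 'ram_gb': 160, 'nvme_tb': 5, 'hdd_tb': 5}

# 40 cores rounds up to a 48-core part, 160 GB of RAM to 192 GB, etc.
```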

 

Based on the size of this machine, I suppose you want something a bit more "professional" than VirtualBox, e.g. VMware ESXi. No idea what kind of license you'd need for this setup.

 

To get a dedicated GPU for a virtual machine you need to use GPU passthrough. Not really familiar with it, but ESXi supports it. So you're looking at something like a server board with at least 5 GPU slots.

 

The issue with this many CPU cores is that they are usually clocked lower than CPUs with fewer cores. It might be an option to use a dual-socket board and get two CPUs with fewer cores and higher clocks instead. The issue is that 2x16 would be too few cores and 2x32 is probably overkill.

 

Most virtualization options include memory optimizations where several virtual machines can share memory if it contains the same data. E.g. if every machine is running the same Windows version and has the same kernel loaded into memory, the kernel may only reside in memory once. So you may very well be able to get away with less RAM.
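A toy model of those savings, assuming (purely for illustration) that about 4 GB of each VM's footprint is identical and gets deduplicated down to a single copy:

```python
# Rough sketch of why page sharing (e.g. ESXi TPS, Linux KSM) saves RAM:
# identical read-only pages across VMs are stored once on the host.

def ram_with_sharing(vms, ram_per_vm_gb, shared_gb):
    """Host RAM needed if `shared_gb` of each VM's footprint (OS
    kernel, common libraries) is deduplicated to one copy."""
    unique = vms * (ram_per_vm_gb - shared_gb)
    return unique + shared_gb  # one copy of the shared portion

# 5 VMs x 32 GB = 160 GB without sharing; with 4 GB shared per VM:
print(ram_with_sharing(5, 32, 4))  # 144
```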

 

An NVMe disk of this size might be quite expensive. Do you really need this kind of performance for this many people at the same time? NVMe disks are getting cheaper, but the price still grows steeply with capacity. Of course you can use multiple smaller disks, but you may run out of M.2 slots.

 

Since this is remote you may also want to take bandwidth into consideration. Is everyone's connection good enough to work on this machine remotely, e.g. over RDP?
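A rough feasibility check along those lines. The ~10 Mbps per session is a made-up planning figure, since real RDP/Citrix usage varies a lot with resolution and on-screen activity:

```python
# Back-of-envelope: do all remote sessions fit in the office uplink?

def uplink_ok(sessions, mbps_per_session, uplink_mbps, headroom=0.8):
    """True if the sessions fit in the uplink while leaving 20%
    headroom for everything else on the connection."""
    return sessions * mbps_per_session <= uplink_mbps * headroom

# 5 users at ~10 Mbps each need at least 62.5 Mbps of usable uplink.
print(uplink_ok(5, 10, 100))  # True
print(uplink_ok(5, 10, 50))   # False
```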


It would make more sense to build 5 identical systems and connect them to the same network if you like. Having more than 3 GPUs on the same motherboard is difficult. A decent server that has enough PCI-E slots for 5 GPUs will easily set you back 3 to 4k just for the motherboard and case. Also, those usually only work with Intel Xeons, so you will need to spend 8 to 20k on Xeon Gold CPUs, and then come the registered ECC memory modules at another 3k for 256 GB. However, those chips will not be able to run at 4 GHz; more likely 2.2 to 2.6 GHz depending on load and temperature. If you can find an IT expert to set it up and maintain it for you, great, but believe me, every now and then something will go bad and all the clients will be offline. Building and debugging it takes many days.

So unless you want to deal with that nightmare, I'd suggest building 5 identical systems, putting them onto your office network, and letting your employees remote in. You will save money, and setup time per machine will be an hour or two with all drivers installed.

 

I would estimate that a build like yours in a server environment would cost about 50 k including 12 hours of phone support with the manufacturer of the motherboard and interface cards. Building 5 PCs will cost you probably around 10 k. 

 

On the network side of things, your company and your clients will need a really fast connection if they are working in real time on things like CAD for example.

4 minutes ago, Eigenvektor said:

This is a simple "addition" of your specs for five people, rounded to typical numbers.

- 48 core CPU
- 5x RTX 2070S
- 192 GB RAM
- 5 TB NVMe SSD
- 5 TB HDD

@Eigenvektor Thank you for the round up.

 

5 minutes ago, Eigenvektor said:

To get a dedicated GPU for a virtual machine you need to use GPU passthrough.

I'm not familiar with this. More for me to study. yayy :D 

 

6 minutes ago, Eigenvektor said:

The issue with this many CPU cores is that they are usually clocked lower than CPUs with fewer cores. It might be an option to use a dual-socket board and get two CPUs with fewer cores and higher clocks instead. The issue is that 2x16 would be too few cores and 2x32 is probably overkill.

Nothing is overkill if it's within the budget.

Maybe I'm just looking for a reason to use Threadripper 3990X. Maybe not.

 

9 minutes ago, Eigenvektor said:

An NVMe disk of this size might be quite expensive. Do you really need this kind of performance for this many people at the same time? NVMe disks are getting cheaper, but the price still grows steeply with capacity. Of course you can use multiple smaller disks, but you may run out of M.2 slots.

I can live with any SSD or HDD as long as it's fast enough for me. Something like a Samsung 860 Evo or 970 Evo is fine, I guess.

 

11 minutes ago, Eigenvektor said:

Is everyone's connection good enough to work on this machine remotely, e.g. over RDP?

Their bandwidth is not top tier, but it's good enough.

I don't really have any idea about remote software. Help!

6 minutes ago, Applefreak said:

It would make more sense to build 5 identical systems and connect them to the same network if you like. Having more than 3 GPUs on the same motherboard is difficult. A decent server that has enough PCI-E slots for 5 GPUs will easily set you back 3 to 4k just for the motherboard and case. Also, those usually only work with Intel Xeons, so you will need to spend 8 to 20k on Xeon Gold CPUs, and then come the registered ECC memory modules at another 3k for 256 GB. However, those chips will not be able to run at 4 GHz; more likely 2.2 to 2.6 GHz depending on load and temperature. If you can find an IT expert to set it up and maintain it for you, great, but believe me, every now and then something will go bad and all the clients will be offline. Building and debugging it takes many days.

So unless you want to deal with that nightmare, I'd suggest building 5 identical systems, putting them onto your office network, and letting your employees remote in. You will save money, and setup time per machine will be an hour or two with all drivers installed.

@Applefreak Thanks for the comment, but I can't breathe while reading this. T=T

 

7 minutes ago, Applefreak said:

I would estimate that a build like yours in a server environment would cost about 50 k including 12 hours of phone support with the manufacturer of the motherboard and interface cards. Building 5 PCs will cost you probably around 10 k. 

 

On the network side of things, your company and your clients will need a really fast connection if they are working in real time on things like CAD for example.

I know the cost is skyrocketing, but I really need to solve this problem with one system (that's one PC).

We already have a fast and fat internet connection in the office and 50 Mbps connections on the clients' side.

10 minutes ago, ZWELINHTET said:

I'm not familiar with this. More for me to study. yayy :D

As a rule of thumb, GPUs don't support virtualization. So the basic idea is that one GPU is bound to one virtual machine as a dedicated card and can only be used by that machine. From what I know this isn't entirely true anymore; there are data center products that support virtualization. Just like a CPU is split into many vCPUs and shared between machines, a GPU is split into vGPUs and shared. Here's some info from Nvidia: https://www.nvidia.com/en-us/data-center/virtual-gpu-technology/

 

No clue how affordable and within budget that is :D But that might solve the issue of having to install 5 GPUs in a single board. E.g. maybe you only need 2x Nvidia Quadro cards.
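The arithmetic behind that "fewer cards" idea is simple slicing. The 16 GB card and 2 GB profile below are illustrative assumptions, not a specific Nvidia SKU or profile recommendation:

```python
# Sketch of vGPU slicing: a card's framebuffer is divided into
# fixed-size profiles, and each VM gets one slice.

def vgpu_instances(card_fb_gb, profile_fb_gb):
    """Number of vGPU instances of a given profile per physical card."""
    return card_fb_gb // profile_fb_gb

# A 16 GB card sliced into 2 GB profiles serves 8 VMs, so two such
# cards could in principle cover 5 users with room to spare.
print(vgpu_instances(16, 2))  # 8
```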

Quote

I don't really have any idea about remote software. Help!

If we're talking Windows, then Remote Desktop over VPN should probably cover your needs.

16 minutes ago, Applefreak said:

@ZWELINHTET May I suggest you go over to the Level1Techs Forum, they specialize in server stuff. Here is the LINK

@Applefreak Thank you. I'll look into that too. ❤️

 

15 minutes ago, Eigenvektor said:

As a rule of thumb, GPUs don't support virtualization. So the basic idea is that one GPU is bound to one virtual machine as a dedicated card and can only be used by that machine. From what I know this isn't entirely true anymore; there are data center products that support virtualization. Just like a CPU is split into many vCPUs and shared between machines, a GPU is split into vGPUs and shared. Here's some info from Nvidia: https://www.nvidia.com/en-us/data-center/virtual-gpu-technology/

 

No clue how affordable and within budget that is :D But that might solve the issue of having to install 5 GPUs in a single board. E.g. maybe you only need 2x Nvidia Quadro cards.

If we're talking Windows, then Remote Desktop over VPN should probably cover your needs.

@Eigenvektor Thank you too. I think I need some kind of GPU like Tesla or Titan. Maybe I'm wrong on that subject.

And yes, I need Windows for the work. I'm thinking RDP with VPN will do the trick too. But I'm not so sure.


So you technically don't even need a server for this. Citrix Virtual Desktops can be used on their regular work PCs, which will allow them to remote in, and is honestly a lot better than RDP or other remote control solutions that I've seen, even over the Internet. We have about half of our employees here on VMs and the other half using their PCs, all through the Citrix software, since it creates a virtual graphics card that allows you to, for example, resize your desktop to however many screens you want. You can pass through USB devices and local file directories too if needed.

 

I personally think this is the better solution in the short term and then work on a more robust VDI solution for the long term instead of the stop gap you're trying to do right now.

5 minutes ago, 2FA said:

Citrix Virtual Desktops can be used on their regular work PCs, which will allow them to remote in, and is honestly a lot better than RDP or other remote control solutions that I've seen, even over the Internet.

@2FA 

 

They don't have any desktops. That's why I'm planning to build a new one.

 

7 minutes ago, 2FA said:

We have about half of our employees here on VMs and the other half using their PCs, all through the Citrix software, since it creates a virtual graphics card that allows you to, for example, resize your desktop to however many screens you want. You can pass through USB devices and local file directories too if needed.

Thank you for the Citrix info though. I almost forgot that name. Gonna look into it now. :D

 

8 minutes ago, 2FA said:

then work on a more robust VDI solution for the long term instead of the stop gap you're trying to do right now.

VDI is the end game, but I'm not familiar with any of its setup and environment. Is there anywhere I can start reading?


As people mentioned before, building this for 5 people can be quite complicated and expensive; two is more doable, and three might also work. So it might be an option to go for two servers?


What kind of 'actual work' is being done?  Citrix will set you back a lot, this plan will set you back a lot.  What kind of accelerated workload do you have that requires those specs?  What's your plan and budget for remote access?  

 

It'd be much more economical to grab 5x $2k laptops with similar specs to what you listed.

On 4/8/2020 at 6:34 AM, ZWELINHTET said:

They can't come in to the office and I can't go to their houses.

And I don't want to take back 5 desktops when this is over.

What did they use in the office?  Our work has been mailing stuff to each other. 

 

I feel like with the knowledge gap this will be an expensive struggle for you.

 


If it's for a temporary solution, why not have them hosted in the cloud?

On 4/9/2020 at 11:33 PM, beersykins said:

What kind of 'actual work' is being done?  Citrix will set you back a lot, this plan will set you back a lot.  What kind of accelerated workload do you have that requires those specs?  What's your plan and budget for remote access?  

 

It'd be much more economical to grab 5x $2k laptops with similar specs to what you listed.

What did they use in the office?  Our work has been mailing stuff to each other. 

 

I feel like with the knowledge gap this will be an expensive struggle for you.

 

@beersykins 

 

To be honest, these are for the top people of the company, and this needs to be "expensive" just to let them feel like they're superior.

I'm still thinking about the plan, but this doesn't really have a budget. (I know it sounds exaggerated, but it's somehow true.)

 

As I said earlier, I already have 2k~3k class desktops lying around my workplace and I don't wanna use those.

And lastly, I know I'm super low on the knowledge part. That's why I'm asking for your help. ❤️

 

Let it be expensive. xD 

On 4/10/2020 at 3:50 AM, Slayerking92 said:

If its for a temporary solution,  why not have them hosted in the cloud?

@Slayerking92 I need the physical device inside the company bruh. It's okay if it's our cloud.


For 5 computers on a mostly temporary basis you are making it way more complicated than it needs to be. Building 5 computers and placing them in the office wherever is the best answer here. Have a watch of the LTT video they did on remote working; that looks like a good solution when you need 3D graphics, video, CAD, etc.

 

Doing this with a single system that meets your requirements is mostly impossible; you would have to self-build, and that just isn't a good idea in a business environment, honestly. Vendor servers don't often take more than 4 GPUs, they will only come with and work with Quadros and Teslas due to the power cable location on the GPU, and servers that can take more than 4 GPUs are much more specialized and expensive.

 

But even if you build it all yourself, because of your requirements you are driven immediately into a dual-socket platform for Intel or a single socket for AMD EPYC. Either way the CPUs are expensive, and you have to use registered ECC RAM, and a lot of it, so you're at $1600 minimum for the RAM and up to $3000-ish.

 

Then you have to contend with Nvidia not allowing GeForce GPUs to be used in virtualization environments. You can get around it, but it's another hint that it's not a good idea.

 

Can you build this for under $18k? Yes. Should you? No.

 

If you long term want to go down the road of remote working from anywhere, any time, then you'll want to do an RFP/RFI with some IT firms to set up a VDI environment properly, as a proper project run by people who are experienced and have done it before. Getting a good user experience with VDI is not as simple as hardware, good internet, and picking the right software. You can have all the right ingredients and still end up with a deflated soufflé.

  • 1 month later...
On 4/13/2020 at 5:32 AM, leadeater said:

For 5 computers on a mostly temporary basis you are making it way more complicated than it needs to be. [...]

Hi there,

 

To be honest, I am no expert in this topic, but I myself have been trying to find a way to build a virtual PC host with multiple end users. I don't need as much computing power as the original poster needs, but through my research I did stumble upon the Nvidia T4. I don't fully understand the specs, but from what I understand you have 1 GPU and up to 16 end users. Please correct me if I am wrong.

9 hours ago, abdullahissa said:

Hi there,

 

To be honest, I am no expert in this topic, but I myself have been trying to find a way to build a virtual PC host with multiple end users. I don't need as much computing power as the original poster needs, but through my research I did stumble upon the Nvidia T4. I don't fully understand the specs, but from what I understand you have 1 GPU and up to 16 end users. Please correct me if I am wrong.

The T4 is a specific generation of Tesla, but yes, what you're after is some sort of Grid card that supports vGPU.

Whether that's a Quadro or a Tesla: https://docs.nvidia.com/grid/gpus-supported-by-vgpu.html

