
Parts for an all-in-one server + 4-user workstation

xannax159

Budget (including currency): $10,000-$15,000

Country:  United States

Games, programs or workloads that it will be used for: CAD, AI learning, Gaming, basic computer use

Other details: The primary goal for this computer is something similar to a "4 gamers, 1 CPU" project, extended so that the same machine also acts as a NAS, networking hub, building surveillance system, etc.

 

This build is basically going to be any tech nerd's wet dream. The use case is a college house: students will be able to use the computer for their daily tasks and in their free time, with the same machine also acting as a "server" for networking, NAS, and perimeter surveillance.

 

My layout so far is:

Threadripper 3990X as the heart of the computer, allowing 12 cores for each workstation user and leaving 16 of the 64 cores for the actual server part of the build

4x RTX 2080 Tis OR the upcoming 3080 Tis, one for each user, and, if budget allows, a spare RTX Titan to be assigned to whichever workstation needs it at the time.

128-256 GB RAM, so 32 GB for each user, and (in the 256 GB configuration) a spare 128 GB to be divided among the actual server duties (if there is some left over, it goes back to the users)

4x 10GbE ports, one for each user

2 TB of NVMe storage for each user, plus 2 TB per user of additional redundancy (automated backups)
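For what it's worth, the proposed split can be sanity-checked with quick arithmetic (this sketch assumes the 64-core 3990X and the upper 256 GB RAM configuration):

```python
# Sanity-check the proposed resource split for a Threadripper 3990X host.
# Figures from the post: 4 users, 12 cores and 32 GB RAM each.
total_cores = 64      # 3990X core count
total_ram_gb = 256    # upper end of the proposed 128-256 GB range

users = 4
cores_per_user = 12
ram_per_user_gb = 32

server_cores = total_cores - users * cores_per_user
server_ram_gb = total_ram_gb - users * ram_per_user_gb

print(f"Cores left for the server side: {server_cores}")     # 16
print(f"RAM left for the server side: {server_ram_gb} GB")   # 128
```

Note that 4 × 12 = 48 user cores leaves 16 spare, not 14 as originally written, and the 128 GB server pool only exists in the 256 GB build.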

 

How would you go about this build? I'm mostly concerned with the networking, NAS, and surveillance aspects of the build, as well as a host OS for it all. If it's easier, the entire workstation aspect can be physically separate from the NAS, networking, and surveillance side, but I would like to cap it at a maximum of 2 machines. 10GbE networking is a must, the NAS must be fast enough to keep up with the 10GbE network, and the surveillance side must be able to run redundantly 24/7 and store 4K 30 fps footage streams from 6-10 cameras at once. I may just be insane for suggesting such a large budget, but reliability and redundancy are an absolute must.
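A rough estimate of what those camera streams actually demand (the ~20 Mbit/s per-stream figure is an assumption on my part for 4K30 H.265; real bitrates vary a lot with codec and scene complexity):

```python
# Rough write-rate and storage estimate for the surveillance side.
# ASSUMPTION (not from the post): ~20 Mbit/s per 4K30 H.265 camera stream.
mbit_per_stream = 20
cameras = 10                  # worst case from the stated 6-10 range

total_mbit_s = cameras * mbit_per_stream
gb_per_day = total_mbit_s / 8 / 1000 * 86400   # Mbit/s -> GB per day

print(f"Aggregate write rate: {total_mbit_s} Mbit/s")   # 200 Mbit/s
print(f"Storage per day: {gb_per_day:.0f} GB")          # 2160 GB
```

So even worst case, the write rate is trivial for any modern array; the real cost is capacity, at roughly 2 TB of footage per day before retention policies.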


I'd really suggest having separate systems. It will probably be cheaper, and much easier to set up and configure.

 

Have you done this before? What hypervisor do you plan on using?

 

How would the users connect to the system? Will they directly connect or use some sort of remote desktop connection?

 

There are some issues here, like how you want 5 GPUs and 4 dedicated 10GbE ports; there is no board that can support that combination.

 

If you want reliability, I'd separate the roles. With everything in one box, if anything goes wrong, or an update needs a reboot, everything goes down with it.

 

What switches do you plan on using for networking? Which 10GbE standard? There are many of them. Also, why 4x 10GbE ports? Just get a single 40GbE or 100GbE port.
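To put numbers on the "NAS must keep up with 10GbE" requirement, here is a quick sketch (the ~200 MB/s per-drive sequential figure is an assumption for a modern HDD, not from the thread):

```python
# How much array speed does one 10GbE client need, and can a single
# 40GbE uplink from the server feed four of them?
# ASSUMPTION: ~200 MB/s sequential throughput per modern HDD.
import math

link_10g_mb_s = 10_000 / 8    # 10 Gbit/s -> 1250 MB/s
hdd_mb_s = 200

hdds_to_saturate = math.ceil(link_10g_mb_s / hdd_mb_s)
print(f"Striped HDDs to fill one 10GbE link: {hdds_to_saturate}")  # 7

# Four users at full 10GbE add up to exactly one 40GbE uplink.
print(f"Aggregate demand of 4 users: {4 * 10} Gbit/s")  # 40 Gbit/s
```

This is why a single 40GbE port into a switch is the cleaner answer: the switch fans it out to the users, and one sufficiently wide array feeds them all.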


@Electronics Wizardy

Have you done this before? What hypervisor do you plan on using?

The closest I've done to something like this is "2 computers, 1 desktop" with GPU passthrough on Ubuntu. If there is something better for this scenario, please let me know.
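For reference, the usual Linux route for this is KVM/libvirt with VFIO PCI passthrough rather than anything Ubuntu-specific. A minimal sketch of the libvirt `<hostdev>` fragment that hands one GPU to one guest might look like the following (the PCI address `0000:0b:00.0` is a placeholder; find the real one with `lspci -nn`):

```xml
<!-- Hypothetical libvirt <hostdev> fragment: passes the GPU at host
     PCI address 0000:0b:00.0 through to one workstation VM.
     One such stanza per passed-through device (GPU + its audio function). -->
<hostdev mode='subsystem' type='pci' managed='yes'>
  <source>
    <address domain='0x0000' bus='0x0b' slot='0x00' function='0x0'/>
  </source>
</hostdev>
```

You would repeat this per guest, with each guest getting a GPU in its own IOMMU group; four independent VMs like this is essentially what the "X gamers, 1 CPU" builds do.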

How would the users connect to the system? Will they directly connect or use some sort of remote desktop connection?

The only way I personally know of is hard-wired, running cables to each room. But if there is some super-low-latency way to do this wirelessly, I'm always open to ideas.

There are some issues here, like how you want 5 gpus and 4 10g ports, and there is no board that can support that either.

I understand the issue with the 4x 10GbE ports, but what's stopping me from using 5 GPUs? Is there truly no board that supports more than 4 video cards?

If you want reliable, id seprate the roles, this is making it so if anything goes wrong, or there are updates that need a reboot, everything goes down with it.

Understood. I plan to rack-mount this build in a server room anyway.

What switches do you plan on using for networking? What 10gbe standard, there are many of them. Also why 4 10g ports, just get a single 40 or 100gbe port

I've had no real good experience with networking; what's the fastest consumer-available standard? It would just need to play nicely with the NAS. If a single 40Gb port would suffice, I'm all for it.


A separate NAS/video recording solution would indeed be the best choice. Consider a multi-core fibre-optic backbone for your network.

 

For the OS, the NAS should be either Linux or BSD; the workstation instances can run in VMs, on Win-10 if you must (personally, Win-7 is the latest I'd go, but then I don't use Windows anyway, being a Linux user :P )

"You don't need eyes to see, you need vision"

 

(Faithless, 'Reverence' from the 1996 Reverence album)


Just now, BlueScope819 said:

Press the quote button next to our messages so that we can see them. I just happened to be typing my reply at the time, so I saw it. It's the left arrow, second from the left at the bottom of our messages.

 

That doesn't solve the problem of reliability, nor the massive f***ing software headache this is going to cause.

If everything is in one box, why do you need 40 gigabit to the rest of the stuff? That's literally pointless: you would be setting up a network share, and it's not like your use case needs 4 simultaneous 10-gigabit streams of data, not to mention that you will need a 10-gigabit switch, which will cost you hundreds. This entire build is just a really, really, really bad idea.

This build isn't my decision, it's the homeowner's decision. I'm merely the person that's going to be physically assembling it and working out the software kinks. They want the factor of it all being one machine. I personally do understand it would be easier, cheaper, and wiser to build individual workstations, but in the end I'm merely recommending; the final decision isn't mine. @BlueScope819


4 minutes ago, Dutch_Master said:

A separate NAS/video recording solution would indeed be the best choice. Consider a multi-core fibre-optic backbone for your network.

 

For the OS, the NAS should be either Linux or BSD; the workstation instances can run in VMs, on Win-10 if you must (personally, Win-7 is the latest I'd go, but then I don't use Windows anyway, being a Linux user :P )

What kind of hardware would you recommend for a NAS/Video recording setup with 6-10 4k30fps streams?


@xannax159 FreeNAS is a solid choice, and Proxmox too. They have minimum and recommended hardware setups listed on their sites; check those.

 

I'd also point out to the building owners that although something may be technically feasible, it may not be wise: it's technically perfectly feasible to jump from the Empire State Building, but it's not recommended you do that :P  Now that I've seen the previous posts I'd missed, recommending a different way of "skinning the cat" (that's proverbial, before any of you activists get all loopy on me!) is indeed the best way forward, IMO.


30 minutes ago, xannax159 said:

What kind of hardware would you recommend for a NAS/Video recording setup with 6-10 4k30fps streams?

What bitrate? What codec?

 

What are you capturing from?

 

35 minutes ago, xannax159 said:

This build isn't my decision, it's the homeowner's decision. I'm merely the person that's going to be physically assembling it and working out the software kinks. They want the factor of it all being one machine. I personally do understand it would be easier, cheaper, and wiser to build individual workstations, but in the end I'm merely recommending; the final decision isn't mine. @BlueScope819

Really try to tell them this will be a bad experience, with lots of issues. Just get separate workstations for the rooms, and a server for the main area.

 

43 minutes ago, xannax159 said:

I've had no real good experience with networking; what's the fastest consumer-available standard? It would just need to play nicely with the NAS. If a single 40Gb port would suffice, I'm all for it.

You need to make a plan for the network, or get someone in that knows what they're doing. You don't want to connect directly to the NAS; you want a switch, and that switch depends a lot on the rest of the network.

 

44 minutes ago, xannax159 said:

I understand the issue with the 4x 10GbE ports, but what's stopping me from using 5 GPUs? Is there truly no board that supports more than 4 video cards?

Threadripper is made mostly for desktop ATX boards, so you're limited to 7 slots. With dual-width GPUs, that limits you to 4 cards, with one hanging over at the bottom, and nothing else.

 

You can get servers that will fit 10 GPUs, but that is way out of budget, and they don't come with Threadripper (Xeon normally, some Epyc parts). They are also extremely loud, often need 240V power, and have other annoyances.

45 minutes ago, xannax159 said:

The only way I personally know of is hard-wired, running cables to each room. But if there is some super-low-latency way to do this wirelessly, I'm always open to ideas.

Do you want them to connect with an IP-based solution, or some other wired solution like HDMI or Thunderbolt? IP-based solutions are the most flexible, but give a worse experience.

