xannax159

Parts for an all in one server + 4 user work station


Posted · Original PosterOP

Budget (including currency): $10,000-$15,000

Country:  United States

Games, programs or workloads that it will be used for: CAD, AI learning, Gaming, basic computer use

Other details : The primary goal for this computer will be something similar to a 4 "gamers" 1 cpu project but with the extension of the same computer acting as a NAS, networking hub, and Building surveillance, etc.

 

This build is basically going to be any tech nerd's dream. The use case is a college house: students will use the computer for their daily tasks and in their free time, while the same machine also acts as a "server" for networking, NAS, and perimeter surveillance.

 

My layout so far is:

Threadripper 3990X as the heart of the computer, allotting 12 cores to each workstation user and leaving 16 spare cores for the actual server part of the build

4x 2080 Tis OR the upcoming 3080 Tis, one for each user; if budget allows, a spare RTX Titan to be used by whichever workstation needs it at the time.

128-256 GB of RAM, so 32 GB to each user, with the remaining 128 GB going to the actual server side (if there is some leftover, it goes back to the users)

4x 10 GbE ports, one to each user

2 TB of NVMe storage for each user, plus 2 TB per user of additional redundancy (automated backups)
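A quick sketch of how that split works out on the 64-core 3990X with the 256 GB RAM configuration (just illustrative arithmetic, assuming the top-end config):

```python
# Sanity-check the proposed resource split for a 4-user Threadripper 3990X box.
# Assumptions: the 3990X's 64 physical cores and the 256 GB RAM option.

TOTAL_CORES = 64
USERS = 4
CORES_PER_USER = 12

spare_cores = TOTAL_CORES - USERS * CORES_PER_USER
print(f"cores left for the server role: {spare_cores}")  # -> 16

TOTAL_RAM_GB = 256
RAM_PER_USER_GB = 32
server_ram = TOTAL_RAM_GB - USERS * RAM_PER_USER_GB
print(f"RAM left for the server role: {server_ram} GB")  # -> 128 GB
```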

 

How would you go about this build? I'm mostly concerned with the networking, NAS, and surveillance aspects of the build, as well as a host OS for it all. If it's easier, the entire workstation aspect of the build can be physically separate from the NAS, networking, and surveillance side, but I would like to cap it at a maximum of 2 machines. 10 GbE networking is a must, the NAS must be fast enough to keep up with the 10 GbE network, and the surveillance side must be able to run redundantly 24/7 and store 4K30fps footage streams from 6-10 cameras at once. I may just be insane for suggesting such a large budget, but reliability and redundancy are an absolute must.
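To put the surveillance requirement in rough numbers, here's a back-of-the-envelope storage estimate (the ~20 Mbps per-camera bitrate is my assumption for H.265 4K30 NVR footage, not a figure anyone here has confirmed; real bitrates vary widely):

```python
# Rough storage estimate for 6-10 cameras recording 4K30 around the clock.
# Assumed bitrate: ~20 Mbps per camera (H.265; actual NVR bitrates vary).

MBPS_PER_CAMERA = 20
SECONDS_PER_DAY = 24 * 60 * 60

def daily_storage_tb(cameras: int, mbps: float = MBPS_PER_CAMERA) -> float:
    """Terabytes written per day by `cameras` continuous streams."""
    bits_per_day = cameras * mbps * 1e6 * SECONDS_PER_DAY
    return bits_per_day / 8 / 1e12  # bits -> bytes -> TB

for n in (6, 10):
    print(f"{n} cameras: {daily_storage_tb(n):.2f} TB/day, "
          f"{daily_storage_tb(n) * 30:.0f} TB per 30 days")
```

At 10 cameras that's on the order of 2 TB per day, so the retention window drives the array size far more than the camera count does.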


I'd really suggest having separate systems. It will probably be cheaper, and much easier to set up and configure.

 

Have you done this before? What hypervisor do you plan on using?

 

How would the users connect to the system? Will they directly connect or use some sort of remote desktop connection?

 

There are some issues here, like how you want 5 GPUs and 4x 10G ports; there is no board that can support that.

 

If you want reliability, I'd separate the roles. Putting everything in one box means that if anything goes wrong, or an update needs a reboot, everything goes down with it.

 

What switches do you plan on using for networking? Which 10 GbE standard? There are many of them. Also, why 4x 10G ports? Just get a single 40 or 100 GbE port.


Spending $10-15k on a server for a few people to use as workstations is really not a good idea. As @Electronics Wizardy said, it will simply cause more problems than it solves. Unless you have real enterprise-grade experience in doing this, you are set to waste a whole lot of time and money. Just build 4x workstations, one for each person, with their own 2080 Tis and 3900X CPUs (or whatever the next-gen equivalent is), and then buy an actual server to do the server stuff. You will save a ton of money.

 

(Also, just generally pointing out that spending $15k on computer gear isn't a good long-term financial plan; it would be preferable to build value-optimized systems that meet your needs rather than going all out on something that's not required. For example, you can use Ryzen 5 3600 CPUs and RTX 2070S GPUs and use the other stations as render nodes, since everyone won't be using them at the same time. Put your extra $10k into an IRA and watch it grow to a few hundred thousand to retire on.)


@BlueScope819 so I can see your post

#MuricaParrotGang

My name is Legion 'Murica Parrot Gang, for we are many.

If a design is taking too long, the design is wrong, and therefore, the design must be modified to accelerate progress. -Elon

Mentioned in 8/5/2020 TechLinked

Posted · Original PosterOP

@Electronics Wizardy

Have you done this before? What hypervisor do you plan on using?

The closest I've done to something like this is "2 computers, 1 desktop" with GPU passthrough on Ubuntu; if there is something better for this scenario, please let me know.

How would the users connect to the system? Will they directly connect or use some sort of remote desktop connection?

The only way I personally know of is hardwired, running cables to each room. But if there's some super-low-latency way to do this wirelessly, I'm always open to ideas.

There are some issues here, like how you want 5 gpus and 4 10g ports, and there is no board that can support that either.

I understand the issue with the 4x 10 GbE ports, but what's stopping me from using 5 GPUs? Is there truly no board that supports more than 4 video cards?

If you want reliable, id seprate the roles, this is making it so if anything goes wrong, or there are updates that need a reboot, everything goes down with it.

Understood, I plan to rack-mount this build in a server room anyway.

What switches do you plan on using for networking? What 10gbe standard, there are many of them. Also why 4 10g ports, just get a single 40 or 100gbe port

I've had no real experience with networking; what's the fastest consumer-available standard? It would just need to play nicely with the NAS. If a single 40 GbE port would suffice, I'm all for it.


Press the quote button next to our messages so that we can see them. I just happened to be typing my reply at the time so I saw it. It's the left arrow second from the left at the bottom of our message.

 

1 minute ago, xannax159 said:

Understood, I plan to rack-mount this build in a server room anyway.

That doesn't solve the problem of reliability nor the massive f***ing software headache this is going to cause.

2 minutes ago, xannax159 said:

I've had no real experience with networking; what's the fastest consumer-available standard? It would just need to play nicely with the NAS. If a single 40 GbE port would suffice, I'm all for it.

If everything is in one box, why do you need 40 gigabit to the rest of the stuff? That's pointless: you would be setting up a network share, and it's not like your use case needs 4 simultaneous 10-gigabit streams of data. Not to mention you will need a 10-gigabit switch, which will cost you hundreds. This entire build is just a really, really bad idea.




A separate NAS/video recording solution would indeed be the best choice. Consider a multi-core fibre-optic backbone for your network.

 

For the OS, the NAS should be either Linux or BSD; the workstation instances can run in VMs on Win-10 if you must (personally, Win-7 is the latest I'd go, but then I don't use Windows anyway, being a Linux user :P )


"You don't need eyes to see, you need vision"

 

(Faithless, 'Reverence' from the 1996 Reverence album)

Posted · Original PosterOP

This build isn't my decision; it's the homeowner's decision. I'm merely the person who's going to be physically assembling it and working out the software kinks. They want the factor of it all being one machine. I personally understand it would be easier, cheaper, and wiser to build individual workstations, but in the end I'm merely recommending; the final decision isn't mine. @BlueScope819

Posted · Original PosterOP

What kind of hardware would you recommend for a NAS/Video recording setup with 6-10 4k30fps streams?


Whoever the homeowner is clearly doesn't know what they are talking about. You would have to do something on this scale: running optical Thunderbolt 3 cables through the walls to get signal everywhere, USB over fiber optic with custom runs and wall plates, and a 10-gig internal network.

And generally it would be an incredibly bad idea, all things considered. I have no idea what this "homeowner" thinks they know about computers, but by doing it this way you are, in no particular order, going to need to:

  • Spend thousands of dollars on server-grade hardware (as a rough estimate, what you specced out would be at least $15,000)
  • Replace internal wiring in the house to do proper signal runs (at least $8,000 if you want to do it right, because you need TB3 over fiber optic AND USB over fiber optic, which costs a shit ton of money)
  • Buy all-new hardware for your entire networking setup ($2,000)

All in all, it's just a really, really bad idea, and it will cost about double your estimate because of the above.

 

You can solve this entire problem by just building 4 value workstations + a server for ~$1,000 each and save literally tens of thousands of dollars.



6 minutes ago, xannax159 said:

What kind of hardware would you recommend for a NAS/Video recording setup with 6-10 4k30fps streams?

Oh, I don't know, something like this:

There is your server, which has so many goddamn zeros after the number that I don't even know.




@xannax159 FreeNAS is a solid choice, and Proxmox too. They have minimum and recommended hardware setups listed on their sites; check those.

 

I'd also point out to the building owners that although something may be technically feasible, it may not be wise to do: it's technically perfectly feasible to jump off the Empire State Building, but it's not recommended that you do :P Now that I've seen the previous posts I'd missed before, recommending a different way of "skinning the cat" (that's proverbial, before any of you activists get all loopy on me!) is indeed the best way forward, IMO.



30 minutes ago, xannax159 said:

What kind of hardware would you recommend for a NAS/Video recording setup with 6-10 4k30fps streams?

What bitrate? What codec?

 

What are you capturing from?

 

35 minutes ago, xannax159 said:

This build isn't my decision; it's the homeowner's decision. I'm merely the person who's going to be physically assembling it and working out the software kinks. They want the factor of it all being one machine. I personally understand it would be easier, cheaper, and wiser to build individual workstations, but in the end I'm merely recommending; the final decision isn't mine. @BlueScope819

Really try to tell them this will be a bad experience, with lots of issues. Just get separate workstations for the rooms, and a server for the main area.

 

43 minutes ago, xannax159 said:

I've had no real experience with networking; what's the fastest consumer-available standard? It would just need to play nicely with the NAS. If a single 40 GbE port would suffice, I'm all for it.

You need to make a plan for the network, or get someone in who knows what they're doing. You don't want to connect directly to the NAS; you want a switch, and that switch depends a lot on the rest of the network.
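As a rough illustration of what "fast enough for the 10 GbE network" means on the disk side (the ~180 MB/s per-drive number is an assumed sequential speed for a 7200 rpm HDD, not a measured figure):

```python
# How many spinning disks does it take to keep a 10 GbE link busy?
# Assumption: ~180 MB/s sustained sequential throughput per 7200 rpm HDD.

import math

LINK_GBPS = 10
HDD_MBPS = 180  # megabytes/s per drive, assumed

link_mbytes = LINK_GBPS * 1000 / 8          # 10 Gb/s -> 1250 MB/s
drives = math.ceil(link_mbytes / HDD_MBPS)  # striped, ignoring parity overhead
print(f"~{link_mbytes:.0f} MB/s on the wire -> at least {drives} drives striped")
# -> at least 7 drives
```

In practice parity (RAID-Z/RAID 6) and random-access patterns push the real drive count higher, or you cache in front with SSDs.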

 

44 minutes ago, xannax159 said:

I understand the issue with the 4x 10 GbE ports, but what's stopping me from using 5 GPUs? Is there truly no board that supports more than 4 video cards?

Threadripper is made mostly for desktop ATX boards, so you're limited to 7 slots. With dual-width GPUs, that limits you to 4 cards, with one hanging over at the bottom, and nothing else.

 

You can get servers that will fit 10 GPUs, but that is way out of budget, and they don't use Threadripper (Xeon normally, some Epyc parts). They are also extremely loud, often need 240V power, and have other annoyances.

45 minutes ago, xannax159 said:

The only way I personally know of is hardwired, running cables to each room. But if there's some super-low-latency way to do this wirelessly, I'm always open to ideas.

Do you want them to connect with an IP-based solution, or some other wired solution like HDMI or Thunderbolt? IP-based solutions are the most flexible, but give a worse experience.
