A Read-Only Server for 500 people

That's a lot of money to work with for what you want to do.

 

Personally I'd run it on a Linux distro, use 2TB or 4TB HDDs, maybe put them in a RAID 1 setup.  If ya want to get crazy you could do 2x 1TB SSDs for the primary data and then back that up or copy it over to HDDs.  You also have the option of an SSD cache in front of HDDs, so hot files get served from SSD while the bulk of the data lives on HDD, if you want to save on cost and not go pure SSD.
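
If you go the RAID 1 route with software RAID, a minimal sketch with mdadm would look like this; device names and mount point are just examples, not anything from your build:

```
# Build a RAID 1 mirror from two disks (example device names)
mdadm --create /dev/md0 --level=1 --raid-devices=2 /dev/sdb /dev/sdc
mkfs.ext4 /dev/md0                                # format the array
mkdir -p /srv/media && mount /dev/md0 /srv/media  # example mount point
mdadm --detail --scan >> /etc/mdadm/mdadm.conf    # persist across reboots
```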

 

Depending on how they're accessing the stream, I'd do a CIFS share with a Samba server; that lets you regulate users/access completely and specify which files/locations you want them to access.
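
For reference, a read-only share like that is only a few lines of smb.conf; the share name, path, and group below are just example names:

```
# /etc/samba/smb.conf -- example read-only share
[media]
    path = /srv/media
    read only = yes          # nobody can write through the share
    valid users = @viewers   # only members of the 'viewers' group get in
    guest ok = no
```

Each viewer still needs a Samba credential (smbpasswd -a <user>) on top of their Unix account.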

 

The thing you'll need to focus on primarily is networking, since you'll want it to be pretty quick.  A 1G NIC should be fine; no need to get into bonding if you're not worried about reliability (if it breaks, your viewers will just have to wait till a new one comes in).  Remember networking isn't limited to just the system; it depends on the type of router and modem you have as well, so you'll need a 1G connection there too.
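
Worth a quick sanity check that the NIC actually negotiated gigabit; the interface name here is just an example:

```
ethtool eth0 | grep -i speed   # should report "Speed: 1000Mb/s"
```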

 

For memory, 16GB I'd think would be plenty; create a swap partition of 4GB as well.
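
If you'd rather not carve out a partition, a 4GB swap file does the same job; a minimal sketch, assuming an ext4 root filesystem:

```
fallocate -l 4G /swapfile                          # reserve 4GB
chmod 600 /swapfile                                # swap must not be world-readable
mkswap /swapfile
swapon /swapfile
echo '/swapfile none swap sw 0 0' >> /etc/fstab    # persist across reboots
```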

 

And a quad-core processor as well, preferably Intel, since I'm an Intel fanboy.

 

EDIT: Oh yeah, also use LVM so you can expand storage easily.
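
To illustrate why LVM helps: expanding later is just adding a disk to the volume group and growing the logical volume; the device and volume names below are example names:

```
pvcreate /dev/sdd                                  # prep the new disk
vgextend vg_media /dev/sdd                         # add it to the volume group
lvextend -r -l +100%FREE /dev/vg_media/lv_media    # grow the LV and its filesystem
```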

12 minutes ago, yarn said:


Thanks a lot. 
The network part looks promising, so I'm not worried about that; it was only the server configuration that worried me. I'll probably get back to you with questions after I've looked into what you've suggested and done a bit more reading. Thanks again.


3 hours ago, leadeater said:

Anything web-server-protocol based will hammer CPU usage; video streaming servers heavily rely on multicast and client-driven access. For example, Flash video players and native HTML5 video players use much less server CPU than other types of access methods. Just something to keep in mind.

I can't really agree 100% on this. Streaming server software rarely relies on multicast, because it's usually aimed at WAN streaming, which doesn't allow multicast. SMB and HTTP(S) file distribution, depending on what software is used (Samba, NGINX, Apache), won't even use that much CPU and can support several hundred accesses. Also, to take advantage of multicast, two users must request the same video at the same time, which is really rare. Where multicast is heavily used is live streaming: e.g. streaming thousands of security cameras to multiple servers (for redundancy), with those servers streaming hundreds of streams back to multiple clients (viewers).

In the multiple solutions I've built for file and video distribution, the big problem was often drive access speed (latency & bandwidth) and network speed. If you are planning to stream on a local network only, I would go the route of a simple nginx server so you can distribute any kind of file through download, and I would also look at link aggregation, because 1Gbps won't serve more than 500 streams @ 2Mbps (500 × 2Mbps is already 1Gbps, before even counting TCP overhead).

Since your case might require high IOPS (all the users might not be accessing the same file), I would take some SSD storage. HDDs could do the trick if you took 2x 2TB Reds in RAID 1, but if you have a lot of random access you might have difficulties. A solution could be an SSD cache (256GB) to take some of the multi-access stress off the mechanical drives. As for CPU, I would pick something in the Xeon family with 4 cores / 8 threads, like the E3-1270 v5. Powerful but budget chip.
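
To make the nginx suggestion concrete, here's a minimal sketch of a read-only download vhost; the path is just an example, and sendfile is what keeps the CPU out of the copy path:

```
# /etc/nginx/conf.d/media.conf -- minimal read-only file server (example path)
server {
    listen 80;
    root /srv/media;

    sendfile on;       # kernel copies file -> socket directly, very low CPU
    tcp_nopush on;     # fill full packets before sending

    location / {
        autoindex on;  # let viewers browse and download anything under root
    }
}
```

And for the link aggregation side, a two-port LACP bond with iproute2 looks roughly like this (interface names and address are examples, and the switch has to speak LACP):

```
ip link add bond0 type bond mode 802.3ad          # LACP bond
ip link set eth0 down && ip link set eth0 master bond0
ip link set eth1 down && ip link set eth1 master bond0
ip link set bond0 up
ip addr add 192.168.1.10/24 dev bond0             # example address
```

Keep in mind LACP balances per flow, so any single viewer still tops out at 1Gbps; it's the aggregate across all 500 that improves.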

If you want to primarily do video streaming, I would recommend you go look at Nimble Streamer; this software is really awesome, but it has a small monthly cost. You can do HLS and progressive-download streaming, and CPU usage is minimal with it: http://blog.wmspanel.com/2014/08/utilize-all-bandwidth-with-nimble-streamer.html. I had a server hit 100 streams on a single-core VM and usage was 8% @ 3.6GHz. You can also run NGINX on the side for your other types of files.


2 minutes ago, CatBoiler said:


Thanks for the info on the streaming side; yeah, I was mostly thinking of live streaming, since that's what we do for our lecture video platform. You can also access its archives, but as you said that likely isn't multicast. It's not a system I manage, so guesswork was involved.

 

Don't exactly agree on the SMB one though. We have an entire NetApp 8060 (2x E5-2658) head dedicated to that per campus, and CPU utilization is over 50% during the day and 100% during backups (ignore backups though, they don't represent normal file usage at all). SMB is rather a pig of a protocol, so unless it's going to be used I'd just write this off as not that useful information.

 

Anyway, I knew you were the perfect person to ask. Excellent information :).


2 hours ago, CatBoiler said:


Thanks a lot. That's everything I could ask for.

Could you explain a bit why RAID 1? I thought mirroring was only needed for data security, and in this case, since there are no writes to the server involved, I don't understand why RAID 1.


5 hours ago, esrootes said:


RAID 1 gives you read improvements just like RAID 0 does, assuming the RAID controller or software implementation supports it: each disk holds a full copy of the data, so independent reads can be served from both disks in parallel, roughly doubling read throughput. The redundancy is a bonus on top of that; the mirror also keeps the server up if a drive dies.



7 hours ago, leadeater said:


Might want to put an RDMA/iWARP card into that SMB share/cluster.  CPU usage will be near non-existent if it's SMB 3.0 with SMB Direct; I had to intervene on a project recently where someone tried to implement an enormous SMB platform without RDMA.  Even with the best intentions, you want to offload as much as possible off the CPU where you can.

 

Pick up a cheap Mellanox or Chelsio card that has support off the ole eBay.  Obviously Windows Server 2012 R2 and above.

Not quite on topic but hopefully the info will be useful :)



Just now, Falconevo said:


The NetApp 8060 does support that; the general PCs and laptops that use it don't, and won't for a long time. And we aren't going to go out and buy 6,000 RDMA cards for the desktops :P.  But yeah, RDMA is awesome; still a little annoyed my X540s at home don't support it.

 

Everything else is either NFS or iSCSI (VMware and SQL), which also run on their own heads.


20 hours ago, leadeater said:

Actually, I know the perfect person to bring into this conversation; not sure if he will join in, but worth a shot. @CatBoiler

Ay, this is my profession, yet nobody ever paged me...



4 minutes ago, tt2468 said:


Pff, I didn't know; can't blame ignorance... too much.

