
Bought 3 dirt cheap servers which I want to meld into 1 system. Is daisy chaining just a myth, or can I actually have 3 machines (or more) become 1?

OK, so first off, don't panic: I haven't actually got 3 working servers just yet. The other 2 are spare parts which I could turn into working machines for a fairly small investment. (Why are VRMs more expensive than processors?!)

 

Oh, and a disclaimer: while I'm a bit brainy and a swift learner, I don't know a whole lot about networking, and Linux has so far confused me with its sudo nonsense! So speak in clear concepts I can research rather than technical jargon if you want me to not cry.

 

As the title says, I want to be able to turn 3 servers into 1 machine. If that isn't going to happen then stop reading here and call me an idiot for buying 100kg of stuff for no reason!

 

Primary plan is to utilise the 16 drive bays in 1 server for a NAS, and allocate cores to VMs I can access through otherwise utterly useless tablet devices that can just about run a browser and cost somewhere approaching zero. These get dished out amongst different rooms to provide chunkier processing capabilities without having to have an actual computer in each room, or having to buy expensive tablets. Y'know, like the starship Enterprise has one central computer doing all the shazam. Looking forward to Threadripper/Epyc getting cheap as anything in a few years' time! If the servers can be chained, I'd obviously end up with 48 bays for drives, which I'm sure RAIDZ would be able to find a use for when it's bored.

 

Secondary plan is to get the other 2 systems up and running and meld them all into one badass machine which can be used as and when I feel like it for much more demanding tasks: say, video editing or encoding or other such nonsense. Yes, I'll probably have to slap some graphics capability in them, but 680s seem to end up in the trash these days anyway.

 

Tertiary plan is to spend leftover crunching power doing something Bitcoin-shaped (if it's even worth doing any more), or perhaps use it as a remote powerhouse that I can flog processing time on / donate to nuclear physics laboratories. Or, slightly more sensibly, give me the ability to continue composing/producing music over the internet, because I'm not spending £3k on a laptop which is actually up to the task. If I can't access my network (and thus my VMs) from the internet, then what the bloody hell is the point of the internet anyway?!

 

Machines are (drum roll) ProLiant ML370 G5s.

1. 2x X5355s with 44GB (DDR2) and many, many HDDs

2. 1x X5459 with 16GB (DDR2) and zero HDDs

3. A motherboard. And a case that weighs about 40kg.

4. £200

 

Now, assuming what I want to do is at all feasible....

 

Questions!

1. Will I need to use the same processor in each machine?

One assumes that matched processors are required in a 2-socket board (yes, noob, I know, it's savage guesswork!), so would that mean I'd need all the processors across the machines to be the same?

 

2. Will Unraid or FreeNAS fit the bill?

I'm quietly confident that some form of Ubuntu would manage the task, but that would mean I'd need to know things and stuff, and that's not currently the case.

 

3. Can you convince Linus et al to do a video on it?

Seriously, if you search YouTube for 'daisy chain server' it's pretty much a googlewhack, picking up nothing but noise and some woman getting series and parallel electrical circuits a bit wrong. He'd get a million hits anyway: the stuff costs sod all, and it'd help usher in the future of houses which actually come with a computer installed in them. I bet even Bill Gates would watch it!

 

4. Using Unraid/FreeNAS, can the VMs actually access the NAS?!

I know it seems like a stupid question, but honestly when I think about it the VMs would need to talk to the NAS as a network location, right? Would that make things get all confused?

 

6 (sod 5, that's boring). If there are many ways to daisy chain and it's not a myth, what's the best way?

I can imagine a PCIe card that just connects the systems together on a primal level, but if it can only be achieved over Ethernet cable, is 1 Gbit/s actually swift enough?

 

G. Is there a resource (video channel or some such) which speaks Linux/Ubuntu but also translates into people?

Level 1 Tech is great and they do seem the most likely candidate, but I can't find anything which'd give me an idea of what is and isn't possible.

 

8. If Ethernet is the way forth...

Is it better to hook each machine to a network switch with 2x Gbit/s links, rather than chaining one into another into another and then running 2 feeds to the main router from the first and last machine?

 

9. Can I add extra machines willy nilly?

'cause if I'm picking up more servers for dirt cheap, decide 24 cores is getting dull and boring, and I've discovered some way to actually make money out of it, then I might want to waste even more time making a properly mental supercomputer which could one day take over the world! Yeah, OK, it's unlikely, but a megalomaniac can dream!

 

10. The servers have iLO. Integrated Lights Out.

What's that then? It seems to be a remote console thingy jobby but it doesn't work in Win10 and I have no idea what I'm doing with WinServer '08!

 

Anyway, thank you very much for reading this internal monologue of mine and I do hope that at the very least my enthusiasm has brightened your day!

 

Jon

 

PS: if this posts multiple times, it's because my service provider is having trouble finding their own stable connection, let alone mine!


Well, if you are wondering: no, you can't just hook up a cable to each one in order to daisy chain them. Computers don't work that way. You can, however, run a Linux distro on each one that allows clustering, but it would probably not be what you are looking for.



Cheers buddy!

I'll have a google to find out how clustering works!


Daisy chaining is not possible.

 

The servers are crap: low performance and lots of power consumption.

 

The 2x X5355s' combined performance is maybe about 80% of what you'd get with a single Ryzen 5 1400 processor (4 cores / 8 threads), and the difference in power consumption is huge (2x 120W TDP CPUs vs 1x 65W TDP).

Those 44GB (though I suspect it's 48GB) of memory would also chew power: I figure there are 8 sticks at ~3W per stick, or about 25 watts just for the memory. 2x 8GB DDR4 would use less than 5W.

And Ryzen supports ECC memory as well, so there goes pretty much the only "pro" for servers.

Basically, you'd recover the difference in cost between such an old server and a Ryzen build in about 2-3 years just from the electricity savings, if you run the system 24/7.
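If you want to sanity-check that claim yourself, here's a rough back-of-the-envelope sketch in Python. The wattages and the ~£0.15/kWh electricity price are just placeholder assumptions, not measured figures:

```python
# Back-of-the-envelope payback estimate for "old server vs new Ryzen build".
# All numbers are assumptions: wall-power draws are guesses and 0.15 GBP/kWh
# is just a placeholder electricity price.

HOURS_PER_YEAR = 24 * 365
PRICE_PER_KWH = 0.15          # GBP, assumed

old_server_watts = 250        # assumed average draw of the dual-X5355 ML370
ryzen_build_watts = 80        # assumed average draw of a modest Ryzen build

def yearly_cost(watts):
    """Electricity cost per year for a machine running 24/7."""
    kwh_per_year = watts / 1000 * HOURS_PER_YEAR
    return kwh_per_year * PRICE_PER_KWH

savings = yearly_cost(old_server_watts) - yearly_cost(ryzen_build_watts)
print(f"Old server:  ~{yearly_cost(old_server_watts):.0f} GBP/year")
print(f"Ryzen build: ~{yearly_cost(ryzen_build_watts):.0f} GBP/year")
print(f"Savings:     ~{savings:.0f} GBP/year")  # roughly 220 GBP/year here
```

Plug in your own tariff and measured draw and the payback period falls out directly.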

 

The servers would be crap for mining; a single RX 470 or something equivalent would be several times faster and use far less power than those systems.

 

Pretty much the only thing I could think of using that dual-CPU server for would be to render 4K H.264/HEVC videos overnight, because the amount of memory would help and because video encoders can use lots of CPU cores (even weak ones) to encode in parallel.

 

Oh there's one other thing they'd be good for... to heat your room. But it's summer now.

 

If you want a NAS, then consider keeping the case (if it has those cool trays for up to 16 HDDs) and think about how you'd install an ATX or mATX board in there with a regular Ryzen to save money on electricity. A Ryzen board gives you 6 SATA 3 ports from the start, and you can always buy an 8-16 port HBA controller to add support for more drives (unless the case uses SAS connectors or some special connectors for those hard drive bays).

 

 

 

 


Yeah, I've been using one as a DAW for a few years now. Keeps my legs warm!


Look up farming


Righto, add that to the list. Unless you mean agriculture and are just making fun of me, in which case I'll just be sad.


So, will Xenserver work? It does seem (on initial research) to offer clustering capability and then run VMs? Though controlling a machine from the web would be a different story.


10 minutes ago, i_am_kingjonny said:

So, will Xenserver work? It does seem (on initial research) to offer clustering capability and then run VMs? Though controlling a machine from the web would be a different story.

I'd personally use oVirt, but XenServer, Proxmox, and ESXi will also work.


If you're in Australia, ProLiant G6 servers go for about $400 AUD / $300 USD, which would out-gun those two together. Dell servers there are pretty damn expensive for some reason, even though they're 1:1 in specs (R710, R610, etc.), lol.

 

Anywho, GlusterFS is the thing if you want to cluster them for storage. It's not too bad, worth a look (plus what was already mentioned).

 

As for virtualization... you don't really "cluster" in the way I think you're thinking. When XenServer/ESXi/Proxmox cluster, they're just distributing virtual machines across the hosts to balance the load; you're not combining horsepower. Meaning one virtual machine will run on one host, then you add a second VM and it'll go to the other host. The VMs won't get any better performance just because the hosts are clustered. To get a performance boost from multiple machines you need speciality software built around a single task, and render farms certainly make use of this (see the sketch below).
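To make the "speciality software" point concrete, here's a minimal toy sketch (hypothetical host names, nothing real) of what a render farm coordinator effectively does: it carves one job into chunks and hands each chunk to a different machine, rather than somehow gluing the machines into one big CPU:

```python
# Toy illustration of "distributing one task" vs "combining horsepower":
# a coordinator splits a render job into contiguous frame ranges and assigns
# one range per host. Host names are made up for the example.

def split_frames(frames, hosts):
    """Split a list of frame numbers into one contiguous chunk per host."""
    chunk = -(-len(frames) // len(hosts))   # ceiling division
    return {host: frames[i * chunk:(i + 1) * chunk]
            for i, host in enumerate(hosts)}

frames = list(range(1, 1201))               # 1200 frames to render
hosts = ["ml370-1", "ml370-2", "ml370-3"]   # hypothetical cluster nodes

for host, work in split_frames(frames, hosts).items():
    print(f"{host}: render frames {work[0]}-{work[-1]} ({len(work)} frames)")
```

Each box still only works at its own speed; the win comes purely from the job being divisible.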

 

 


Thanks for the reply bud. UK based.

Storage was my primary aim, so I'm glad I can at least form them into something useful. It's a shame I can't build them into a single powerhouse, but at £200 it's an inexpensive hobby for me to fiddle with. And for the sake of the 5 array card batteries (£40 each), 16 hard disks (£20-£30 each), 60 gigs of RAM (probably £50 total), processor upgrades (£65) for the audio workstation and other bits and bobs, I don't feel like I've made a mistake buying them. :)

 


On the subject of Bitcoin and getting the machines to do something more worthwhile: as you can imagine, I've discovered my HD 7770 is by itself far better at the task than all 8 CPUs (I already owned one of these for audio processing) would be, plus a random GT9500 I had lying around. So, given the boards have 8 slots each, I'm now looking at what I can shove in them to generate a profit.

The dual (redundant) power supplies are 850W (1000W peak), so they could probably take a good number of cards each, so long as the cards draw power only from the slot. (I had to use a Molex adapter to get the 7770 to work, which was a massive faff.) Right now I'm looking at cheap 7750s on eBay.

But there are these other peripherals (can't recall their designation) designed to work specifically on blocks, and they're all USB? OK, so if I'm feeling rich and it's financially viable I can populate the slots with USB hubs and have a cat-o'-nine-tails of USB leads coming out the back of each machine, but are there any slot-based solutions for Bitcoin other than Radeon cards? For the life of me I can't seem to get Google to show me any, if they do exist!


Bitcoin isn't worth mining with CPUs or video cards. It's custom chip (ASIC) territory these days (those boxes that plug into USB ports or whatever).

 

Other currencies are designed to work better on video cards and are hard to get working on custom chips, so that's where you should focus.

But cards like the 77xx would have poor value (profit per watt used and per slot taken up in a system). People these days buy the RX 470/480 and RX 570/580, as those seem to be the sweet spot (best performance for the price of the card). With such cards, for some currencies, it seems like you could make up to $5-8 a day mining, but it would also cost you $1-2 a day in electricity (about 150 watts, or roughly 3.6 kWh a day).
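For what it's worth, the arithmetic behind those numbers looks roughly like this (the electricity price and daily revenue are assumptions; real figures move around constantly):

```python
# Rough mining economics for a single ~150W card, using assumed numbers.

card_watts = 150                 # typical RX 470/570-class draw under load
price_per_kwh = 0.30             # USD, assumed electricity price
revenue_per_day = (5.0, 8.0)     # USD, the optimistic range quoted above

kwh_per_day = card_watts / 1000 * 24          # ~3.6 kWh/day
power_cost = kwh_per_day * price_per_kwh      # ~1.1 USD/day

low, high = (r - power_cost for r in revenue_per_day)
print(f"Electricity: ~{power_cost:.2f} USD/day")
print(f"Profit:      ~{low:.2f} to ~{high:.2f} USD/day")
# ...and that's before whatever the host machine itself draws just idling,
# which on an old server is not trivial.
```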

Keep in mind that the cards would produce heat 24/7, and so will the power supplies, so you're basically turning your PC into a heater. In the UK it maybe wouldn't matter, since it's cloudy and colder and you could just open the windows, but in warmer countries you'd also pay more for air conditioning to keep your house cool, and that would eat into your profits too.

 

Like I said, those servers have poor power consumption. From the start, the systems probably idle at around 50-100 watts, and you're trying to mine with a card that probably has lower power consumption than the server itself... it doesn't make sense.


Thanks mate!

I think it's actually closer to 250W at idle!

I'll take your greater experience on board, though, and come back again in the distant future when I've found some use for them other than NAS.

 

At least in the winter I can save on central heating! :D


What you're looking for is clustering, as @tt2468 said, and one thing to look into for fun would be virtualization clustering, such as Microsoft's Hyper-V and VMware's ESXi. You have virtual machines shared across the servers, meaning that if one server goes down, the cluster will automatically know and start that virtual machine on another server (known as migration).

"45 ACP because shooting twice is silly!"

