Help me back up my servers :D

Hello everyone. I'm managing a network of about 20 PCs in a small office. We use Windows SBS 2011 (soon to migrate to Server 2016 Standard) and currently don't have much of a backup system beyond a daily bare-metal backup of the OS drive. Our data drives are mirrored, but that's about it.

 

With that said, I would like to build a server backup solution that provides similar functionality to what Linus described in this video. In particular, I would like to accomplish two things:

1. Real-time backup of the data drives, configured such that if a file is deleted, it is still available in a recycle bin on the backup server.

2. Daily/weekly system images so that I can restore the server overnight if it goes down.

 

I'm a bit confused about how to approach this from a hardware and software standpoint. We don't have a ton of data (<10 TB), but we live in a small community with slow internet and, as a result, use bonded LTE (with data caps), making cloud storage less than ideal. Any help is appreciated!

 

15 minutes ago, stevosimpson said:

Hello everyone. I'm managing a network of about 20 PCs in a small office. We use Windows SBS 2011 (soon to migrate to Server 2016 Standard) and currently don't have much of a backup system beyond a daily bare-metal backup of the OS drive. Our data drives are mirrored, but that's about it.

 

With that said, I would like to build a server backup solution that provides similar functionality to what Linus described in this video. In particular, I would like to accomplish two things:

1. Real-time backup of the data drives, configured such that if a file is deleted, it is still available in a recycle bin on the backup server.

2. Daily/weekly system images so that I can restore the server overnight if it goes down.

 

I'm a bit confused about how to approach this from a hardware and software standpoint. We don't have a ton of data (<10 TB), but we live in a small community with slow internet and, as a result, use bonded LTE (with data caps), making cloud storage less than ideal. Any help is appreciated!

 

What is your background? Managing backups is pretty basic stuff, covered in almost all IT-related degrees.

 

Honestly, I would just run a weekly full backup off-peak, with incremental backups performed daily. If you want to keep previous versions of files, you will want to enable shadow copies. Real-time backup might not be ideal, though. Use a RAID solution to keep your data safe, and make sure to replace failed drives ASAP.
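The weekly-full-plus-daily-incremental rotation above can be sketched with GNU tar's snapshot-file support. This is just a sketch: the `DATA` and `DEST` paths are placeholders, and on a Windows server you'd do the equivalent with Windows Server Backup or a commercial agent instead.

```shell
#!/bin/sh
# Sketch of a weekly full + daily incremental rotation using GNU tar.
# DATA and DEST are placeholder paths -- adjust for your environment.
DATA=${DATA:-/srv/data}
DEST=${DEST:-/backup}

run_backup() {
    snap="$DEST/state.snar"   # tar's record of what the last backup contained
    if [ "$(date +%u)" -eq 7 ]; then
        rm -f "$snap"         # discarding the state file forces a full backup
        tar --listed-incremental="$snap" -czf "$DEST/full-$(date +%F).tar.gz" -C "$DATA" .
    else
        # only files changed since the last run end up in this archive
        tar --listed-incremental="$snap" -czf "$DEST/incr-$(date +%F).tar.gz" -C "$DATA" .
    fi
}
```

The key mechanic is the `--listed-incremental` state file: while it exists, each run only archives changes since the previous run; deleting it resets the chain and produces a new full.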

3 hours ago, stevosimpson said:

Hello everyone. I'm managing a network of about 20 PCs in a small office. We use Windows SBS 2011 (soon to migrate to Server 2016 Standard) and currently don't have much of a backup system beyond a daily bare-metal backup of the OS drive. Our data drives are mirrored, but that's about it.

 

With that said, I would like to build a server backup solution that provides similar functionality to what Linus described in this video. In particular, I would like to accomplish two things:

1. Real-time backup of the data drives, configured such that if a file is deleted, it is still available in a recycle bin on the backup server.

2. Daily/weekly system images so that I can restore the server overnight if it goes down.

 

I'm a bit confused about how to approach this from a hardware and software standpoint. We don't have a ton of data (<10 TB), but we live in a small community with slow internet and, as a result, use bonded LTE (with data caps), making cloud storage less than ideal. Any help is appreciated!

 

Linus isn't a sysadmin. (and man he could really use one because...yeah)

 

His "in depth" explanation here is really high-level and hardly in depth at all, but to me it doesn't look like a correct solution. Simply synchronizing a file system isn't a backup, and Linus might be susceptible to ransomware with his deployment (as changes appear to be overwritten).

 

You want daily incremental backups (meaning each alteration of existing files is kept as a unique version). You want that sent to a redundant NAS, preferably a ZFS NAS, and you want an offsite backup of that NAS. Tape or Amazon AWS is a good way to go. (Tape? In 2018? Well, tape is still effective and can store a ton of data; there are services where a courier will drop off new tapes and check your old ones into their vault.) You can use ZFS snapshots if you don't have a good way to do incrementals. If you have two or more sites, you can send the snapshots to your other sites. I've used ZFS for 10 years now, and nothing on the market comes close to its efficiency and ease of use.
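For the ZFS route, the snapshot-and-send workflow is short. This sketch only prints the commands it would run rather than executing them; the dataset name (`tank/office-data`), the receiving pool (`backuppool`), and `backup-host` are all invented placeholders.

```shell
#!/bin/sh
# Prints a daily ZFS replication plan; pipe the output to sh (as root) to run it.
# tank/office-data, backuppool, and backup-host are placeholder names.
POOL="tank/office-data"
TODAY=$(date +%F)
PREV=$(date -d yesterday +%F 2>/dev/null || date -v-1d +%F)

SNAP_CMD="zfs snapshot $POOL@$TODAY"
# 'zfs send -i' ships only the blocks that changed between the two snapshots,
# so daily transfers stay small even for a multi-TB dataset.
SEND_CMD="zfs send -i $POOL@$PREV $POOL@$TODAY | ssh backup-host zfs receive backuppool/office-data"

echo "$SNAP_CMD"
echo "$SEND_CMD"
```

Because snapshots on the receiving box are read-only history, ransomware on the file server can't reach back and destroy them, which is the "another level" property discussed later in the thread.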

 

The products you choose will depend on your environment.

 

Bacula is popular but I'm not a huge fan of it.

 

"Only proprietary software vendors want proprietary software." - Dexter's Law

7 hours ago, jde3 said:

His "in depth" explanation here is really high-level and hardly in depth at all, but to me it doesn't look like a correct solution. Simply synchronizing a file system isn't a backup, and Linus might be susceptible to ransomware with his deployment (as changes appear to be overwritten).

Not mentioned in the video is the offsite backup of that data they also have, so it's not AS bad as what is shown in the video. Couldn't help thinking when I was watching it why he wouldn't just use Volume Shadow Copies/Previous Versions instead of that sync thingamajig.

 

7 hours ago, jde3 said:

Linus isn't a sysadmin. (and man he could really use one because...yeah)

Understatement of the year so far xD.

3 hours ago, leadeater said:

Not mentioned in the video is the offsite backup of that data they also have, so it's not AS bad as what is shown in the video. Couldn't help thinking when I was watching it why he wouldn't just use Volume Shadow Copies/Previous Versions instead of that sync thingamajig.

 

Understatement of the year so far xD.

Also, shadow copies tend to be affected by ransomware as well, because the malware will wipe them before encrypting: anything the user can touch is at risk. The backup has to sit at another level.

38 minutes ago, jde3 said:

Yeah, it's copy-in-place. No, no, no. Also, shadow copies tend to be affected by ransomware as well, because the malware will wipe them before encrypting: anything the user can touch is at risk. The backup has to sit at another level.

That doesn't mean you shouldn't use Shadow Copies, only that you shouldn't use them as a backup. Previous Versions has an extremely good user experience: any user can retrieve deleted files or older versions of files from within Explorer on their own device. It's the very use case Linus is using that sync tool for, and the way he does it requires an entire other server or disk volume, which is not exactly space-efficient. Better to use that second server as an actual backup server, decoupled from the live data it's protecting, which is the big issue you pointed out with that sync tool.

 

Edit:

Also, for a network share, a user isn't going to have the required permissions to wipe the shadow copies on the server anyway; those are system-level and protected, so admin permissions are required to wipe them.

12 minutes ago, leadeater said:

That doesn't mean you shouldn't use Shadow Copies

People like them, it's fine, versioning is good. A good file-level incremental will take care of that though.

12 minutes ago, leadeater said:

Also, for a network share, a user isn't going to have the required permissions to wipe the shadow copies on the server anyway.

Yeah, it's just at the level that the users themselves can touch. That would include anything writable on the network; if it's not writable, then no. If you only have one version, however, and that version gets overwritten by the encrypted file, then you're hosed.

"Only proprietary software vendors want proprietary software." - Dexter's Law

3 hours ago, leadeater said:

Couldn't help thinking when I was watching it why he wouldn't just use Volume Shadow Copies/Previous Versions instead of that sync thingamajig.

The only thing I can think of is that a separate server lets you keep working if the system goes down (Linus said change the mount point, but really, something like DFS, which has clustering built in, is a much better idea here).

5 minutes ago, jde3 said:

People like them, it's fine, versioning is good. A good file-level incremental will take care of that though.

And both together is better, because Shadow Copies means self-service, while backups of NAS shares are not. Shadow Copies are also much faster than even disk backups: reverting a 1 TB share back to an hour ago takes less than a minute.

Something else I noticed on a second watch is... he has Chrome and FileZilla installed on his servers.

(I was trying to see if he was actually using Syncthing, but it doesn't look like it)

 

FTP Linus? -- Maybe WinSCP would be better?

Chrome is just kind of wtf.. who is browsing the web on a server? heh.

So, ideally your users do not store data locally; the first step is to change that behavior. It's a nightmare to manage so many different machines.

You could keep a copy of a base image, so if you need to redeploy you can save a bit of time.

 

Windows Backup isn't all that bad, and for a small environment it could be just fine. The Veeam Windows agent works great for free too (though I don't think you can use it commercially). Veritas works well but is a headache to learn. At home I use Veeam; at work I use Veritas. I would love to switch to Veeam at work.

 

There are a few aspects of backup, and you have to sit and think about what you want to do: how long you want to retain backups, and at what intervals. Budget and necessity go hand in hand here. I like to be able to restore from within 3 hours during the day (snapshots/VSS), any weekday within 2 weeks, any month within 2 years, and any year within 5 (we're required to keep data for 5 years). The monthly/yearly backup sets are full, so they're larger. The under-2-weeks backups are mostly incremental data and take up very little space (though I perform a full weekly).
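A retention scheme like that needs a matching expiry job. Here's a hedged sketch: the directory layout and retention ages below are illustrative, and I'd leave it printing matches rather than deleting until the output is verified.

```shell
#!/bin/sh
# List backup archives older than a cutoff; swap -print for -delete only
# once you've verified the output. Directory names below are just examples.
prune_old() {
    # $1 = directory holding one tier of backup sets, $2 = days to keep
    find "$1" -name '*.tar.gz' -mtime +"$2" -print
}

# Example mapping onto the retention scheme above:
#   prune_old /backup/daily   14    # incrementals: 2 weeks
#   prune_old /backup/monthly 730   # monthly fulls: 2 years
#   prune_old /backup/yearly  1825  # yearly fulls: 5 years
```

Keeping each tier in its own directory makes the expiry rules trivial: one `find` per tier instead of parsing file names to work out which set a backup belongs to.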

 

Just a note about VSS/snapshots: despite any shortcomings, giving users the ability to restore something from that morning on their own is a godsend. I've got my backups on tape in case of a fire lol, so as long as you have the data in a third place you'll be fine.

 

What you do will depend on the type of work your company does, and how much you have to spend on storage for the backups.

 

Linus is just syncing data, with a really weird backwards solution for recovering deleted files. I'm not sure how that would handle recreating the same file over and over. Imagine you edit a picture, export it, decide you don't like it, and overwrite it 5 more times before deciding you like it, only to realize there was a layer from the first edit that you could really use... I used to work for an engineering company; this happened a lot. We did hourly snapshots there.

 

 

Here's a quick sketch from Visio of what I'm thinking of doing. Is RAID-Z the appropriate approach? Also note, the daily bare-metal backups of the server OS are already occurring, and the Heroku IoT backup would be an extra thing I tack on.

Net backup.jpg

No real need to put your daily and weekly fulls into separate volumes; the incrementals don't work without the full backup anyway.

Ideally you want your DC to do as little work as possible, avoiding extra roles such as a file server. However, if this is a constraint, then you gotta do what you gotta do.

 

RAID-Z is fine for a backup repository; you always have your live data + monthly to pull from should the array fail. RAID-Z2 etc. just gives you more fault tolerance, but since it's a backup, I honestly don't see the need.

 

If you have enough disks, I would create the data repository on the ZFS backup server instead, and just use a scheduled job to copy data from the data repository to the backup volume. You'd have two arrays (live data + backup data), and then whatever your third solution is for remote backup.
