
Off-site Backup?

JCH

Hello,

 

So I'm hosting some important projects on a dedicated server with a 256GB SSD, but I feel I should have an off-site backup server that remotely backs up all of my files every 24 hours or so. I've built myself another server; however, I'm unsure how to set up this remote backup.


First of all, can your internet connection handle uploading hundreds of GB of data once a day without becoming unusable?


First off, what OSes are your servers running? The answer will vary wildly between OSes. Also, to avoid the problem @Enderman raised, you can run automated backups at ridiculously late hours of the night (personally, my local and remote storage servers sync between 3AM and 6AM).

 

Secondly, how fast is your internet connection? Specifically, what is the upload rate at the location of your dedicated server, and the download rate at the location where the backup server will sit? The slower of the two determines how fast you can transfer data.

 

Thirdly, how much will the data change between backups?

2 hours ago, Enderman said:

First of all, can your internet connection handle uploading hundreds of GB of data once a day without becoming unusable?

If @JCH uses incremental backups (which only upload the changes to files since the last backup), their internet connection is relatively modern (>=30Mb/s symmetrical), and the transfer is set to only occur at night, this shouldn't be nearly as big of an issue as you make it out to be.
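For a rough sense of scale (the delta size here is a made-up figure): at 30Mb/s, about 3.75MB/s, a 5GB nightly delta transfers in roughly 5120MB / 3.75MB/s ≈ 23 minutes. Only the initial full sync would take hours.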

"/usr/local/bin/coffee.sh" Missing – Insert Cup and Press Any Key

Link to comment
Share on other sites

Link to post
Share on other sites

Get something like CrashPlan. It's only a few dollars a month ($6, I think?) for unlimited storage, and by default it backs up every 15 minutes (configurable). It only backs up changed files, of course.

 

You don't have to pay at all, though: you could use the second server as the destination for free.


18 hours ago, Enderman said:

First of all, can your internet connection handle uploading hundreds of GB of data once a day without becoming unusable?

 

15 hours ago, networkArchitect said:

First off, what OSes are your servers running? The answer will vary wildly between OSes. Also, to avoid the problem @Enderman raised, you can run automated backups at ridiculously late hours of the night (personally, my local and remote storage servers sync between 3AM and 6AM).

 

Secondly, how fast is your internet connection? Specifically, what is the upload rate at the location of your dedicated server, and the download rate at the location where the backup server will sit? The slower of the two determines how fast you can transfer data.

 

Thirdly, how much will the data change between backups?

If @JCH uses incremental backups (which only upload the changes to files since the last backup), their internet connection is relatively modern (>=30Mb/s symmetrical), and the transfer is set to only occur at night, this shouldn't be nearly as big of an issue as you make it out to be.

I have housed my servers in a datacenter with at least 1Gb/s down and 500Mb/s up.

 

My main server with all of my projects is running CentOS 7 x64; my backup server, however, can run any OS that I want.

 

I have around 120GB of files.
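If my math is right, even a full 120GB transfer at 500Mb/s up is only 960Gb / 500Mb/s ≈ 1920 seconds, about 32 minutes, so a nightly window shouldn't be a problem.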


42 minutes ago, JCH said:

My main server with all of my projects is running CentOS 7 x64; my backup server, however, can run any OS that I want.

If you're good with writing your own scripts, I'd recommend taking a look at rsync (see its man page); it's a network file transfer tool that works great for backups. It can compress data in transit to minimize the amount you need to send, it transfers incrementally (only copying files that have changed since the last run), and it can authenticate with SSH keys so you don't have to enter a password manually.
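A rough sketch of what that looks like (the user, hostname, and paths here are made-up placeholders, swap in your own):

# One-time setup: create a key and install it on the backup server
# so the transfer can run unattended
ssh-keygen -t ed25519 -f ~/.ssh/backup_key -N ""
ssh-copy-id -i ~/.ssh/backup_key.pub backup@backup.example.com

# -a preserves permissions, timestamps, and symlinks; -z compresses in transit;
# --delete mirrors deletions to the destination; unchanged files are skipped
rsync -az --delete -e "ssh -i ~/.ssh/backup_key" \
    /srv/projects/ backup@backup.example.com:/backups/projects/

Note the trailing slash on /srv/projects/: it copies the directory's contents, rather than creating an extra projects/projects level on the destination.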

 

Personally I have a script that uses rsync to copy between my local and remote servers, which is run by cron every night. 
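The cron side is just one line in the crontab (the time and script path below are placeholders; edit yours with crontab -e):

# Run the backup script at 3:30AM every day and append its output to a log
30 3 * * * /usr/local/bin/backup-projects.sh >> /var/log/backup.log 2>&1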

"/usr/local/bin/coffee.sh" Missing – Insert Cup and Press Any Key

Link to comment
Share on other sites

Link to post
Share on other sites

Install CrashPlan on both.

 

Set up your off-site server as a backup destination.

 

Otherwise, rsync is great if you have a static IP or DynDNS.

 


I back up everything at home to the server AND to an off-site system via CrashPlan. Since I already have a local backup, the off-site server is minimal, with a single large drive.

