
File storage & synchronization between offices

hey folks!

 

parameters:

- office locations across a few provinces (<10 to <50 users per location) 
- gigabit network infrastructure in each office

- local old server (that needs replacing) in each office

- total storage of all locations combined currently <8TB

- all users have office 365 licenses with 1TB onedrive included

- all users log in via active directory domain

- all users run windows 10 pro or 11 pro on their workstations

- internet speeds are limited to 1 gigabit down, some have lower upload limits

- backups are currently done to a cloud service on daily basis

 

problem:
outside of having outdated servers that need replacing sooner rather than later, the biggest issue is frustration caused by poor speed/latency when opening and saving files on the remote drives. each location currently has the local drive from its physical server mounted, plus drives from the other locations mounted for access over the internet. anything accessed locally works without any issues, for obvious reasons. and since autocad files and large design files are used fairly frequently, going purely cloud based was not recommended. there are also a handful of users who work fully remote, so no access to a local server whatsoever.

 

notes:

since storage has become so affordable, i don't see an issue with equipping each new server with enough storage to host all files that are currently split between locations. that would at least eliminate speed issues for everyone and only leave the synchronization and backup to be resolved. the fully remote users are their own issue on top of that. things like resilio, azure and aws were mentioned to me, but what really makes sense is up for debate. a local canadian solution would of course be great for file "safety" but is understandably not a must. i was wondering if, with all servers hosting the same files, maybe the sync and backup solution could exclude a cloud service entirely, but i could be wrong.

 

question:
what's the most cost-effective and simple solution to sync multiple office locations with (local) file storage that is futureproof for the next 5-10 years and doesn't create bottlenecks for the users? and what could be a good choice for local file syncing on the workstations of the fully remote users, where files then get sync'd back to the main location(s) after edits are completed?

 

let me know if you have any questions that could make the above easier to understand. any constructive feedback/suggestions are greatly appreciated!


For local workstation/client sync you can use 'Offline Files' and configure it to force syncing of all files in certain folders on network shares, or the entire share. To actually use the local/cached copy while you are on the network you need to right click the folder and select 'Work Offline'; when you are finished, do the same but change it back to Online and wait for the sync to complete.

 

Issues with the above: potential sync conflicts. If multiple people change the same file offline then the last one to sync is the version you get, and all others are lost. If the file changed on the server since going offline, the Windows Sync utility will ask which version you want to keep. So there's a minor data loss risk and some usability quirks. Offline Files is most commonly used for laptops and 'Home Drives' so people can work remotely and offline while maintaining access to all documents and profile data, if you have redirected these to a network location.
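The decision logic behind that conflict handling can be sketched roughly like this (a simplified model, not the actual Offline Files implementation; `FileState` and the timestamp fields are illustrative assumptions):

```python
from dataclasses import dataclass

@dataclass
class FileState:
    local_mtime: float   # last modified time of the cached copy
    server_mtime: float  # last modified time of the server copy
    synced_mtime: float  # mtime recorded at the last successful sync

def sync_decision(f: FileState) -> str:
    """Decide what a client-side sync should do with one file."""
    local_changed = f.local_mtime > f.synced_mtime
    server_changed = f.server_mtime > f.synced_mtime
    if local_changed and server_changed:
        # both sides moved since the last sync: someone has to choose
        return "conflict: ask user which version to keep"
    if local_changed:
        return "upload local copy"
    if server_changed:
        return "download server copy"
    return "in sync"
```

The "conflict" branch is exactly the case the Sync utility prompts about; the risk grows with the number of people editing the same files offline.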

 

For server-to-server sync you can use DFS-R, which is built in to Windows Server, and you can set up as many copies as you like. This means as long as every server has enough storage for all the data, you can sync the data between all of them. These are asynchronous copies, so again there's potential for sync conflicts if the same file is changed in two different locations at the same time, or left open for a long time by multiple people; then something has to figure out which file version to keep. Often, but not always, open files are locked by the application to prevent this, and some applications are built for multiple people to have a file open and edit it at the same time, i.e. current versions of Microsoft Office.
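A toy model of that "which version to keep" resolution, purely illustrative (real DFS-R uses update sequence numbers and version vectors rather than raw timestamps, and stashes losing versions in a ConflictAndDeleted folder instead of silently dropping them):

```python
def converge(replicas):
    """Simulate asynchronous multi-master replication with
    last-writer-wins: after all servers exchange updates, every
    replica holds the newest version of each file.

    Each replica is a dict of path -> (mtime, content)."""
    merged = {}
    for replica in replicas:
        for path, (mtime, data) in replica.items():
            if path not in merged or mtime > merged[path][0]:
                merged[path] = (mtime, data)
    # every server ends up with an identical copy of the merged set
    return [dict(merged) for _ in replicas]
```

If two offices save the same drawing while disconnected, only the copy with the later timestamp survives in this model, which is why application-level file locking matters so much.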

 

The other Windows-native server-to-server sync option is 'Storage Replica', and this one isn't as "free": from memory it requires Windows Server Datacenter for most practical usage, as Standard Edition is limited to a 2TB volume size as well as a single replicated volume. Storage Replica is however synchronous, so it's not suitable over higher-latency connections; both of these aspects make it unsuitable for your situation, I would say.

 

There are also 3rd-party software options (i.e. Resilio) that will give you something similar to DFS-R and may handle things a little better/differently for your situation. I haven't used many myself, but I know there are several good options.

 

The first scenario, using Offline Files, you can do now at no infrastructure cost with a little bit of configuration and user training. Just make sure you do a good amount of testing so you don't dig yourself into a hole you didn't understand well enough before stepping into it.

 

If money/Opex wasn't an issue then I would say pure cloud storage, i.e. OneDrive/Teams/SharePoint, would be your best bet, as cached files are always used over the remote copy. Other market options like Google Drive, Dropbox etc. offer the same, so explore those too; it may not be as expensive as expected, and it's a whole lot simpler than maintaining on-site servers at each location, with all the hassles you are currently experiencing.


thank you for the feedback and input!

 

having local hardware is indeed an extra headache with setup and maintenance, plus considerable upfront and running costs for each location. maybe cloud based should be reconsidered, just by the sound of it. if offline copies of files can be worked on to avoid the frustration with slow file access speeds, it would work. to sell such an idea a price comparison would probably do it, e.g. physical local servers with 3rd-party synchronization vs. fully cloud based with microsoft collaboration software.
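a rough way to frame that price comparison, where every number below is a made-up placeholder to be swapped for real quotes:

```python
def five_year_tco(upfront_per_site, annual_opex_per_site, sites,
                  per_user_monthly, users, years=5):
    """Compare total cost over a planning horizon:
    on-prem servers (capex + yearly upkeep per site) vs.
    per-user cloud subscriptions. All inputs are hypothetical."""
    on_prem = sites * (upfront_per_site + annual_opex_per_site * years)
    cloud = users * per_user_monthly * 12 * years
    return on_prem, cloud

# e.g. 4 sites at $8k per server plus $2k/yr upkeep,
# vs. 120 users at $12.50/user/month of extra cloud licensing
on_prem_cost, cloud_cost = five_year_tco(8000, 2000, 4, 12.50, 120)
```

a real comparison would also need migration labour, bandwidth upgrades and backup licensing on both sides, but even this skeleton makes the trade-off concrete for decision makers.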

 

every user already has onedrive and uses teams in a (currently very limited) way. one issue i was made aware of is the path length limitation that comes along with sharepoint, for instance, as we have files that are deeply nested in lots of sub-folders. that would have to be remedied ahead of time, which is probably quite the task.
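finding those problem paths ahead of a migration can be automated. a minimal sketch; the ~400-character total path limit and the guessed prefix length are assumptions to verify against microsoft's current documentation for your tenant:

```python
import os

def find_long_paths(root, site_prefix_len=60, limit=400):
    """Flag files whose eventual SharePoint URL would exceed the
    (roughly) 400-character total path limit.

    site_prefix_len is a rough guess at the length of the
    destination site/library URL prefix."""
    hits = []
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            rel = os.path.relpath(os.path.join(dirpath, name), root)
            if site_prefix_len + len(rel) > limit:
                hits.append(rel)
    return hits
```

running something like this against the current shares would give a concrete list of folders to flatten or rename before any move.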

 

offline files are too risky with so many locations and users, i assume; conflicts would be unavoidable. the current system uses the DFS-R option, which is working okay but doesn't resolve the speed issues across locations. and remote users suffer from VPN slowdowns on top, making work frustrating.

 

the key word is cached files while being cloud based, i think, as you mention. once a file is cached you no longer rely on the ISP's speed limitations, and local users are treated basically the same as remote users, one uniform system. one question that pops up for me is what happens when there is an outage with the ISP, how would you work around file access? a local NAS backup of the cloud files?

 

and for autocad specifically, it's scary what negative posts you find online when looking for a platform that supports working with autocad files well. solutions like zee drive showed up, which look reasonable and cost a fraction of what autodesk charges for their cloud based offers. but this is a pretty specific problem, of course.


On 3/31/2024 at 4:50 PM, ninebot said:

one question that pops up for me is what happens when there is an outage with the ISP, how would you work around file access? a local NAS backup of the cloud files?

If the file is not cached then you have no access. This also creates a sync conflict risk, but that is basically unavoidable in your situation no matter what you choose. It is however quite minimal if you go with a cloud storage option, as downtime isn't really that common and files won't automatically be overwritten; that requires manual intervention to choose what happens.

 

We back up everything that is Cloud based back onsite, but that's more for backup/data protection and not for maintaining or restoring access to the data, since having the data doesn't mean you can actually access it. If it is files in a Teams site, then your Teams client can't access, or know, that you restored the file to a local file server temporarily, etc. So you don't only have to plan how to back up Cloud data, you also need to plan how to give access to this data when deemed necessary (waiting for restoration of services is often better), as well as making sure any data you have changed is copied back to the live production Cloud service.
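For the file-share portion, the pull-down part of such a backup is conceptually just a one-way mirror from a locally synced copy of the cloud library onto NAS storage. A minimal sketch (paths are placeholders, and a real setup would use proper backup software with versioning and retention rather than this):

```python
import os
import shutil

def mirror(src, dst):
    """One-way mirror: copy new or newer files from src to dst,
    never push anything back toward the source.  Deletions in src
    are deliberately NOT propagated, so dst also guards against
    accidental or malicious deletes."""
    for dirpath, _dirs, files in os.walk(src):
        rel = os.path.relpath(dirpath, src)
        target = os.path.join(dst, rel)
        os.makedirs(target, exist_ok=True)
        for name in files:
            s = os.path.join(dirpath, name)
            d = os.path.join(target, name)
            if (not os.path.exists(d)
                    or os.path.getmtime(s) > os.path.getmtime(d)):
                shutil.copy2(s, d)  # copy2 preserves timestamps
```

Scheduled nightly against the synced cloud folder, something in this spirit covers the data-protection side; the access-restoration side still needs its own plan, as above.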

 

On 3/31/2024 at 4:50 PM, ninebot said:

and for autocad specifically, it's scary what negative posts you find online when looking for a platform that supports working with autocad files well. solutions like zee drive showed up, which look reasonable and cost a fraction of what autodesk charges for their cloud based offers. but this is a pretty specific problem, of course.

My advice is to avoid anything that takes these cloud storage repositories and turns them into network drives. It introduces technical issues that can be difficult to resolve if encountered, as well as pretty bad performance. Secondary to that, it doesn't foster a change of workflow and style more suited to utilizing Cloud services, which just leaves you in a technical debt scenario that gets worse as time goes on.

 

Personally I'm a big proponent of using whatever has the best TCO and fits the business and technical needs, which is actually quite often on-prem infrastructure and software solutions, but your work as you describe it very much fits into the category of being better served by Cloud solutions, excluding creating your own "Cloud". You could set up something yourself, i.e. Nextcloud, but I actually don't recommend it. Have a look at Nextcloud anyway; just have really good planning sessions around it if you do want to consider going down that path, identify all the potential problems with such a solution, and make sure you include being able to support it, i.e. people.


thanks again for your reply!

 

i think with the <1% downtime of most ISPs it might be worth the risk to go (almost) fully cloud based. and for the remaining 1% risk we could put a 5G mobile network backup system in place, if desired. plus a local data backup in addition? is a simple NAS suitable for that purpose? or don't bother at all, better to wait until it's restored, as you say?
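to put that "<1%" into perspective, availability percentages translate into a surprising number of hours. a quick back-of-the-envelope helper; the percentages plugged in are examples, the real input would be whatever the ISP's SLA actually promises:

```python
def downtime_hours_per_year(availability_pct):
    """Expected hours of downtime per year for a given availability
    percentage (e.g. 99.9 means 'up 99.9% of the time')."""
    return (100.0 - availability_pct) / 100.0 * 365 * 24

# 99%   -> ~87.6 hours/year
# 99.9% -> ~8.76 hours/year
```

so "1% downtime" is roughly two working weeks per year, which is what the 5G failover and local cache would have to bridge.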

 

did some testing with autocad in combination with sharepoint sites today, which went surprisingly well. i couldn't reproduce the severe issues i find posted all over the net. i assume this might have something to do with us not collaborating on files much, or at all. so single users access single files, which maybe avoids a lot of the issues others have. will do some testing with other software we run, like the adobe suite and some design software, to see if it works just as well.

 

with multiple locations (potentially more following in the near future) and full-time remote users, you are probably right that cloud based could be a viable solution in this case. not our own cloud, as internet speeds won't allow us to run such a system efficiently.

 

i also prefer owning things, especially because you have power over cost that way, but that seems impossible going down this new path. you basically give up all the power and hand it to microsoft, hoping they won't disappear or simply raise all subscription fees to stupid levels that negate all the savings we currently envision...


14 minutes ago, ninebot said:

is a simple NAS suitable for that purpose?

For really important data I would say absolutely yes: core business files like anything related to payroll, finances, HR etc. Stuff the business can't really operate without if lost. Depending on where the data lives and what Cloud service is used, you might need to invest in a backup software product, but this may not be necessary, or a free option could be suitable.

 

14 minutes ago, ninebot said:

don't bother at all, better to wait until it's restored, as you say?

One of the bigger reasons to do backups is data corruption, or recovering from data changes (user error, malicious changes etc). So think about scenarios outside of temporary loss of access and see what you think may or may not be necessary. Not all data is super important or impossible to obtain or create again.

 

14 minutes ago, ninebot said:

i also prefer owning things, especially because you have power over cost that way, but that seems impossible going down this new path. you basically give up all the power and hand it to microsoft, hoping they won't disappear or simply raise all subscription fees to stupid levels that negate all the savings we currently envision...

Yes, that is a significant risk factor many overlook or downplay, but I'm glad you understand it. It's very important for those in decision-making positions to understand that you cannot outsource risk: Microsoft doesn't and won't care about your loss of access or data, and those are specifically excluded from their contracts. Only for significant outages can you demand reimbursement, not for data loss or corruption.

 

Cost increases are a very real thing, and so are reductions in services. We have academic Office 365 licenses, and just this year we were told our Office 365/Azure data storage allocation is being reduced from 5PB to 660TB while we still have to pay the same amount per year. So not only could the cost go up, but what you get for what you pay can be reduced too.


that's indeed a very good point, accounting and human resources people would definitely agree with that. and attacks from outside (and inside) sources are not unrealistic, even without having experienced any ourselves to date *knock on wood*. the way it's looking is that we'd still set up at least one new server that is good for the next 5-10 years, which would host our sage software/database and a backup of all files (running in the background). unless sage offers reasonably priced cloud based operation, which we still have to look into. that's probably the one piece of software we use that i'm most unfamiliar with (and also hate).

 

that 5000TB to 660TB reduction sounds insane. i wasn't aware of such things going on to that extent, but it sure goes along with the shrinkflation trend and the things people like louis rossmann point out in their entertaining videos, e.g. your example of paying a subscription and getting services removed after the fact, which are some really dirty business moves. it's a risk we'll have to take if we ever want to move into something more modern. we could always move back to something traditional if we had to, i guess (painfully).

