Sync Local Storage to a NAS

Hi, I would like to automatically sync/mirror the local NVMe/SSD storage on a Z8 to a Synology. Any recommendations on techniques and software to use?

I'd also like to do the same between a Mac with Thunderbolt NVMe and a Synology.

But I think on the Mac you can use the Synology Drive client…?

Thanks
Marios

If you’re technically inclined, rsync and crontab are free options.

If you want an app, SyncTime on the Mac is decent. I used it for a long time.

Since then I’ve upgraded to GoodSync, which is more expensive but supports access to all the cloud storage providers as an added bonus.

Both apps provide folder sync functions and can run on a schedule.

Yes, there are more options for the Mac, and I think Synology Drive might do the job there, but on the Linux side I'm not too sure… I think rsync cannot handle two-way sync, because there is no mechanism to propagate deletions?

GoodSync has a Linux version, and it can definitely do two-way sync.

Looks good! Thanks… I was looking at this: Linux Installation Guide | Insync Help Center

GoodSync looks better…

Where tools like GoodSync shine is when your clients send you massive files on Dropbox and G-Drive. Web-based downloads are limited by the number of streams the browser will let you open. So if you need to max out a fiber link while downloading from Dropbox or G-Drive, apps like GoodSync are great.

Likewise, if you don’t want to pay for LucidLink for cloud storage, you can mirror folders straight into raw AWS cloud storage with GoodSync and apps like it.

That makes these worth the price.

I use it mostly to mirror critical project files onto NAS and LucidLink at the end of the day for offsite backup.

Which license did you go for? I suppose the personal license covers it all for me?

Yes, just personal license.

What precisely are you trying to achieve? Just a backup of your local system disks?

If so, Synology’s Active Backup for Business is great.

No, I want to test an unmanaged workflow: a Z8 (Flame, on Rocky) using local NVMe RAID 0, kept in sync with my Synology, which is used for I/O and backup. That Synology will also sync with another Synology at a second location via VPN… and Drive will connect a Mac (Flame) with local NVMe storage via Drive sync… so I can work in both locations with multiple backups, with only the NASes in RAID 6, and with smaller archives :wink: Make sense? …just a test…

Synology Active Backup for Business is interesting. Do you think it would do the same thing?

Oh. In that case.

I’ve messed with that kind of setup a lot, and it doesn’t really work and isn’t worth it.

Synology Drive client doesn’t work because you can’t set externally managed volumes as sync locations.

For the love of God, if you’re gonna do this, publish to the Synology. Then use Synology Drive ShareSync to have your Synology talk to another Synology. That way, and only that way, do you get Synology into the mix.

Or just use LucidLink with the 4 TB internal SSD you already have, which is perfect for local caching, and then back up your project to the Synology when it’s done.

Yeah, did you try GoodSync?

If only there was a way to just plug all that kind of stuff in and sort of turn it on…
:rofl:

Syncing stinks. Don't do it. I can't remember a solid pipeline involving syncing that scaled beyond one person.

Lucid Link for active storage, Synology for backup, and PCoIP is optimal. Don’t move data.

Agreed. Synced copies work fine for a single operator. Once you have multiple people making changes at the same time, there's a risk of conflicting changes. Since no automatic merge function exists, file locking is required. Lucid Link can handle that; simple sync can't.

Though Lucid Link is going through changes that are concerning.

Syncing Flame archives is a decent workaround in a pinch. It also works for source clips and OpenClips.

I'm a single operator, so this works great for me. I have something similar: fast internal NVMe storage on Flame, which Syncthing syncs to my server project by project. That then gets an automatic copy to a Synology for backup, plus a sync to Dropbox (one of my clients uses Dropbox for whole-project sharing). So when I export a WIP to my NVMe, it syncs to my server (now a TrueNAS ZFS share), then to my Synology, and then to Dropbox, all automatically. Within a minute or so an offsite producer receives my WIP without any intervention. Essentially Syncthing into Synology Drive. I actually want to remove the Synology from the equation altogether, as it's my bottleneck, so I'll look into GoodSync so I can go straight from server to Dropbox or whatever the client needs. I wanted to go from TrueNAS to Dropbox, but its sync is poor for continuous syncing; it works great for timed rsync-type jobs but struggles with tens of thousands of files.

I managed to get this to work but still need to do more testing… I've been using Synology syncing for a while and it seems OK; I usually work alone.

One thing to know about these solutions, especially if you deal with large collections of small files (like EXR sequences):

Any solution that transfers individual files will be slow. The only ones that handle this well are those that package the files into blocks for storage in the cloud, namely Lucid Link and enterprise backup apps.

I’ve tried a few, including Resilio, etc. and they all struggle.

Now, if you have enough time and these apps can churn through the pile while you sleep or are stuck in traffic, it isn't necessarily an issue. But if someone is waiting on the other end for those files to come in, then you're in trouble.

That’s the TL;DR.

The detail is that for each file the app has to open a separate transfer. On long-distance connections to the cloud, latency becomes an issue with current protocols. The apps get around that by opening 64/128/256 parallel data streams per file and then combining the chunks on the cloud side.

That works great for 100 GB MOV files, but not for thousands of 5 MB files. The overhead of creating all those connections per file drags down performance, and you can't max out your fiber link.
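A rough sketch of that parallel-chunk idea, not how any particular app implements it; `push_chunk` here is a hypothetical stand-in for a single upload stream:

```shell
#!/bin/sh
# Split one large file into fixed-size blocks and push every block
# concurrently, hiding per-connection latency behind parallelism.
# "push_chunk" is a placeholder for whatever one-stream upload the real
# app performs; the cloud side reassembles the chunks in order.
parallel_upload() {
  file="$1"; chunk_size="$2"; workdir="$3"
  split -b "$chunk_size" "$file" "$workdir/chunk_"
  for c in "$workdir"/chunk_*; do
    push_chunk "$c" &   # one in-flight stream per chunk
  done
  wait                  # return once every stream has finished
}

# e.g. parallel_upload /media/big.mov 64M /tmp/chunks
```

With thousands of 5 MB files there is nothing to split; each file still pays its own connection setup, which is exactly the small-file problem described above.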

Lucid Link doesn’t store individual files in your cache or transfer them to the cloud individually. They’re actually stored on your local NVMe as encrypted blocks of larger size, and that’s what gets up/downloaded… You don’t see that, because Lucid Link uses a virtual file system. Go into your local cache folder (after enabling hidden files), and you will see a small database file and one large cache file (in my case 322 GB). That’s it: not the thousands of files you have access to at the file space mount point.
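If you want to stay with simple sync tools anyway, one cheap workaround (a sketch of my own, not a feature of any of the apps above) is to pack a frame sequence into a single tar before it enters the synced folder, so the tool moves one large stream instead of thousands of tiny ones:

```shell
#!/bin/sh
# Bundle a whole sequence folder into one tar archive; the sync tool
# then transfers a single large file instead of thousands of frames.
# Paths in the example call below are placeholders.
pack_sequence() {
  seq_dir="$1"   # folder holding the EXR frames
  out_tar="$2"   # resulting archive to drop into the synced folder
  tar -cf "$out_tar" -C "$(dirname "$seq_dir")" "$(basename "$seq_dir")"
}

# e.g. pack_sequence /mnt/nvme/shots/sh010/renders /mnt/sync/sh010_renders.tar
```

On the receiving side a matching `tar -xf` restores the frames; you trade a little local disk and CPU for far fewer per-file connections.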

I get it, but I haven't found a problem with EXRs. Batch setups can be a problem, though, as there are more small files… And in any case it's easy to use a hard drive, a USB stick, or a manual transfer to top up the sync and help it catch up every now and again when required… If it works as I'm hoping, it will save me a lot of archive space/time and file management at the end of a large project. I don't feel I need to go as far as Lucid Link or spend 10k on a 25/50 GbE NVMe NAS. Thanks

You don’t need to spend that much. 10 Gb networking with a Synology is more than sufficient.