Still using LTO, as it remains the best price per TB when you're dealing with large amounts of data, and the technology is reasonably stable.
I have a long established folder structure I maintain per job. That covers source files, project files (non-Flame), renders, deliverables, etc.
As for Flame, I keep a running Flame archive while working, which I back up to the cloud for daily offsite coverage. It only includes the setups, key renders, and so on, not the source files, to keep it manageable in size. For cloud backup I use LucidLink with a Wasabi back-end; that costs about $20-60/mo depending on what is going on.
Once the project wraps, I make a new archive that covers more detail, but not all the intermediate desktop snapshots, etc. That gets saved in my folder structure with everything else.
I retain a copy of this folder structure on a RAID for a while, and also make a copy to LTO. Once I run out of space on the RAID, older copies roll off; they're only kept in case the client comes back for anything after we wrap. Similar to Tim, I have at least two copies for about six months, then it may drop to a single copy on LTO after that.
As for LTO, I use an OWC Mercury LTO-8 drive connected via Thunderbolt to an older MacBook Pro. I use Hedge's Canister software and LTFS to simply mirror folders onto LTO. I've previously used more comprehensive backup software (Retrospect and YoYotta), but didn't like the proprietary format (Retrospect) or the complexity (YoYotta), and I don't really need incremental backups and the like, since that's covered in the cloud; this is just archiving completed jobs. LTFS provides a bit more stability there. LTO drives can only read a generation or two back, so at some point I may have to forward-copy any archives I really care about long-term.
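For what it's worth, since LTFS mounts the tape as an ordinary filesystem, the "mirror folders onto LTO" step is conceptually just a directory copy; Canister adds verification and cataloging on top of that. A minimal Python sketch of the idea, with temp directories standing in for the RAID and the tape mount (all names and paths here are hypothetical examples, not my actual layout):

```python
import shutil
import tempfile
from pathlib import Path

def mirror_job(src: Path, tape_mount: Path) -> Path:
    """Copy a finished job folder onto the tape mount, LTFS-mirror style."""
    dest = tape_mount / src.name
    shutil.copytree(src, dest, dirs_exist_ok=True)
    return dest

# Demo with temp dirs standing in for the RAID and the LTFS mount
# (on macOS the real mount would appear under /Volumes with the tape label).
root = Path(tempfile.mkdtemp())
job = root / "2024_ClientA_Spot"            # hypothetical job folder
(job / "renders").mkdir(parents=True)
(job / "renders" / "shot_010.txt").write_text("final grade")
tape = root / "LTFS_tape"                   # stand-in for the mounted tape
tape.mkdir()
archived = mirror_job(job, tape)
print(f"Archived to {archived}")
```

In real use you'd verify the copy (checksums) before trusting the tape, which is the main thing dedicated software like Canister buys you over a bare copy.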
As for files that are not job related but cover my basic business operations, I make an occasional LTO snapshot of those, plus I keep a cloud version. Right now they also go to LucidLink for the everyday backups. In the past I've copied them to AWS buckets, but it's more hassle. At present I use GoodSync as a utility to sync folders between my RAIDs and LucidLink, though I've also used SyncTime (Mac) and SyncBackPro (Win). GoodSync costs a bit more, but it has good job scheduling, runs on all three OSes (Mac, Win, Linux), and can access all the various cloud providers, so I can also use it for bulk downloads from Dropbox, G-Drive, FTP, etc. as a more versatile tool.
Re: LTO drives, I saw that Hedge's recent newsletter mentioned a new third-party drive enclosure that is 10GbE enabled, in case you want your drive further away from the desk to minimize noise, which can be a hassle.
I think I ran the math again not too long ago, and LTO starts becoming cost effective once you're past around 200-300TB of archival storage. Below that you're better off with bare HDDs in a SATA dock (if you're doing a single copy). If you want RAID coverage, what a lot of folks do is find a buddy and remotely sync a large Synology RAID. That's a bit more per TB, but more redundant. It has built-in headroom limits, though, so you'd only use it for recent jobs and drop everything after a while. I still have files from jobs I worked on 10+ years ago, and recently went back to one of them (for source files, not the project, though). Archiving projects for the duration is a whole separate topic; we have one job, an Avid edit, that has been going for more than 10 years.
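If you want to redo that break-even math yourself, here's a rough sketch. Every price in it is a placeholder assumption (drive, tape, and HDD costs move around a lot), so plug in current street prices rather than trusting these numbers:

```python
import math

def archive_cost(total_tb, fixed_cost, media_cost, tb_per_unit):
    """Fixed hardware cost plus enough media units to hold total_tb."""
    return fixed_cost + math.ceil(total_tb / tb_per_unit) * media_cost

# Hypothetical placeholder prices -- check current street prices:
LTO_DRIVE = 3000   # Thunderbolt LTO-8 drive, one-time cost
LTO_TAPE = 80      # LTO-8 tape, 12 TB native capacity
HDD = 330          # 20 TB bare HDD; SATA dock cost ignored (small)

for tb in (100, 200, 300, 400):
    lto = archive_cost(tb, LTO_DRIVE, LTO_TAPE, 12)
    hdd = archive_cost(tb, 0, HDD, 20)
    cheaper = "LTO" if lto < hdd else "HDD"
    print(f"{tb:>3} TB: LTO ${lto}, HDD ${hdd} -> {cheaper} cheaper")
```

With these placeholder numbers the crossover lands a bit above 300TB; cheaper tapes or a used drive pull it down toward the 200-300TB range. The point is that the drive's fixed cost dominates at small volumes, and the tape's lower per-TB media cost only wins once you've spread that fixed cost over enough data.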