Filesystem-level Snapshots/Backups

As we all know, the danger of corrupted batches, projects, etc. is all too real.

I read a lot about people saving their setups as text files, saving their desktops, and even doing nightly archives. I have been there, and it's the stuff nightmares are made of.

So my question is: does anyone successfully use alternative backup strategies that don't require any manual user input? Let's just say I am a huge fan of having a single project file on disk somewhere in an ASCII format (like Nuke). Currently I am doing the whole backup-and-archive dance and also copying my current /opt/autodesk folder to network storage every night (which itself is snapshotted).

My thinking is to use the Linux built-in filesystem snapshot tools. In case of corruption, I should be able to just restore a batch setup from X hours ago (depending on how that's set up) and then never worry about archives, saved old desktops, and so on until the project is done?
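For illustration, here's the kind of hourly rotation I'm imagining; a rough sketch assuming a Btrfs volume for the setups, with the paths, retention count, and script itself all made up (ZFS or LVM snapshots would be the same idea):

```python
#!/usr/bin/env python3
"""Hourly read-only snapshots of a setups volume, keeping the last 24.

All paths and the retention count are hypothetical -- adjust to taste.
Run from cron, e.g.:  0 * * * * /usr/local/bin/snap_setups.py
"""
import datetime
import subprocess
from pathlib import Path

SOURCE = Path("/mnt/flame_setups")              # Btrfs subvolume holding batch setups
SNAP_DIR = Path("/mnt/snapshots/flame_setups")  # must be on the same Btrfs filesystem
KEEP = 24                                       # hours of history to retain

def main() -> None:
    SNAP_DIR.mkdir(parents=True, exist_ok=True)
    stamp = datetime.datetime.now().strftime("%Y%m%d-%H%M")
    # Create a read-only snapshot; restoring a setup is just copying it back out.
    subprocess.run(
        ["btrfs", "subvolume", "snapshot", "-r", str(SOURCE), str(SNAP_DIR / stamp)],
        check=True,
    )
    # Prune anything beyond the retention window, oldest first.
    for old in sorted(p for p in SNAP_DIR.iterdir() if p.is_dir())[:-KEEP]:
        subprocess.run(["btrfs", "subvolume", "delete", str(old)], check=True)

if __name__ == "__main__":
    main()
```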

Is anyone using alternative methods that are proven to work in a disaster case? I have been bitten by the corruption snake too many times now…


Arguably, snapshotting is one of the main arguments for an unmanaged workflow. There, all of your comp metadata and media live completely external to and unmanaged by Flame. Sure, you cache for playback and whatnot, but the idea is that you're outside the scope of the software for storage.

The exception, of course, is editorial, which is truly a miss: the structure of your timelines lives inside Flame and therefore must be backed up. But if you're doing metadata-only backups of simple timelines, you're not really talking about a massive investment of time. It's not automatic, but it is what it is. My hope is that with OTL (or whatever it's called…) we can get to a point where we can be completely unmanaged, publishing actual timeline metadata that can be picked up like a batch setup on a remote filesystem and incremented.
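For what it's worth, OpenTimelineIO (which I assume is the OTL I'm reaching for) already stores a timeline as plain JSON on disk; a rough sketch, with every name and path invented for illustration:

```python
# Minimal OpenTimelineIO sketch: a one-track timeline written out as plain
# JSON, the way a batch setup lives on disk. Names and paths are invented.
import opentimelineio as otio

timeline = otio.schema.Timeline(name="spot_30s_v01")
track = otio.schema.Track(name="V1")
timeline.tracks.append(track)

clip = otio.schema.Clip(
    name="sh010_comp_v03",
    media_reference=otio.schema.ExternalReference(
        target_url="/mnt/jobs/spot/shots/sh010/renders/sh010_comp_v03.mov"
    ),
    source_range=otio.opentime.TimeRange(
        start_time=otio.opentime.RationalTime(0, 24),
        duration=otio.opentime.RationalTime(120, 24),  # 120 frames at 24 fps
    ),
)
track.append(clip)

# The .otio file is text: diffable, snapshot-friendly, version-controllable.
otio.adapters.write_to_file(timeline, "spot_30s_v01.otio")
```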

Perhaps someone has other approaches that are more beneficial, but this is at least a partial answer.


Yeah, I always prefer unmanaged. I wish I could set everything to unmanaged… but Alembics are always hard-imported, batches are saved somewhere, and the same goes for timelines, as you said. The media itself I definitely don't hard-import, only cache, but the renders that I create in Batch live only inside the framestore… though I don't back up my framestore, as I should be able to re-create everything procedurally anyhow.

Alembics don't need to be hard-imported… in fact, it's often quite advantageous not to, since it avoids filling up your system drive:

https://help.autodesk.com/view/FLAME/2021/ENU/?guid=GUID-49474588-6833-4870-9B1A-B9989D4C446B

If your timelines are published out with shots, batches, and .clips, all metadata and renders live external to the framestore, so you're only caching .clip versions for round-tripping. It's only the timeline that remains inside… which is to say a lot, but every other aspect of working in Flame, compositing-wise, can live on snapshot-capable storage.

One day there will be a solution for the timeline as well, and then we're home.


@finnjaeger,

Have you thought about publishing? It can be a solution for a hands-free workflow. The batches all get saved out externally, and the setups are all in the publish directory, for every iteration. For the most part, we now run nightly backups just for the metadata and the sources, and we do not run archives.
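As a rough sketch, our nightly job boils down to something like this; the directory layout and paths here are hypothetical, simplified from what we actually run:

```python
#!/usr/bin/env python3
"""Nightly metadata + sources backup, skipping the heavy rendered frames.
Paths and layout are hypothetical -- point them at your own publish root.
Cron it, e.g.:  0 3 * * * /usr/local/bin/backup_publish_metadata.py
"""
import os
import subprocess

PUBLISH_ROOT = "/mnt/jobs/current_project"
BACKUP_DEST = "/mnt/nas/backups/current_project"

os.makedirs(BACKUP_DEST, exist_ok=True)

# Copy only the lightweight publish metadata and the sources; the rendered
# frame versions are reproducible from the batch setups, so skip them.
for sub in ("Metadata", "Sources"):  # hypothetical top-level directories
    subprocess.run(
        ["rsync", "-a", "--delete",
         f"{PUBLISH_ROOT}/{sub}/", f"{BACKUP_DEST}/{sub}/"],
        check=True,
    )
```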

Metadata is not only accessible by navigating to the setups: if you bring in the Output Clips (.clip files), you can "explode" the node into your current batch groups, and it gives you the source batch flowgraph as a contained group, in case you need to go back.

I personally love the feature of the Output Clip that lets you select versions in the published conform. Another upside is that the versions can also behave as pre-renders for later work. We often get the preliminary stuff rendered as an early version and then use it downstream in later versions… or, if a client wants to tweak just one detail, there's no need to go all the way back.

When publishing, we try to separate the data from the metadata: the data being the actual frames of the publish and the renders, the metadata being the Output Clips, the Segment Clips, and the batch groups.

One tip: we found that if you give your projects short nicknames when you set them up and include that token in the publish, you get a great identifier that fits reasonably in a name.

Another important tip is to make sure to use "Rename Shot". Every shot in a published sequence needs a unique shot name that you have to give it. I use "(abbreviated sequence name)_sh<segment##>0".
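A trivial sketch of that pattern, with a made-up helper name:

```python
# Tiny illustration of the shot-naming pattern above; the helper is made up.
def shot_name(sequence_abbrev: str, segment: int) -> str:
    # e.g. shot_name("TEA", 1) -> "TEA_sh010"
    return f"{sequence_abbrev}_sh{segment:02d}0"

for seg in (1, 2, 12):
    print(shot_name("TEA", seg))  # TEA_sh010, TEA_sh020, TEA_sh120
```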

Here's a sample structure we use, though there may be something key to your pipeline that you'll need to add or subtract; if the job is EXR, it works just the same (a worked example of how the tokens expand follows the settings below):

Video Options Tab:
DPX
Generate Media
Pattern: <name>/Versions/<project nickname>_<shot name>/<version name>/<project nickname>_<shot name>.<version name>

Clip Options Tab:
Make sure the following buttons are enabled:
Create Open Clip
Copy Exported Clip in Project
Create Shot Setup (this makes batches!)
Create Batch
Clip Versions Enabled:
0, Pad 2, Version Name = v00
And I use "Create Read File Nodes", which places the Segment Clip in the created batch group as the source. If you double-click on it, you get the option to select the Output Clip instead.

For the tokens, here's what I've found works:
Segment Pattern: <name>/Metadata/Segment_Clips/<project nickname>_<shot name>/<segment name>
Setup Pattern:
<name>/Metadata/Batches/<project nickname>_<shot name>/<project nickname>_<shot name>_<version name>
Output Pattern:
<name>/Metadata/Output_Clips/<project nickname>_<shot name>
Output Path Pattern:
<name>/Versions/<project nickname>_<shot name>/<version name>/<project nickname>_<shot name>
Create Batch (button enabled):
/Libraries/PublishedBatchGroups/<name>/Raw_Batch_Groups/
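And to make those tokens concrete, here's a rough sketch of how they expand for a made-up shot (all the values are invented):

```python
# Expand the publish token patterns above for one hypothetical shot, to show
# where everything lands on disk. All values are invented for illustration.
TOKENS = {
    "name": "MySequence",
    "project nickname": "ABC",
    "shot name": "TEA_sh010",
    "segment name": "TEA_sh010_seg01",
    "version name": "v02",
}

PATTERNS = {
    "media":   "<name>/Versions/<project nickname>_<shot name>/<version name>/<project nickname>_<shot name>.<version name>",
    "segment": "<name>/Metadata/Segment_Clips/<project nickname>_<shot name>/<segment name>",
    "setup":   "<name>/Metadata/Batches/<project nickname>_<shot name>/<project nickname>_<shot name>_<version name>",
    "output":  "<name>/Metadata/Output_Clips/<project nickname>_<shot name>",
}

def expand(pattern: str) -> str:
    for token, value in TOKENS.items():
        pattern = pattern.replace(f"<{token}>", value)
    return pattern

for label, pattern in PATTERNS.items():
    # e.g. media -> MySequence/Versions/ABC_TEA_sh010/v02/ABC_TEA_sh010.v02
    print(f"{label:8} {expand(pattern)}")
```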

Oh wow, that's some cool stuff. I didn't know there was a "soft import" option for Alembics now; that's interesting. I also don't know how publishing works at all, but it does sound amazing and is definitely something I will look into.

I didn't even know that publishing could work with batch groups. Oh dear, that's a lot less cumbersome than having to deal with archives… I hate archives.

Thanks, all!

If you don't like publishing… you could always just set your preferences to save your Batch Group iterations to any location, which helps. But that doesn't give you the direct link between the Write File output and the Batch Group with the same version number as the directory with all the metadata.

Good luck!
