Automatic project backup

Working on creating nightly automatic archives for all projects marked “active” in our project management tool.

The problem is that sometimes artists forget to clean up, and I don't want to end up with an accidental 4 TB archive.

I couldn't find a way to query flame_archive to check the size of a project pre-archive without caches. flame_archive -e just returns the full “largest size it could be”.

I wonder, since I never care about any media in my projects (we work unmanaged), whether there's a simple way to save out just the sequences and project settings.

Is a full pull of /opt/Autodesk/ from my project server safe enough that I don't have to do any archives at all anymore? I guess I have to shut down Stone+Wire and whatever, then grab the /opt/Autodesk folder, dump it into a folder, and I'm good?

Archives are super annoying.

/opt/Autodesk/io/bin/flame_archive --help
-e, --estimate  estimate space of clips when archiving and restoring.

You can dive into the project using wiretap and dig through all of the libraries recursively - it’s horribly tedious.
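
Something like this, using the Python Wiretap client API that ships with Flame (libwiretapPythonClientAPI) - the server ID, node path and project name below are placeholders, and a real version needs error handling and filtering by node type:

# Walk a project's node tree via Wiretap and print every node it finds.
# "localhost:IFFFS" and "/projects/MyProject" are placeholders for your setup.
from libwiretapPythonClientAPI import (
    WireTapClientInit, WireTapClientUninit,
    WireTapServerHandle, WireTapNodeHandle, WireTapInt, WireTapStr)

def walk(node, depth=0):
    num_children = WireTapInt(0)
    if not node.getNumChildren(num_children):
        return
    for i in range(0, num_children):
        child = WireTapNodeHandle()
        node.getChild(i, child)
        name, node_type = WireTapStr(), WireTapStr()
        child.getDisplayName(name)
        child.getNodeTypeStr(node_type)
        print("  " * depth + f"{name.c_str()} ({node_type.c_str()})")
        walk(child, depth + 1)

if WireTapClientInit():
    server = WireTapServerHandle("localhost:IFFFS")
    project = WireTapNodeHandle(server, "/projects/MyProject")
    walk(project)
    WireTapClientUninit()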

The best approach is to develop a completely openclip workflow, and set cron or launchd to archive your projects in the background.


-e only gives the full archive size, not one without caches.

Example: one project returns 15 TB while the actual archive is 30 MB.

@finnjaeger - yep, you need to dig a little deeper - it’s truly horrible and tedious.
A truly openclip workflow is the answer.
Sadly nothing is straight out of the box.
LOGIK-PROJEKT will get you most of the way there.

Yeah, I don't like this at all.

I only want to save my timelines somewhere :frowning: that's it.


It’s complicated
It’s also the early evening in Paris and people here drink and talk and eat and enjoy life.
So we’re going out and leaving the flames to do something.
I’m so out of my depth…
It was easier twenty years ago.

Are you also including your other flags to not cache, what to omit, etc.? If I'm not mistaken, it'll give a more accurate size estimate when you pass all the options. That said, this is a vague memory from when I was testing our nightly backup system.

What you might appreciate is our method, which adds the project to a text file every time Flame is launched or the project is changed. This file is read by the nightly archive script, which goes through the list and updates only the touched projects. Our projects are named with an identifier so we know which machine is which, so even if you mount a remote project it'll be archived from the correct Flame. The only thing you need to watch out for is when you update the version of Flame and an archive needs to be converted to the new version.
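
For what it's worth, the bookkeeping side of that is tiny. A rough sketch (not our actual hook - the log path is made up, and you'd call record_touched() from whichever Flame Python hook fires when a project is loaded or changed on your setup):

# Record "this project was touched" for the nightly archive script to pick up.
# TOUCHED_LOG is a made-up path; point it at your central storage.
import datetime

TOUCHED_LOG = "/opt/Autodesk/project_backup/touched_projects.txt"

def record_touched(project_name, hostname):
    line = f"{project_name}\t{hostname}\t{datetime.date.today().isoformat()}\n"
    with open(TOUCHED_LOG, "a") as f:
        f.write(line)

def read_touched():
    # The nightly script reads this, de-duplicates, archives, then truncates the file.
    seen = {}
    with open(TOUCHED_LOG) as f:
        for line in f:
            name, host, day = line.rstrip("\n").split("\t")
            seen[(name, host)] = day
    return seen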


Yes, sadly it is.

./flame_archive -O sources,renders,maps,unused -e -P Aldi_Stories_2409_v002

14.53 TB

Stopping managed threads.

I’m not around a machine right now, but isn’t --linked required as well?

Yeah, true, reading it again… let me try that.

Hell yeah brother, thank you! I was missing the -k that I used in my backup script. That's awesome. I really like the changelog idea; maybe I can somehow poll the last-opened time from my central project server.

[root@flameingo001 ~]# /opt/Autodesk/io/bin/flame_archive -O sources,renders,maps,unused -k -e -P Aldi_Stories_2409_v002

Registered thread ‘NetMtor’ [ 140410626551808 ]

Could not find presets path

Output log to: ‘/opt/Autodesk/log/flame_archive.log’.

MP: Using 16 cpus (16 visible)

16.68 MB

Stopping managed threads.

:cool:


Adding the scripts here for anyone who needs something like this.

They need to be modified to fit your paths etc., and you need to cron them up yourself (something like the crontab line below), but here are the two scripts.
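
For the cron part, a single nightly crontab entry on the archiving host is enough - the schedule, interpreter and paths below are only examples:

# Run the auto-backup every night at 03:00 and append output to a log.
0 3 * * * /usr/bin/python3 /opt/scripts/autobackup_v004.py >> /var/log/flame_autobackup.log 2>&1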

The first one is the automatic auto-backup. Again, read the file and change all the paths and SSH things; it uses a clear-text password as an example… better to set up your own SSH keys and so on.

It will parse all projects and compare the change time of the folder in /opt/Autodesk/project to the last auto-archive. If the change is newer, it will do an archive, but only if the archive size is smaller than 20 GB. It adds nice logfiles, like a list-of-shame of all projects larger than 20 GB. As we work unmanaged, we never have large archives unless someone illegally hard-commits stuff.
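
The size check is basically the -O/-k/-e call from earlier in the thread, parsed and compared against the threshold. A stripped-down sketch of that step (not the actual code from autobackup_v004.py - paths and parsing are simplified):

# Estimate the cache-free archive size of a project and decide whether it is
# small enough to auto-archive (20 GB threshold).
import re
import subprocess

FLAME_ARCHIVE = "/opt/Autodesk/io/bin/flame_archive"
LIMIT_BYTES = 20 * 1024**3  # 20 GB
UNITS = {"B": 1, "KB": 1024, "MB": 1024**2, "GB": 1024**3, "TB": 1024**4}

def estimated_archive_bytes(project):
    proc = subprocess.run(
        [FLAME_ARCHIVE, "-O", "sources,renders,maps,unused", "-k", "-e", "-P", project],
        capture_output=True, text=True)
    # flame_archive prints a line like "16.68 MB" or "14.53 TB" among its log output.
    match = re.search(r"([\d.]+)\s*(B|KB|MB|GB|TB)\b", proc.stdout + proc.stderr)
    if not match:
        raise RuntimeError(f"could not parse size estimate for {project}")
    return float(match.group(1)) * UNITS[match.group(2)]

def should_archive(project):
    size = estimated_archive_bytes(project)
    if size > LIMIT_BYTES:
        print(f"{project}: {size / 1024 ** 3:.1f} GB -> list of shame")
        return False
    return True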

It will create a new archive in a dated folder and delete everything but the 2 most recent archives (we have an additional 1 month of daily snapshots on Lucid).
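
The cleanup step is just "sort the dated folders, delete everything but the newest two" - roughly like this, with a made-up archive root:

# Keep only the two most recent dated archive folders for a project.
import shutil
from pathlib import Path

ARCHIVE_ROOT = Path("/mnt/backup/flame_archives")  # made-up path
KEEP = 2

def prune(project):
    # Folder names like 2024-09-17 sort chronologically as plain strings.
    dated = sorted(p for p in (ARCHIVE_ROOT / project).iterdir() if p.is_dir())
    for old in dated[:-KEEP]:
        shutil.rmtree(old)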

It's made to archive from a single host; we have a centralized project server, so that's my host.
You will need to install some requirements:
pip3 install paramiko (for the SSH login stuff)

autobackup_v004.py (10.7 KB)

The second one uses the same logic but is more for manual, user-based archiving. It will try to match a project name with the name of a folder on storage; again, it probably has to be changed for other paths.

archivo.py (9.5 KB)

I am not responsible if this breaks everything and your machine room catches fire.
