I had a crawl through previous posts, but as they are a few years old I thought I would revive this topic if that's ok.
I have a project with one ProRes 4444 clip that is 2.5hrs long (a feature film) and 730GB in size. When I go to archive said clip, my Flame archive is 9TB! WTF?!
There is nothing more going on in this timeline than some dustbusting, so this 730GB clip is the only media in the project. The rest is just Paint nodes.
9TB seems insane when the original clip is just 730GB.
Project settings are all normal. ProRes 4444 is selected as the “Preferred Format”.
I need to provide an archive back to my client as I am simply helping them out last minute but there’s no way I can give them back a 9TB archive.
My usual trick is to do this as a BatchFX on v2 of the timeline, so I can just copy it, delete the source, relink a backclip and archive it; all they need to do is drop it back over the original timeline. But this raises the question for me… surely there has to be a better way.
My archives seem insanely heavy and are getting heavier as the years go on. What am I doing wrong here?
Flame’s Preferred Format is for caching only, not for archiving. Flame archives uncompressed media no matter what (rough math on what that adds up to is below). It kind of has to, otherwise every time you archive it would re-compress the media, which is kind of anti Flame’s ethos.
-If it’s just Paint nodes, send them the Batches and have them render.
-If you cache something and archive with Source Media Cache, then yes you are going to archive the uncompressed media.
-Any media generated on the Framestore (Batch renders, Soft FX in timelines, etc.) will also add to the data.
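To put numbers on it, here is a back-of-envelope sketch of why a 2.5-hour feature balloons once it is stored uncompressed. The resolution, bit depth, and frame rate below are assumptions for illustration (the thread never states them), so plug in your own values.

```python
# Back-of-envelope estimate of uncompressed media size for a long clip.
# All parameters below are assumptions for illustration; adjust to your project.

width, height = 3840, 2160   # assumed UHD resolution
channels = 3                 # RGB
bytes_per_channel = 2        # assumed 16-bit storage per channel
fps = 24                     # assumed frame rate
duration_s = 2.5 * 3600      # 2.5-hour feature

frame_bytes = width * height * channels * bytes_per_channel
total_frames = int(duration_s * fps)
total_bytes = frame_bytes * total_frames

print(f"Per frame: {frame_bytes / 1e6:.1f} MB")
print(f"Frames:    {total_frames}")
print(f"Total:     {total_bytes / 1e12:.1f} TB (decimal)")
# With these assumptions this lands around 10-11 TB decimal (roughly 9.8 TiB),
# i.e. the same order of magnitude as the 9TB archive, versus ~730GB of ProRes 4444.
```

Even if the real project is 2K or stored at 10-bit, the gap between ProRes 4444 and fully uncompressed frames stays enormous.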
Thanks guys! Yes, this all makes sense and I had planned to do much of it. I’m just, once again, stumped by the absolutely insane increase in file size for Flame archives compared to the source media. In this case, where so little is actually being done to the project, it really shows how resource-heavy Flame truly is!
The theory behind Flame saving everything uncompressed is that when you load your archive back at a later date, it is exactly how you left it. There are no compression/decompression artifacts or other issues. I find it a great solution, but to me, longform is 90 seconds.
What you can do with your current method is remove all the precomps inside your BFX and relink your setup correctly (keeping precomps is super heavy).
That way you only have your background media plus any needed sources inside the Batch.
You can also create a fresh archive and, as mentioned, avoid caching anything.
Flame will cache any media in the comp even if Cache Media is not selected.
I like this way; it keeps things light, but you have to go inside every Batch and clean it up before archiving.
There is probably a better way with an unmanaged media workflow, but it’s also more complex to do.
I didn’t know that. I thought the archive file format for media was the same as the cache.
Yes, it is. The ratio from ProRes 4444 to uncompressed is insane. You can test it yourself by exporting a ProRes to EXR or DPX and comparing the sizes. A classic issue with daily archives is seeing how a 50GB folder with five ProRes QuickTimes, added to the project that day, turns into 300GB of archive that night.
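If you want to run that comparison yourself, here is a minimal sketch that shells out to ffmpeg (assuming it is installed) to unwrap a ProRes clip into DPX frames and then compares the on-disk sizes. The paths are hypothetical placeholders, and DPX is just one of the formats mentioned above.

```python
import os
import subprocess

# Hypothetical paths; point these at a real ProRes clip and an empty folder.
SRC = "plate_prores4444.mov"
OUT_DIR = "dpx_frames"

os.makedirs(OUT_DIR, exist_ok=True)

# ffmpeg picks the DPX image encoder from the .dpx extension (image2 muxer).
subprocess.run(
    ["ffmpeg", "-i", SRC, os.path.join(OUT_DIR, "frame_%06d.dpx")],
    check=True,
)

def dir_size(path):
    """Total size in bytes of all files under path."""
    return sum(
        os.path.getsize(os.path.join(root, f))
        for root, _, files in os.walk(path)
        for f in files
    )

prores_bytes = os.path.getsize(SRC)
dpx_bytes = dir_size(OUT_DIR)

print(f"ProRes 4444: {prores_bytes / 1e9:.1f} GB")
print(f"DPX frames:  {dpx_bytes / 1e9:.1f} GB")
print(f"Ratio:       {dpx_bytes / prores_bytes:.1f}x")
```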
Working without cache (aka unmanaged), not caching on archive, and backing up project + archive is the best thing I’ve found in terms of saving space and countless hours of archiving and copying.
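As a rough illustration of that “back up project + archive” step, here is a minimal sketch that copies both with rsync from Python. The paths are assumptions: on a typical Linux install, Flame project setups usually live under /opt/Autodesk/project/<name>, and the archive location is whatever you chose when you wrote it, so adjust everything for your setup.

```python
import subprocess

# All paths below are assumptions; adjust to your installation and storage.
PROJECT_SETUPS = "/opt/Autodesk/project/my_feature"    # Flame project setups (typical Linux location)
ARCHIVE_FILE = "/mnt/archive/my_feature_file_archive"  # uncached file archive you already wrote
BACKUP_DEST = "/mnt/backup/my_feature/"

for src in (PROJECT_SETUPS, ARCHIVE_FILE):
    # -a preserves permissions and timestamps; --delete is deliberately left out
    # so existing backups are never silently removed.
    subprocess.run(["rsync", "-a", "--progress", src, BACKUP_DEST], check=True)
```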
Those were options. You could still archive uncompressed to a file. D1 was uncompressed and so was SD D5. My mind is foggy around those early years, but I think there was always a file or data tape option.
For sure. D5 would have been the only decent option for uncompressed since it was 10-bit. D1 was 8-bit, though, so most archived to Digi since it was hella cheaper (although the DCT was faster). In the naughts there were even options for archiving to HDCAM (shudder), but I know we all covered our eyes and did it at least once. No one had a Voodoo except maybe Björn and me and like one other shop in France.
Data-wise, the prevalent options were DLT and later DTF and LTO. The first Discreet Logic install I was part of came on a DAT, but all of these options were slow, super slow compared to archiving real-time compressed over SDI, which was why they were relegated to people doing film work until multi-res and the death of linear video tape finally demanded data-only archiving.
Ultimately my point was that there hasn’t always been this puritanical approach to archiving; it hasn’t needed to be uncompressed because of some higher ideal. Compression plays a constant part in everything we do and always has, and we should be allowed to play the cost/benefit game here as we do everywhere else in production.
I’d imagine this discussion also ties in, both technically and more abstractly, with what @PlaceYourBetts and @finnjaeger and a few of us have been saying about cache formats for floating-point data and DWAA/B.
Again, the cost/benefit analysis and being able to make that call ourselves.
Unnecessary caches and intermediate files are the likely cause of the inflated archive size. Clear unused caches, trim the timeline, and select “Media Used Only” in the archive settings. You can also check that ProRes 4444 is the render format and avoid archiving redundant data.
Maybe the BatchFX approach works well: archive only the adjusted frames and relink the original clip for delivery. Regular cleanup and clear communication with the client can help manage archive sizes on future projects.