Good info.
@Slabrie I’d love to chat about this. This is a big part of our workflow, as the Import node really messes up our archiving and how we manage jobs.
Hello Kyle!
The Read File node has been put on the list of features we are no longer adding new capabilities to, since the Import node contains all you need (and more) for workflows based on the Read File node.
Could you explain what you mean by it “messing up” your archiving?
Sure, although apologies to everyone for hijacking this thread.
We currently still work with managed media, and when we archive, we tell flame_archive to cache any uncached media in case someone accidentally missed something. This ensures we have everything in a nice little container.
However, when it comes to dealing with renders from 3D or elements from our library, we use Read File nodes, as they act as dumb pointers. When we’re ready to archive, we have a script that saves all the batchgroups out to a known location on the server, and then another that scrapes those setups and creates a sidecar tar archive containing all the files used by Read File nodes. This gives us an archive of the files as-is.
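For anyone curious, the scraping step can be sketched roughly like this. This is only a minimal illustration, not our actual script: it assumes the saved setups are plain-text files with a `.batch` extension containing absolute media paths (a production version would parse the real setup format properly, and the extension list is just an example):

```python
import re
import tarfile
from pathlib import Path

# Assumed: media paths appear as absolute paths with common image
# extensions somewhere in the saved setup text. Adjust to taste.
PATH_RE = re.compile(r"/[\w./-]+\.(?:exr|dpx|tif|tiff|jpg|png)", re.IGNORECASE)

def collect_read_file_paths(setup_dir):
    """Scrape saved batch setups for media paths referenced by Read File nodes."""
    paths = set()
    for setup in Path(setup_dir).rglob("*.batch"):
        text = setup.read_text(errors="ignore")
        paths.update(PATH_RE.findall(text))
    return sorted(paths)

def build_sidecar_archive(setup_dir, archive_path):
    """Tar up the referenced files as-is, skipping stale pointers."""
    media = collect_read_file_paths(setup_dir)
    with tarfile.open(archive_path, "w") as tar:
        for p in media:
            if Path(p).exists():
                tar.add(p)
    return media
```

Because the tar just copies the source files byte-for-byte, the sidecar stays at the compressed on-disk size of the renders rather than ballooning to uncompressed cache sizes.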
If we were to use the Import node instead, Flame would see the media as an unmanaged clip present in our reel groups. That clip would then be cached upon archiving under our current settings. The issue with that is, as we all know, Flame archives media uncompressed, so the size difference is a multiple of the original.
We spoke about this a couple of years ago, and I sent an archive to prove my point: the exact same files from a fresh, empty project, one archived with our tar sidecar method, the other using unmanaged Import nodes cached on archive. I forget the exact numbers, but our existing method was roughly a third of the size.
More than happy to do that exercise again, so please let me know if you’d like more details.
This is a very valid workflow. I guess that until we are able to provide a compressed cache for archiving, we could look into a more granular way to exclude specific content from caching when archiving.