I always set mine to 1GB. In the unlikely event you get a corrupt segment when restoring (yes, it has happened!), 1GB ensures you won't lose much data.
I’m happy with 50GB; they are not too big, but there are fewer segments when transferring the archive.
This is a pretty mundane and non-critical question to use the bat signal for. The segment size has ZERO impact on anything other than the # of files.
We are not all as clever as you, @ALan.
What is mundane to you is, more often than not, a gap in my knowledge.
The file archive segment size has more to do with what medium you’re backing up to and mental-math convenience than anything else. And since very few of us have optical media and/or some kind of non-file-based physical media archive pipeline, it really is down to preference now.
Some choose 1GB
Some choose 10GB
Some choose 25GB.
Some choose 50GB.
Some choose Tabasco when everyone knows that Cholula is better.
What Randy said
In your experience, do you find archives slowing down to open and close as they get larger? Does segment size affect this?
Negative. Large clip libraries for shows that drag on forever seem to be the biggest culprit, not archive segment size.
It’s always been my assumption (though I’ve never bothered to actually check) that smaller files take longer because of the time to create and close them, and to map out data that was broken up across two files. But that may be either a silly way to think of it, or negligible if it’s true. I do 50GB. When I went straight to LTO, I did one file that filled the tape, which was 1.37 TB.
The size does actually matter, I’ve run into this problem twice. The first time was solved by going from 50GB to 100GB and the second time going from 100GB to 200GB. No, I don’t really understand it either.
The solution advocates not limiting the segment size. I wasn’t aware this was an option.
Ha, me neither but it seems like a terrible idea.
I’ll provide a real reason why it may matter…
I sync my archives to LucidLink for offsite security at the end of the day.
I used to have my segment size set to 600GB or whatever the choice in that range is. Fewer files seemed more efficient. In my archive I generally just keep setups, not cached sources, etc. So there can be several days worth of backups in each segment. Generally that would be good.
Except, when I then sync the archive folder with LucidLink it updates the full 600GB segment every single day, because the segments keep changing every day.
Since I switched to a 5GB segment size, archives roll into new segments more frequently, which means that older segments don’t have to be re-uploaded to LucidLink because they didn’t change. It only uploads the incremental changes.
It made my end-of-day offsite backups faster as it cut down on the data transfer.
I still archive my audio to digibeta as TV fuzz
But what about the last segment of the archive? I’m under the impression that when you add to an archive it completes the last segment of the previous archive, filling it out to the full segment size. If you don’t overwrite that last segment from the previous archive on Lucid, it won’t be complete.
You’re correct. I used imprecise language.
Archive A: seg 1-3, 5GB ea
Archive B: seg 3-4
Archive C: seg 4
Archive D: seg 4-6
So the last segment can change, but it’s a small one compared to one very large one.
At most I recopy < 5GB, instead of as much as 450GB or more.
I use GoodSync, so it automatically finds changed files and updates them.
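To make the math above concrete, here’s a toy model of the sync cost. Everything in it is hypothetical and just for illustration: it assumes the archiver keeps topping up the last partial segment (as in the Archive A–D example), and that the sync tool re-uploads any segment file whose contents changed since the previous sync. The daily data amounts are made up.

```python
def sync_cost_gb(daily_gb, segment_gb, already_gb=0.0):
    """Rough upper bound on GB uploaded offsite across daily syncs.

    daily_gb:   list of GB of new material archived each day
    segment_gb: the archive's segment size setting
    already_gb: data already in the archive (and already offsite)
    """
    uploaded = 0.0
    archived = already_gb  # total data in the archive so far
    for added in daily_gb:
        # Partially filled last segment before today's session.
        last_fill = archived % segment_gb
        archived += added
        # The changed last segment re-uploads in full; today's new
        # data uploads once either way.
        uploaded += last_fill + added
    return uploaded

days = [3.0, 4.0, 2.5]  # GB of new setups per day (made-up numbers)
print(sync_cost_gb(days, segment_gb=600.0, already_gb=450.0))  # 1369.5
print(sync_cost_gb(days, segment_gb=5.0, already_gb=450.0))    # 14.5
```

With 600GB segments, the ~450GB already sitting in the open segment gets re-pushed every single day; with 5GB segments, the stale data re-sent per sync can never exceed 5GB.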
The last time I brought in an archive from D-beta, it was a 12-year-old archive. I loaded it because I wanted an audio track from one of the spots. It loaded in perfectly, except for the audio.