Random Public Service Announcement #37: Google Drive is horse poop

My colorist hates me. Said colorist is posting close to a terabyte of graded material across dozens of subfolders. Between Google Drive's web interface, its constant slow zipping, and its garbage transfer success rates, instead of spending a few minutes on an Aspera link or WeTransfer or MASV, I'm spending hours watching stuff download, then sifting through 47-character file names trying to make sure I got everything, or wondering whether the Google Drive app's 'offline access' is truly offline.

CYBERDUCK HAS ENTERED THE CHAT.

Who knew that Cyberduck gives you FTP-style drag-and-drop transfers, with speed limits and reconnects, from Google Drive to my NAS, and saturates my 500/500?



OOH! And you can synchronize a Google Drive folder with Cyberduck, such that if someone adds something to your Google Drive it'll auto-sync to your NAS?



Cyberduck works on Mac as well.

Yes, a long-known workaround. It also works with Dropbox, for similar reasons.

And it can saturate your broadband connection if you enable segmented transfers, which makes it faster than the web interface even beyond the zip issue.

Two things to watch for:

Google Drive has a 750GB-per-day transfer limit per account, even on enterprise accounts. Total insanity.

If you use segmented transfers, don't use your NAS as the destination. Reassembling the segments there is excruciatingly slow. Use a temporary NVMe drive as the destination and then move the files; it will save you hours.
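As a rough sketch of that staging step (all paths here are hypothetical placeholders, not from the original post):

```shell
#!/bin/sh
# Sketch: let segmented downloads land on a fast NVMe scratch volume,
# then sweep the finished files over to the NAS in one sequential pass.
# STAGE and NAS are assumed example paths; override them for your setup.
STAGE="${STAGE:-/tmp/nvme_stage}"   # fast scratch volume (placeholder)
NAS="${NAS:-/tmp/nas_dest}"         # final destination (placeholder)

mkdir -p "$STAGE" "$NAS"
# Point the download tool at $STAGE so segment reassembly happens on
# flash storage, then move completed files to the NAS:
find "$STAGE" -type f -exec mv {} "$NAS"/ \;
```

The move at the end is sequential, so the NAS only sees one long streaming write instead of thousands of small segment-reassembly operations.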

And be alert to data corruption. For reasons I haven't fully debugged, files downloaded via Cyberduck, at least from Dropbox and possibly also from Google Drive, have occasionally been corrupted. I had video files with weird artifacts or bad content that were fine once downloaded via the web interface. I still use Cyberduck for speed, but if a specific file acts weird, double-check it.
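One quick way to double-check a suspect file is to compare checksums of the two copies. A minimal sketch using POSIX cksum (the helper name is mine, not from the post):

```shell
#!/bin/sh
# same_copy FILE_A FILE_B - succeeds only if both copies are bit-identical.
# Useful for comparing a Cyberduck download against a web-interface copy
# of the same file.
same_copy() {
    [ "$(cksum < "$1")" = "$(cksum < "$2")" ]
}
```

For stronger guarantees you could swap cksum for sha256sum or shasum, which are widely available but not strictly POSIX.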

All that said, Google's products are crap across the board and shouldn't be used in production workflows. They always solve the first 80% and then move on to other things. I've done plenty of work for Google, and they're super nice people, but the products are a different story, from their software to their hardware, even things like Nest thermostats. I'm battling them all the time.

Cyberduck is by far the most popular tool for this, but there are several other sync tools that support access to cloud storage: SyncBackPro on PC, GoodSync as well. And you might check out RClone, which is command-line, runs on Linux, and can be scripted.

I looked a bit deeper into RClone. That's probably your best option, as it works across all operating systems.

Download and install.

Then run 'rclone config'. It asks you a few questions, for which you can pick all the defaults (the storage type for Google Drive is 'drive'). It then opens a browser to authenticate your Google Drive login and saves the result as its config.
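If you'd rather skip the wizard, rclone can also create the remote non-interactively with 'rclone config create'. A sketch (the remote name 'gdtest' is just an example; rclone will still open a browser for the OAuth step when you run it for real):

```shell
#!/bin/sh
# Sketch: the non-wizard way to create a Google Drive remote named 'gdtest'.
# Shown here as an echoed command; run it directly in a real session and
# rclone will pop a browser window for the OAuth authorization.
CREATE="rclone config create gdtest drive"
echo "$CREATE"
# Afterwards, 'rclone listremotes' should print 'gdtest:'.
```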

After that you can type:

jan@AllKlier-LBM Tools % ./rclone lsd gdtest:/
-1 2017-02-26 15:56:38 -1 Archive
-1 2016-02-08 08:42:06 -1 Clients
-1 2020-08-06 08:32:56 -1 Misc
-1 2017-02-10 09:15:23 -1 Packing List
-1 2017-09-11 22:26:43 -1 Permanent

to get a directory listing of your G-Drive.

If you want to copy or sync, use

…/rclone copy gdtest:/Misc .

to copy the 'Misc' folder into the current local folder. You can append --dry-run to get a preview of what it will do:

jan@AllKlier-LBM test % …/rclone --dry-run sync gdtest:/Misc .
2022/12/29 08:31:46 NOTICE: QC_Sample_FalsePositive.mxf: Skipped copy as --dry-run is set (size 163.056Mi)
2022/12/29 08:31:46 NOTICE: QC_Sample_GoodPositive.mxf: Skipped copy as --dry-run is set (size 190.126Mi)
2022/12/29 08:31:46 NOTICE:
Transferred: 353.182 MiB / 353.182 MiB, 100%, 0 B/s, ETA -
Transferred: 2 / 2, 100%
Elapsed time: 0.7s

Doesn’t get better than that :slight_smile:
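Since RClone is scriptable, the pieces above (the sync command, the 750GB/day quota, and the corruption worry) can be combined into a small wrapper. Everything here is a sketch under assumptions: the 'gdtest:' remote and destination path are examples, and --max-transfer and --checksum are standard rclone flags.

```shell
#!/bin/sh
# Sketch: a scriptable daily sync from Google Drive that stops before the
# 750 GB/day quota and verifies file hashes against the remote.
# REMOTE and DEST are example values, not from the original posts.
REMOTE="${REMOTE:-gdtest:/Misc}"
DEST="${DEST:-/tmp/gdrive_mirror}"

# --max-transfer stops the run cleanly a bit below Google's daily cap;
# --checksum compares hashes rather than just size/modtime.
CMD="rclone sync $REMOTE $DEST --max-transfer 700G --checksum"

if [ "${RUN:-0}" = "1" ]; then
    $CMD                        # real run (needs rclone and the remote)
else
    echo "dry wrapper: $CMD"    # default: just show what would run
fi
```

Dropped into cron with RUN=1, this gives you the auto-sync behavior described for Cyberduck earlier, but headless and on any OS.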


I agree Google Drive isn't perfect, but it's far from useless too!
If you set it up via the Google Drive app to mount as /Volumes/GoogleDrive and set the cache location to a big drive connected by Thunderbolt or 10GigE, you can select which directories you want online or offline (to save local storage space). If you are working with a remote team you may need to set up a file path rule in Flame to get everything to link.

Some bad things I have discovered:

  • The big drive is pretty busy when it's syncing/indexing to the cloud, so local Flame caching can be slow. (I use a 6-bay Promise Pegasus, so it shouldn't be as slow as it sometimes feels.)
  • Occasionally you get corruption within file sequences. This is less of an issue when using a clip (QT) based workflow.
  • The 750GB-per-day limit can be a pain, so I archive locally as a backup and make a final archive on Google at the end of a job.

Some cool things I've discovered:

  • If a project is offline (i.e. only in the cloud), you can open an archive and it only downloads the segs and material needed for the restored section of the project (so I make my segs small, around 5GB).
  • Agencies seem to like masters and versions being delivered via GoogleDrive/Google Sheets.
  • You can copy and paste google links of clips into Google Sheets and get the correct file name and a thumbnail of the clips when you hover over the name.
  • Although there is a 750GB/day limit, I have managed to get 70TB of material stored online for a pretty low cost and can access any of it very quickly.
  • I can pick up a job “on the road” using my laptop if needed.
  • I don’t find sync speed an issue at all - as fast as Frame.io (but I do have 1Gb/1Gb internet)

I also use Dropbox, but not directly from Flame; I never managed to get it to work well. Also Frame.io for review purposes.
In the past I found that Resilio Sync was pretty good to share a job with another op too, but not great to share to the wider world.

Cheers,
Angus

p.s. I had a message from someone at LucidLink who was going to demo it, as apparently Autodesk has qualified it as Flame storage.
Haven't heard back yet.

LucidLink is a great product, but it's for workgroups, not external exchanges with clients or film crews. I've used it for quite some time now; in fact, I use it the same way you describe Google Drive. It's essentially a cloud NAS that makes a mount point available, except it's a lot smarter than Google Drive and its sync.

It’s a virtual file system, so you can see files that are in the cloud without having to download them, and then they download in real-time once you open them. Also it manages the cache based on size, so you can have more data in the cloud than you have local without having to constantly configure which folders get synced. And you pay as you go. You don’t have to sign up for a storage tier and overpay. They charge you for what you actually use at the end of each month.

And as a rule I never install any Dropbox/G-Drive sync tools. They cause more trouble than they’re worth. They’re meant for non-technical consumer users, not production workflows.

For client deliveries I use either Frame.io or MASV, and I also have a MASV inbox where people can send me files. It's not a cheap solution for large transfers, though, so use with caution.


No serious work can be done in that interface. Same goes for Dropbox. We use Transmit as the interface!

I have to transfer TBs of data all the time between SE Asia, the UK, and various shoot countries. This is our system using Dropbox. It's a system that breaks the data up into manageable packets, allowing you to kick off as soon as the first files arrive, then sync the rest in the background. It works very well; maybe useful to some people:

Here’s the PDF:

THE BATCH SYSTEM V05.pdf (521.8 KB)
