AI Tools in Production

Hello,
I am very interested in the current AI tools.
I wanted to hear about your experiences with the different tools and whether you use them just for fun or in production.
In our company we already use AI in quite a few different places in our production, be it generating patches for cleanups, deepfakes, or dubbing.
I’m excited to see where the journey will take us and believe that we should take it seriously.

I look forward to your comments.

1 Like

I’ve been using Midjourney to create bits and pieces that I need for sky replacements, set extensions, etc. It gives me plenty of options for things that I can’t draw and can easily modify.

I use Nuke’s CopyCat for a bunch of roto and cleanup tasks when I have a high volume of shots. It’s not perfect, but it can get me 90% of the way there if the volume of shots justifies the time to train, test, etc.

I’ve also been using Runway for really rough roto. I recently had some green screen shots where the talent had some green on their shirt. Runway gave me shitty (but totally adequate) mattes quickly.

Finally, I use @talosh’s amazing ML Timewarp tool, but I’m curious to try the new beta of Twixtor. If I can get similar results in an OFX it’s totally worth the money.

3 Likes

Does Twixtor work in the timeline now? I haven’t used it since the IRIX days when dinosaurs roamed the Earth.

2 Likes

I’ll letcha know!

2 Likes

Maybe ADSK has a surprise for us at NAB.

4 Likes

I suspect RevisionFX just has less to lose / less-scared lawyers.

3 Likes

I used Runway’s greenscreen to do rough roto up until recently, when I realized “Magic Mask” on Resolve provided a much better result, didn’t require me to send my footage out of house for processing, and didn’t require an additional subscription to use (we have a bazillion Resolve Studio licenses).

I have used Runway and Stable Diffusion to make comp elements a few times: trees, graffiti, park benches, brick walls, etc. Back in the old days, everybody scanned elements like this from art books and magazines, so I don’t think there’s too much of a difference.

2 Likes

I use Stable Diffusion all the time for cleanup and matte paintings. I usually access it through A1111, DiffusionBee (low-key an amazing SD toolset for Mac), and Invoke.ai. (A rough sketch of scripting the A1111 route is at the bottom of this post.)

Pika Labs and Genmo for abstract design elements.

AVCLabs for upscaling video (great for anything with faces).

Runway for pretty much everything else
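
If anyone wants to wire A1111 into a pipeline, the webui exposes an HTTP API when you launch it with the --api flag. Here’s a minimal sketch of generating an element that way; the prompt, port, and output filename are just placeholders for illustration, not recommended settings:

# Minimal sketch: request a texture element from a locally running
# AUTOMATIC1111 webui (started with the --api flag).
# The prompt, URL/port, and output filename are assumptions for illustration.
import base64
import json
import urllib.request

payload = {
    "prompt": "weathered brick wall, flat lighting, photographic texture",
    "negative_prompt": "people, text, watermark",
    "steps": 25,
    "width": 1024,
    "height": 1024,
}

req = urllib.request.Request(
    "http://127.0.0.1:7860/sdapi/v1/txt2img",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(req) as resp:
    result = json.load(resp)

# The API returns base64-encoded PNGs in result["images"];
# strip any data-URI prefix before decoding and write the first one out.
b64_png = result["images"][0].split(",", 1)[-1]
with open("brick_wall_element.png", "wb") as f:
    f.write(base64.b64decode(b64_png))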

3 Likes

I tried using Runway for some matte painting/extension stuff, but I do not think they like me very much. This is what they returned:

3 Likes

For rotoscoping, I often use the Roto Brush in After Effects, especially for more complex scenes with hair detail. Since it’s a bit heavy, I try to optimize it by lowering the working resolution and then upscaling the matte again within Flame.
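
A rough illustration of that downscale-then-restore idea in OpenCV (not the actual Flame setup; file names and resolutions are made up for the example):

# Sketch of the "pull the matte at low res, restore at full res" idea.
# File names and resolutions are assumptions for illustration.
import cv2

full_w, full_h = 3840, 2160  # delivery resolution

# Matte was generated at half resolution to keep Roto Brush responsive.
matte_half = cv2.imread("shot010_matte_halfres.png", cv2.IMREAD_GRAYSCALE)

# Scale back up; Lanczos keeps the edge falloff reasonably smooth.
matte_full = cv2.resize(matte_half, (full_w, full_h),
                        interpolation=cv2.INTER_LANCZOS4)

# A slight blur hides stair-stepping introduced by the upscale.
matte_full = cv2.GaussianBlur(matte_full, (3, 3), 0)

cv2.imwrite("shot010_matte_fullres.png", matte_full)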

I use Midjourney and Stable Diffusion to generate environment frames, passing the results through Magnific AI for upscaling and improving details and textures. I also use Photoshop (the web version when on Linux) for outpainting and cleanups.

For frame interpolation or morphing, I use TimewarpML, and the SimpleML tool when I need to train clean tracking points while dealing with numerous screen shots.

2 Likes