Motion Control FBX into Flame?

Hey there guys and gals,

I got an FBX file from a motion control rig and I’m wondering how to line this up to my shot.

I don’t have the full take of the shot, only part of it, and there are no visual markers inside the file (axes, objects, etc.).

Does anyone have any tips or tricks to do this?

Can you get the full shot, even if it’s ungraded?

You will need to figure out where stuff is in 3D space relative to the FBX, which can be quite difficult without real-world measurements. Also, the FBX won’t have any shakes, shimmies, or wobbles from the rig, so even if you had correct scene data it might not always stick.

Did they shoot a tracker pass for ya?

Typically these are far less helpful than anyone thinks they ought to be. Just ask @Sinan. I think he works for a production company using MoCo, and if memory serves me right they still do tracking passes.

Most data from MoCo rigs, if I’m not mistaken, isn’t necessarily oriented or scaled to helpful units. Plus, it doesn’t care about what is seen through the lens, so it has no idea whether anything useful is in the scene or where the things you care about are.
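That said, if you do want to sanity-check the raw data before writing it off, pulling the camera translations into Python and applying a unit scale and axis swap is a five-minute job. A minimal sketch with made-up numbers, assuming a millimetre, Z-up rig export going into a centimetre, Y-up scene (your rig and scene may well need different conversions):

```python
import numpy as np

# Hypothetical camera positions exported from the rig: millimetres, Z-up.
rig_positions_mm = np.array([
    [1200.0, 3500.0, 1600.0],   # frame 1
    [1210.0, 3490.0, 1605.0],   # frame 2
    [1225.0, 3475.0, 1612.0],   # frame 3
])

SCALE = 0.1                          # assumed: mm -> cm
Z_UP_TO_Y_UP = np.array([            # assumed: Z-up -> Y-up axis swap
    [1,  0, 0],                      # x stays x
    [0,  0, 1],                      # rig z becomes scene y (up)
    [0, -1, 0],                      # rig y becomes scene -z
])

scene_positions = (Z_UP_TO_Y_UP @ (rig_positions_mm * SCALE).T).T
print(scene_positions)
```

But even with the units and axes sorted, you still don’t know where anything in frame actually sits, which is why: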

I’d camera track it.

I work for a production company and we have our own motion control robots. Never have I ever been successful in getting FBX from the robot into Flame. I always shoot a tracking pass for posterity and use a camera tracker.

I’m sorry I can’t provide a solution. But please let us know if you come up with one.

If you have such a shot in the future, make sure to get a tracking pass.

Also, it’s worth noting that if you have a 3D department, there is a piece of software called Mimic for Maya from Autodesk. By taking real-world measurements, the 3D dept can precisely create movements and upload them to a variety of moco systems, as well as bring them into Flame via FBX.


Thank you to everyone for sharing their knowledge and input.

You’ve pretty much confirmed what I suspected: the FBX isn’t very useful right now.

Unfortunately, I don’t have the luxury of time to test it out and experiment further, so I will just have to camera track it the good old-fashioned way.

Hi Everyone,

Thanks to Tallen for raising this issue, and to all the contributors. I know this is an old thread, but I want to expand on it, share and learn.

I am a VFX sup/VFX producer (ex-Flamer), and on MoCon shoots I diligently capture real-world measurements: XYZ co-ordinates for multiple static objects in the scene and for the camera start position, plus the MoCon base centre if it’s a static base, or the track position/centre data if the MoCon is on a track, along with the obvious lens/sensor data. I accompany it with annotated photos and have even tried lidar scans. If the rig permits, I get strobe frames for sync. Don’t worry, I also get a tracker pass. But in every case, as intimated on this thread, the Flame artist re-tracks the scene, either farming the track out, using Flame, or using one of the many third-party trackers.

I have tried it myself, hopping on Flame and trying to use my own data to line up the FBX, and yeah, guess what? I can’t do it either. I end up using a bit of maths, trying to remember Pythagoras and eyeballing it whilst cursing the VFX sup!! I can get it in the ballpark and in timing sync, and I throw a few teapots and planes into the scene to see that it is tantalisingly, frustratingly close. The data is right there, but as we know, there is no cigar for close.
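To be concrete about the maths I keep fumbling: my understanding, and this is only a rough sketch I have never made work in anger, is that with three or more surveyed static points matched to the same points picked out of the FBX scene, the line-up reduces to solving a single scale, rotation and translation, i.e. a Kabsch/Umeyama-style fit. In numpy it would look something like this (the point values are made up, and in practice you’d want more points than this, and not all in one plane):

```python
import numpy as np

def fit_similarity(src, dst):
    """Least-squares scale, rotation and translation mapping src points onto
    dst points (Kabsch/Umeyama). src and dst are Nx3 arrays of matched points."""
    src_mean, dst_mean = src.mean(axis=0), dst.mean(axis=0)
    src_c, dst_c = src - src_mean, dst - dst_mean
    U, S, Vt = np.linalg.svd(src_c.T @ dst_c)          # SVD of the covariance
    d = np.sign(np.linalg.det(Vt.T @ U.T))             # guard against a reflection
    D = np.diag([1.0, 1.0, d])
    R = Vt.T @ D @ U.T                                 # best-fit rotation
    scale = np.trace(np.diag(S) @ D) / (src_c ** 2).sum()
    t = dst_mean - scale * R @ src_mean
    return scale, R, t

# Made-up example: three set markers surveyed on the day (metres), and the
# same three points as they sit in the FBX scene (arbitrary units).
measured = np.array([[0.0, 0.0, 0.0], [2.4, 0.0, 0.0], [0.0, 0.0, 3.1]])
fbx_pts  = np.array([[10.0, 5.0, 1.0], [10.0, 5.0, 241.0], [320.0, 5.0, 1.0]])

s, R, t = fit_similarity(fbx_pts, measured)
# Applying the same s, R, t to every FBX camera keyframe would drop the camera
# into the surveyed space: p_scene = s * R @ p_fbx + t
print(s)
print(R)
print(t)
```

In theory that is the whole trick; in practice my measurements, the rig’s origin and the lens data never quite agree, which is where it falls apart for me.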

I am rusty on Flame these days, but that is probably not the excuse for my failing! I am pretty sure I am providing enough data to do it, but my feeling is that you need a whole extra level of nerd genius that is beyond most of us. Would you agree?

Maybe this is the domain of a Maya artist? Has anyone used or received the data in a usable form for Flame via a friendly Maya artist, or by using the software mentioned by Sinan, ‘Mimic for Maya from Autodesk’?

So to put the question out there, has any Flame artist managed to successfully use the kind of data I am gathering to align an FBX camera? Is it just a lot easier and possibly more accurate to re-track it anyway?

I am interested to know both artists’ and VFX sups’ experiences and opinions on this. If I am missing something, I’d also love to hear that.

Collecting that data on set is a time-consuming distraction, and if it’s next to useless, I would much rather gather the minimum and spend more time focussing on what is being set up and shot.

Cheers David


My company has a full-blown layout department that takes care of this stuff. I’ve never had to deal with it, but from a freelancer’s point of view I would be interested to see what kind of data we are talking about. Having done a fair amount of layout and camera tracking work in various software, I am curious about where the complexity lies and whether something like Blender, Nuke and Python can be of any help for Flame.
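If one of these files landed on my desk, my first instinct would be a few lines of Blender Python just to see what the camera is actually doing, what units it looks like it’s in, and whether the orientation makes any sense, before worrying about getting it into Flame. Something along these lines (assuming the FBX imports cleanly and contains a single animated camera; the path is obviously a placeholder):

```python
# Run inside Blender (Python console or a text editor script).
import bpy

# Placeholder path: point this at the actual moco export.
bpy.ops.import_scene.fbx(filepath="/path/to/moco_camera.fbx")

scene = bpy.context.scene
# Assumes the file brought in exactly one camera.
cam = next(obj for obj in scene.objects if obj.type == 'CAMERA')

for frame in range(scene.frame_start, scene.frame_end + 1):
    scene.frame_set(frame)
    loc = cam.matrix_world.to_translation()
    rot = cam.matrix_world.to_euler('XYZ')
    print(frame,
          tuple(round(v, 3) for v in loc),
          tuple(round(v, 3) for v in rot))
```

No idea yet whether that gets you any closer to a line-up in Flame, but at least you’d know what you’re dealing with.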

I tried a few times, checked with a few cg artists who I thought were professionals, checked with @Sinan, who does this for a living, then gave up confidently.

Also… is that an onion on your ear?

The blue onion must be the Flame artist equivalent of a Babel fish so you can understand client comments.


Hi Randy and Sinan, thanks guys.

I see, so I think I can conclude that, for mere mortals, getting the FBX into a usable position is not feasible. Happy to hear from others.

And yes… that is indeed an onion on my ear. It can completely cancel out client whinging. If you missed it, there is a video:
