You will need to figure out where things are in 3D space relative to the FBX, which can be quite difficult without real-world measurements. Also, the FBX won't have any shakes, shimmies, or wobbles from the rig, so even with correct scene data it might not always stick.
Typically these are far less helpful than anyone thinks they ought to be. Just ask @Sinan. I think he works for a production company using MoCo, and if memory serves they still do tracking passes.
Most data from MoCo rigs, if I'm not mistaken, isn't necessarily oriented or scaled to helpful units. Plus, it doesn't care about what is seen through the lens, so it has no idea whether anything useful is in the scene or where the things you care about are.
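As a rough illustration of the unit/orientation problem: here is a minimal Python sketch of the kind of normalisation you end up guessing at. The millimetre scale factor and the Z-up convention are pure assumptions for the example, not from any particular rig, so check your own rig's documentation.

```python
# Hypothetical sketch: normalising a raw MoCo-space point before
# comparing it to an FBX/Flame scene. Scale and axis order are
# assumptions, not from any specific rig.

def moco_to_scene(point_mm, scale=0.1, z_up=True):
    """Convert a rig-space point (assumed millimetres, Z-up)
    to an assumed Y-up scene with 10 mm per unit."""
    x, y, z = (c * scale for c in point_mm)
    # Swap axes if the rig reports Z-up but the scene is Y-up
    return (x, z, y) if z_up else (x, y, z)

print(moco_to_scene((1000.0, 2000.0, 500.0)))  # -> (100.0, 50.0, 200.0)
```

Even once the numbers line up, none of this tells you where the subject sits relative to the lens, which is the part that actually matters for a comp.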
I work for a production company and we have our own motion control robots. Never have I ever been successful in getting FBX from the robot into Flame. I always shoot a tracking pass for posterity and use a camera tracker.
I’m sorry I can’t provide a solution. But please let us know if you come up with one.
If you have such a shot in the future, make sure to get a tracking pass.
Also, it's worth noting that if you have a 3D department, there is a piece of software called Mimic for Maya, from Autodesk. By taking real-world measurements, the 3D department can precisely create moves and upload them to a variety of moco systems, as well as bring them into Flame via FBX.
Thanks Tallen for raising this issue, and thanks to all the contributors. I know this is an old thread but I want to expand on it, share, and learn.
I am a VFX sup/VFX producer (ex-Flamer), and on MoCo shoots I diligently capture real-world measurements: XYZ coordinates for multiple static objects in the scene and for the camera start position, plus the MoCo base centre (for a static base) or track position/centre data (when the MoCo is on track), along with the obvious lens/sensor data. I accompany it with annotated photos and have even tried lidar scans. If the rig permits, I get strobe frames for sync. Don't worry, I also get a tracking pass. But in every case, as intimated on this thread, the Flame artist re-tracks the scene, either farming the track out or doing it in Flame or one of the many third-party trackers.
I have tried myself, hopping on Flame and trying to use my own data to line up the FBX, and yeah, guess what? I can't do it either. I end up using a bit of maths, trying to remember Pythagoras, and eyeballing it whilst cursing the VFX sup!! I can get it in the ballpark and in timing sync, and throw a few teapots and planes into the scene to see it is tantalisingly and frustratingly close. The data is right there, but as we know, there is no cigar for close.
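The "bit of maths" is essentially distance checking. A hypothetical Python sketch of that Pythagoras sanity check: compare a surveyed distance between two set markers against the same pair of points measured inside the FBX scene, and the ratio gives you a scale factor to try. All the numbers here are made up for illustration.

```python
import math

def dist(a, b):
    """Straight-line (Pythagorean) distance between two XYZ points."""
    return math.sqrt(sum((p - q) ** 2 for p, q in zip(a, b)))

# Made-up example: two markers surveyed on set vs. the same pair
# measured inside the imported FBX scene.
survey = dist((0.0, 0.0, 0.0), (3.0, 4.0, 0.0))    # 5.0 on set
scene = dist((0.0, 0.0, 0.0), (30.0, 40.0, 0.0))   # 50.0 in the FBX

print(scene / survey)  # scale factor to apply, here 10.0
```

That gets you the scale, but not the rotation or offset, which is why eyeballing still enters into it.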
I am rusty on Flame these days, but that is probably not an excuse for my failing! I am pretty sure I am providing enough data to do it, but my feeling is that you need a whole extra level of nerd genius that is beyond most of us? Would you agree?
Maybe this is the domain of a Maya artist? Has anyone used or received the data in a usable form for Flame via a friendly Maya artist, or via the software mentioned by Sinan, Mimic for Maya from Autodesk?
So to put the question out there, has any Flame artist managed to successfully use the kind of data I am gathering to align an FBX camera? Is it just a lot easier and possibly more accurate to re-track it anyway?
I am interested to hear both artists' and VFX sups' experiences and opinions on this. If I am missing something, I'd also love to hear that.
Collecting that data on set is a time-consuming distraction, and if it's next to useless, I would much rather get the minimum and spend more time focussing on what is being set up and shot.
My company has a full-blown layout department that takes care of this stuff, so I've never had to deal with it, but from a freelancer's point of view I would be interested to see what kind of data we are talking about. Having done a fair amount of layout and camera-tracking work in various software, I am curious about where the complexity lies and whether something like Blender, Nuke, or Python can be of any help for Flame.
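On the Python angle: if you have three or more corresponding points (surveyed positions on set versus where those same points sit in the FBX scene), the alignment itself is a solved least-squares problem, a Kabsch/Umeyama-style fit. Here is a hedged sketch with NumPy; the function name and the example points are mine for illustration, not from any pipeline or Flame API.

```python
import numpy as np

def fit_rigid(src, dst):
    """Least-squares similarity transform mapping src points onto dst:
    dst_i ≈ s * R @ src_i + t (Kabsch/Umeyama solve via SVD)."""
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    S, D = src - mu_s, dst - mu_d
    U, sig, Vt = np.linalg.svd(D.T @ S)
    d = np.sign(np.linalg.det(U @ Vt))   # guard against a reflection
    R = U @ np.diag([1.0, 1.0, d]) @ Vt
    s = (sig * [1.0, 1.0, d]).sum() / (S ** 2).sum()
    t = mu_d - s * (R @ mu_s)
    return R, s, t

# Made-up example: four survey points vs. the same points in a scene
# that is scaled 2x, rotated 90 degrees about Z, and offset in X.
src = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]], float)
Rz = np.array([[0, -1, 0], [1, 0, 0], [0, 0, 1]], float)
dst = 2.0 * src @ Rz.T + np.array([5.0, 0.0, 0.0])

R, s, t = fit_rigid(src, dst)
print(np.allclose(s * src @ R.T + t, dst))  # True
```

The hard part in practice isn't this solve; it's getting clean correspondences between what was measured on set and what the rig's FBX actually represents, which is where the re-track keeps winning.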