So many time warped shots. So much lazy use of fit to fill by editors.
I need to find a solid workflow for working with CG and time warps.
So it is generally a bad idea to try and get a camera track from footage that has had a TW applied. The motion analysis makes a big mess of pixels even if the plate doesn’t look too bad to the eye.
Happy to track shots with the TW removed, but what if my TW is something like 235%?
At the moment we do the full length and then TW the comp. At 235%, more than half the frames get thrown away. Very frustrating, especially if there is frame-by-frame painting involved.
How can I get my TW data over to the tracked camera so that the camera gets the TW? Then all of the CG gets rendered using the faster camera and I only comp what I need.
I have tried supplying the 3D department with frame numbers burnt into the plate. When motion analysis is turned on, this makes for very amusing number morphs. When the TW is linear, I can sometimes give the 3D department a start and an end frame number.
Following the Interpolation / Extrapolation of Animation Curves thread, I thought about taking my frame-based TW and baking the animation. I then exported the data via an x.position axis as a raw file from Action. That looks good: I have a text file of my TW values. We are currently writing a MEL script to interpret the values.
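Not the MEL script itself, but roughly what it needs to do, sketched here in Maya Python instead. The file layout (one source-frame value per line, line 1 = output frame 1) and the camera names trackedCam / twCam are all assumptions:

import maya.cmds as cmds

CHANNELS = ['translateX', 'translateY', 'translateZ',
            'rotateX', 'rotateY', 'rotateZ']

def bake_timewarped_camera(tw_file, tracked='trackedCam', retimed='twCam', start_frame=1):
    # One source-frame value per line of the exported TW data (assumed layout).
    with open(tw_file) as f:
        source_frames = [float(line) for line in f if line.strip()]
    for i, src in enumerate(source_frames):
        out = start_frame + i
        for ch in CHANNELS:
            # Evaluate the tracked camera at the source frame the TW reads...
            value = cmds.getAttr('%s.%s' % (tracked, ch), time=src)
            # ...and key the retimed camera at the output frame.
            cmds.setKeyframe(retimed, attribute=ch, time=out, value=value)

The point is just that the TW data is a lookup from output frame to source frame, so the retimed camera gets baked by sampling the tracked one at whatever frame the TW reads.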
Track your full-length shot. Rebuild your TW curve as Timing and paste that onto an axis somewhere you can get at it with expressions. Then make a new camera and feed its position with eval(), using the pasted timing curve and the tracked camera. If you feel fancy, you can even export that axis and camera as an FBX to your CG artist and have them use the curve to drive the shutter angle on their camera for correct motion blur. We did this on a 1200-frame Phantom shot that got timewarped down to 3 seconds in the final edit and it worked great. HTH!
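On the shutter-angle part, the relationship assumed here (it isn't spelled out above) is that the CG shutter should be the plate shutter divided by the local speed of the retime, i.e. by how many source frames each output frame spans. A throwaway Python sketch of that idea:

def cg_shutter(tw, frame, plate_shutter=180.0):
    # tw(f) -> source frame read at output frame f, i.e. the Timing curve.
    speed = tw(frame + 1) - tw(frame)        # local slope of the TW curve
    return plate_shutter / max(speed, 1e-6)  # a 300% speed-up gives 60 degrees

print(cg_shutter(lambda f: 3 * f, frame=1))  # 60.0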
In the old days I did this:
Have two cameras.
They must be the same kind of camera.
i.e. two legacy cameras or two camera_3d nodes.
You paste your animated key frames into the tracked_cam and you pipe those key frames into your tw_cam.
You use the axis to mirror the key frames of your time warp curve.
e.g. a 10-frame clip starting at frame 1 and going to frame 10.
The time warp maps source frame 3 to output frame 1, frame 6 to 2, and frame 9 to 3, yielding a 3-frame clip that runs at 300% (there's a little sketch of this below).
So in action, your expression should be applied to tw_cam so that it affects all animation channels, and it should look something like:
eval (tracked_cam, timewarp_axis.position.x)
Something like that
I’m not in front of a flame right now and my brain is old.
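A minimal plain-Python sketch of that 10-frame / 300% mapping (the tracked values are made up):

# Output frame -> source frame, i.e. what the timewarp axis stores.
timewarp_axis = {1: 3, 2: 6, 3: 9}

# Made-up tracked_cam x positions for source frames 1..10.
tracked_cam_x = {f: 0.5 * f for f in range(1, 11)}

# The tw_cam just copies the tracked_cam's value at the frame the TW reads,
# which is all the eval() expression above is doing.
tw_cam_x = {out: tracked_cam_x[src] for out, src in timewarp_axis.items()}
print(tw_cam_x)  # {1: 1.5, 2: 3.0, 3: 4.5}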
On re-reading:
The tracked camera should have all the tracked key frames.
The tw_cam reads those key frames and mimics them on a per-frame basis.
The frame to read is governed by the timewarp axis.
The key frames for the timewarp can be copied directly from a Timewarp node or Timeline FX, so you can be sure that you’re applying the correct temporal displacement.
It takes more words to write an explanation than the ninety seconds it would take to show you in context
I put up a simple batch that has the broad strokes laid out. The key is also to floor your timewarp curve so that you always evaluate your tracked camera’s values at an integer frame… e.g. if the timewarped curve has a value of 14.5 at frame 3, the tracked camera needs to be evaluated at frame 14, not 14.5.
Edit: Floor the timewarp curve if you need integer frames. As Phil points out, if you’re doing motion estimation you will most likely need float evaluations.
Sometimes it is necessary to permit float values, especially when you’re attempting to match those pesky motion estimation timewarps that so often appear from diligent, detail-oriented editorial departments.
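For what it's worth, the floor-versus-float choice boils down to something like this (Python sketch):

import math

def source_frame(tw_value, integer_frames=True):
    # Snap to a whole source frame when the tracked camera is keyed per frame;
    # keep the float when matching a motion-estimated timewarp.
    return math.floor(tw_value) if integer_frames else tw_value

print(source_frame(14.5))         # 14  -- the 14.5 -> 14 example above
print(source_frame(14.5, False))  # 14.5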
I might nail it first time. Stranger things have happened. I’ve had a good night’s sleep and my brain is much fresher now. Maths requires a lot more effort than it used to.