Ok.
The Zeiss app let me call for ST maps manually, and in many cases I could capture 50+ focus distances per lens. I could then verify the ST maps for accuracy on the VFX side.
3DEqualizer can convert those to raw distortion values, which for spherical lenses usually boil down to k1 and k2 (distortion and quartic distortion). The other three values (k3, p1, and p2) are near zero most of the time unless the mount isn't aligned correctly.
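For the spherical case, the radial part of that model reduces to a simple polynomial in r². A minimal sketch (the function name and coordinate convention are mine, not 3DE's API; it assumes lens-centered, normalized coordinates):

```python
def distort_radial(x, y, k1, k2):
    """Apply radial distortion using only the 'distortion' (k1) and
    'quartic distortion' (k2) terms, as in a spherical-lens solve.

    (x, y) are lens-centered, normalized coordinates."""
    r2 = x * x + y * y
    scale = 1.0 + k1 * r2 + k2 * r2 * r2
    return x * scale, y * scale
```

With k1 and k2 at zero this is an identity mapping, which is a handy sanity check when wiring the values into another app.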
The raw values can be entered into Unreal and mapped to distance, so each lens is covered from minimum focus to infinity (open to close). This is much more interactive than blending between 50 32-bit ST maps in real time.
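Mapping the raw values over distance just means interpolating two scalars per sample instead of blending full-resolution maps. A sketch of that idea, keyed by inverse focus distance so infinity is a clean entry (the k1/k2 numbers here are made-up placeholders, not real measured values):

```python
import bisect

# Hypothetical per-lens table keyed by inverse focus distance (diopters),
# so infinity focus becomes a clean 0.0 entry. These (k1, k2) pairs are
# made-up placeholders, not real measured Zeiss values.
SAMPLES = sorted({
    0.0:      (-0.048, 0.004),  # infinity
    1 / 3.0:  (-0.051, 0.005),
    1 / 1.0:  (-0.065, 0.008),
    1 / 0.45: (-0.082, 0.011),  # close focus
}.items())
KEYS = [k for k, _ in SAMPLES]

def coeffs_at(focus_m):
    """Linearly interpolate (k1, k2) for an arbitrary focus distance in meters."""
    d = 0.0 if focus_m == float("inf") else 1.0 / focus_m
    d = min(max(d, KEYS[0]), KEYS[-1])  # clamp to the measured range
    i = bisect.bisect_left(KEYS, d)
    if KEYS[i] == d:                    # exact sample hit
        return SAMPLES[i][1]
    (d0, (a0, b0)), (d1, (a1, b1)) = SAMPLES[i - 1], SAMPLES[i]
    t = (d - d0) / (d1 - d0)
    return (a0 + t * (a1 - a0), b0 + t * (b1 - b0))
```

Blending in diopter space rather than meters also keeps the spacing between samples sensible near infinity.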
Houdini is my ground truth for CG. Unreal is a great app, but it's very Wild West. Placing the same data into Houdini let me map points along a curve generated by calculating focal distance / (focal distance - focal length), which gives the falloff to infinity for any lens.
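That curve can be sampled directly; a quick sketch of the same formula, with a worked example for an assumed 50 mm lens:

```python
# The curve from the post: breathing factor = focal distance / (focal distance - focal length).
def breathing(focus_mm, focal_length_mm):
    """Relative magnification change as a lens racks focus (thin-lens approximation)."""
    return focus_mm / (focus_mm - focal_length_mm)

# Example: a 50 mm lens focused at 1 m vs. effectively infinity.
near = breathing(1000.0, 50.0)       # 1000/950, roughly 1.05
far  = breathing(1_000_000.0, 50.0)  # the curve falls off toward 1.0 at infinity
```

Sampling this over the focus throw reproduces the falloff shape for any focal length without per-lens measurement of the breathing itself.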
Houdini and Flame can also use the raw distortion values instead of ST maps, and interactivity is much improved that way. I could bring my raw values into Flame, then verify they were working correctly by checking against the original ST map.
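The verification step amounts to evaluating the raw-coefficient model on a grid of UVs and comparing against the corresponding ST map pixels. A sketch of that idea; `stmap_lookup` is a stand-in for however you sample the 32-bit EXR, and the centered-UV convention is an assumption:

```python
def verify_against_stmap(stmap_lookup, k1, k2, tol=1e-3, n=8):
    """Compare a k1/k2 radial model against an ST map on an n x n UV grid.

    stmap_lookup(u, v) -> (u', v') is a stand-in for sampling the map;
    returns (passed, worst_error)."""
    worst = 0.0
    for j in range(n):
        for i in range(n):
            u, v = i / (n - 1), j / (n - 1)
            x, y = u - 0.5, v - 0.5            # center the UVs
            r2 = x * x + y * y
            s = 1.0 + k1 * r2 + k2 * r2 * r2   # radial model
            du, dv = x * s + 0.5, y * s + 0.5  # back to UV space
            su, sv = stmap_lookup(u, v)
            worst = max(worst, abs(du - su), abs(dv - sv))
    return worst <= tol, worst
```

A real check would also need to account for the map's aspect-ratio normalization, which this sketch ignores.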
Flame doesn't currently have a way to create a data stream like a lens model, so I used timelines to map distortion and vignette, as well as metadata, over the focus throw of the lens.
Flame 2025 adopted the 3DE model for its new lens distortion tool, which made it easier to apply the same data cleanly.
Depending on the lens, the dynamic distortion may be more or less obvious. In my recent post with the Flame 2025 setups, the lens breathing might not be so obvious because the lenses in that quick demo were longer. The distortion is still present and dynamic, as you can see from the values in the UI. The Dropbox link on that clip includes a couple of wider lenses, which should show the effect more prominently.
Hope that helps?
A