LatLong180 Viewing

Could anyone point me at a short tutorial on the proper way to view lat-long 180s in Flame? I don’t think this should be tricky, but I need to be able to see whether this stuff looks right.

Not sure specifically what you’re asking, but there is a Map Convert node in Batch. Are you trying to look at the Lat-Long as Angular or Spherical?

The same rules should apply for 180 for cleanup etc., and to view it in 3D you can pipe it into an Action using an IBL.

If I’m not mistaken, the trouble is that Flame needs all lat-long/spherical-type images to be 360, so the first thing you need is an ST map to convert a 180 into a 360, at which point it’ll work with the various tools in Flame.


The original question about lat-long 180° implies you have a half-sphere covering floor to sky, so just extend the canvas and treat it like a lat-long 360.

If you are talking about having half the sky and the floor as well (a vertical hemisphere), just duplicate and, again, treat it like a 360. There will be seams; you could choose to fix them if needed, but it should be quite simple.

Given that you have spherical transformation tools in Flame, you won’t need ST maps unless you want to do something funky.
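As a toy illustration of the canvas extension described above — assuming a horizontal 180 that covers the full vertical range but only half the longitudes, with the image represented as a plain list of pixel rows (function and argument names are mine, not Flame’s):

```python
def extend_latlong_180_to_360(rows, fill=0):
    """Pad a 180-degree lat-long image out to a 360-degree canvas.

    A horizontal 180 covers every latitude but only half the longitudes,
    so the canvas simply doubles in width; the missing half is filled
    with a constant colour (black by default).
    """
    width = len(rows[0])
    return [list(row) + [fill] * width for row in rows]

# A 2x2 "image" becomes 2x4, with the new right half black.
padded = extend_latlong_180_to_360([[9, 9], [9, 9]])
# padded == [[9, 9, 0, 0], [9, 9, 0, 0]]
```

In a real conversion you would pad with the alpha or a hold colour rather than black, but the geometric point is just that the 180 occupies half of a 360 canvas.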


So if I have a 180 pic or video like this, I don’t need an ST map? (I know this is not lat-long, and I’m probably misunderstanding Carl’s question by assuming the 180 capture was made with a fisheye lens that produces an image like this. But anyway, if there is a non-ST-map way to make these play nicely with Flame’s 360/VR tools, that’d be great.)

You can do the conversion easily in PTGui, Photoshop or Nuke, and I’m sure in Flame too.

If I was doing it in PTGui with one single image, and assuming it’s an 8mm lens on a 35mm camera like a 5D Mark II, this is what I would get; if you put this as a Panorama/HDRI in your scene you would get half a room, as you would expect.


I’m less sure on the Flame front. Haha. The 360/VR tools are pretty limited compared to Nuke’s offering.

Can you punch in the expressions, perhaps?

These are the maths:

I’ll give you the equations for converting from fisheye lens coordinates to equirectangular (latitude-longitude) panorama coordinates.

Fisheye to Equirectangular Conversion

Given:

  • Fisheye image with center at (cx, cy) and radius R

  • Point in fisheye image: (xf, yf)

  • Output equirectangular image dimensions: (width, height)

Step 1: Convert fisheye pixel to normalized coordinates

dx = xf - cx
dy = yf - cy
r = sqrt(dx² + dy²)
theta = r / R * (FOV / 2)

Where FOV is the field of view of the fisheye lens (typically 180° = π radians for a full fisheye).

Step 2: Convert to 3D direction vector

phi = atan2(dy, dx)
x = sin(theta) * cos(phi)
y = sin(theta) * sin(phi)
z = cos(theta)

Step 3: Convert 3D vector to equirectangular coordinates

longitude = atan2(y, x)
latitude = atan2(z, sqrt(x² + y²))

Step 4: Map to output pixel coordinates

u = (longitude + π) / (2π) * width
v = (π/2 - latitude) / π * height

The resulting (u, v) coordinates give you the position in the equirectangular panorama that corresponds to the fisheye pixel at (xf, yf).
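The four steps above can be collected into one function — a minimal Python sketch (function and argument names are mine):

```python
import math

def fisheye_to_equirect(xf, yf, cx, cy, R, fov, width, height):
    """Map a fisheye pixel (xf, yf) to equirectangular coordinates (u, v).

    cx, cy, R describe the fisheye image circle; fov is the lens field
    of view in radians (pi for a full 180-degree fisheye).
    """
    # Step 1: normalized polar coordinates in the fisheye image.
    dx, dy = xf - cx, yf - cy
    r = math.hypot(dx, dy)
    theta = r / R * (fov / 2)
    # Step 2: 3D direction vector.
    phi = math.atan2(dy, dx)
    x = math.sin(theta) * math.cos(phi)
    y = math.sin(theta) * math.sin(phi)
    z = math.cos(theta)
    # Step 3: spherical angles.
    longitude = math.atan2(y, x)
    latitude = math.atan2(z, math.hypot(x, y))
    # Step 4: output pixel coordinates.
    u = (longitude + math.pi) / (2 * math.pi) * width
    v = (math.pi / 2 - latitude) / math.pi * height
    return u, v

# The centre of a full 180-degree fisheye lands on the top edge (the pole).
u, v = fisheye_to_equirect(500, 500, 500, 500, 500, math.pi, 2000, 1000)
# u == 1000.0, v == 0.0
```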

Note: for the reverse mapping (which is what you actually use to generate the output image), you iterate through each pixel in the output equirectangular image and sample from the fisheye image using these equations in reverse.


Or in Nuke:

Yeah, Flame’s map conversion and 360 toolset never got that developed. It functions: Map Convert can be useful, the 360 viewer can be useful, but as soon as you want to do anything remotely offbeat it shrugs and you’re out of luck.