This is the first time I'm doing a job with 4448x3096 anamorphic-lens 16-bit EXR sequences inside Flame.
I've been provided with the sequences from the lab, and I want to use Batch FX to do the shots.
The client advised that:
The files will be delivered as uncompressed 16-bit OpenEXR. The color space will be Linear ARRI Wide Gamut. We are not to mess or play with the metadata already embedded inside the EXRs.
Can anyone give a bit of advice about the basic import/display/export workflow, project settings, etc.?
And of course the dos and don'ts. I know it is a lot, but I don't want to mess up this project. :)
Get your viewing LUT set up in Colour Management preferences. Use the ACES transforms to view the footage. Make sure you tag it correctly on import.
As you work, don't be a slave to the linear colour space you have been given. While in a Batch setup, feel free to use the Colour Mgmt Input Transform on the linear footage and jump into LogC and back to linear, depending on the task at hand.
If you are the final step in the process, remove the anamorphic (in my opinion): squash the image and make those pixels square. You might want to enquire about the type of anamorphic lens used; not all are 2:1 anamorphic.
If you are not the final step in the process, then you might need to resupply the shots exactly as you received them. Again, make sure you tag the ratio correctly on import, and then Flame can display the anamorphic correctly.
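To make the squash/stretch concrete, here is a minimal plain-Python sketch of the arithmetic, assuming a 2:1 squeeze factor (which, as noted above, you'd want to confirm with production since not all anamorphics are 2:1):

```python
# Desqueeze arithmetic for an anamorphic source.
# The 2:1 squeeze factor is an assumption; confirm the actual
# lens squeeze with production before committing to a resolution.
def desqueeze(width, height, squeeze=2.0):
    """Return the two square-pixel options for an anamorphic frame.

    Either stretch the width by the squeeze factor (keeps all
    vertical detail) or squash the height by it (keeps the width);
    both yield a 1.0 pixel aspect ratio.
    """
    stretched = (int(width * squeeze), height)
    squashed = (width, int(height / squeeze))
    return stretched, squashed

stretched, squashed = desqueeze(4448, 3096)
print(stretched)  # (8896, 3096)
print(squashed)   # (4448, 1548)
```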
This is amazing, man @PlaceYourBetts. That was exactly what I was scratching my head over.
I will try all of this and let you know.
So @PlaceYourBetts, quick questions:
When you say tag the footage on import, do you mean tagging it as ARRI LogC Wide Gamut / Scene-Linear Alexa Wide Gamut, or ACES? If the latter, which ACES exactly?
OK for the pixel ratio, but do we have to activate the resize after import for it to be taken into account?
Tag them **Scene - Linear Alexa Wide Gamut**.
Not the ACES colour space, but using the ACES transforms. As in, it isn't an Alexa Wide Gamut LUT we use, but an ACES transform that knows what Alexa Wide Gamut needs.
Not sure I understand. Sorry.
Do you mean using the "Use Ratio" option in the view settings?
Cameras obviously capture log, Alexa LogC in this case.
Comp work generally wants to happen in linear, though some tasks work better switched to log and back.
AlexaWG_linear is a working space you could use instead of ACEScg if you wanted.
The Alexa gamut does actually run outside the ACEScg gamut in places, so there is an argument to be made for it.
Either way, using an Input Transform (colorManage node) going from WHAT to WHAT is generally your workflow.
Keep it simple. These working colour spaces contain no gamma curve, so steer clear of gamma curves except for client dailies, which should be done after the shot work. (I usually do those on the timeline, with Open Clips linked to my shot comps.)
Linear to log or the reverse, etc. You definitely want to be in 16-bit; 32-bit is overkill except for data passes like UVs or displacement. When converting between ACEScg (linear) and ACEScc (log) there will be little difference on the round trip. Same for Alexa LogC (log) and AlexaWG_linear (linear).
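On the bit-depth point: 16-bit half float has only a 10-bit mantissa, which is plenty for colour but can visibly quantise data passes like UVs. A quick NumPy sketch of the precision difference:

```python
import numpy as np

# 16-bit half float: 10-bit mantissa, so relative precision ~1/1024.
print(np.finfo(np.float16).eps)   # 0.000977 (i.e. 2**-10)

# A UV coordinate stored in half float snaps to the nearest
# representable value; that error can shift a texture lookup
# by nearly a pixel at plate resolution.
uv = np.float16(0.7)
print(float(uv))                  # 0.7001953125
error_px = abs(float(uv) - 0.7) * 4448  # error in pixels at 4448 wide
print(error_px)                   # ~0.87 pixels

# 32-bit float keeps the coordinate effectively exact.
print(abs(float(np.float32(0.7)) - 0.7) < 1e-7)  # True
```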
The exception to the simple rule above is if you have graphics. Then you'd want a viewTransform (colorManage) to reverse out the gamma curve and convert, so the graphics live in your linear comp space with everything else.
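For the log/linear jumps described above, ARRI publishes the LogC (v3) curve. A sketch of the EI 800 case, with constants from ARRI's published LogC3 formula (Flame's transforms do this for you; this is only to show the shape of the conversion):

```python
import math

# ARRI LogC3 (EI 800) constants, from ARRI's published formula.
CUT, A, B = 0.010591, 5.555556, 0.052272
C, D, E, F = 0.247190, 0.385537, 5.367655, 0.092809

def lin_to_logc(x):
    """Scene-linear value -> LogC3 (EI 800) code value."""
    if x > CUT:
        return C * math.log10(A * x + B) + D
    return E * x + F          # linear toe below the cut point

def logc_to_lin(t):
    """LogC3 (EI 800) code value -> scene-linear value."""
    if t > E * CUT + F:
        return (10 ** ((t - D) / C) - B) / A
    return (t - F) / E

# 18% grey in scene-linear lands at ~0.391 in LogC3 EI 800,
# and the round trip is lossless to floating-point precision.
print(round(lin_to_logc(0.18), 3))              # 0.391
print(round(logc_to_lin(lin_to_logc(4.0)), 6))  # 4.0
```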
"Ratio On" in the player will give you the stretch so the image displays correctly.
Sometimes tracking gets confused by the squashed image. In those cases, physically stretching the image (usually 2:1) can help the track; then squash it back to the original size so it matches your original plate.
Hope that helps a bit?
Inside Colour Management in the preferences, for the project working colour space, do I leave it Unknown, LogC (v3 EI800) Alexa Wide Gamut, or Scene-Linear Alexa Wide Gamut?
Don't leave it tagged Unknown.
Anything in the actual prefs is simply a default.
Use a colorManage node set to Tag Only to let Flame know what it is.
LogC is native from the sensor.
I find it helpful to enable a view rule under camera/Alexa rendering. It’s another way to see an SDR version of the HDR data in the plate.
Alexa Wide Gamut linear would be an alternative working space to ACES.
Although most comp tasks are better handled in linear, there are a few notable exceptions like grain/degrain.
Generally, comp in linear; usually they will want the comp delivered back in log to match the non-VFX plates.
Does that make sense?
I think in this case it's film scans? So there's no native log; I think the scanner natively does linear, although I'm not 100% sure.
But yeah, this is basically just a regular linear workflow, so everything that's been said makes sense. You can watch my Logik Live on ACES; it covers all the view transform topics, how scene-referred data needs to be handled, and why.
Working with graded (display-referred) material is like buying a frozen pizza and changing it into something else (adding toppings).
Working with scene-referred data (log, linear, or whatever) is like buying the ingredients and baking the pizza yourself. Can you put cheese into the crust of a frozen pizza?
If it was shot on film and scanned, it would usually be ADX10 Cineon?
The resolution alone suggests a digital source.
Scanners, like sensors, are usually log.
Reading above, you were given source plates that were already EXR?
So they were pre-converted into linear for you?
Not impossible, but uncommon.
Was the grade already done or not?
If you hit Bypass at the lower left of the viewer, do the source plates become flat (LogC) or super dark (linear)?
If Bypass looks identical on and off, then there is a gamma curve baked in. On a feature this would be very unusual, since they'd prefer to do colour afterwards.
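The "super dark" symptom is simply what raw scene-linear data looks like on a display with no view transform. A rough illustration, using the sRGB encoding curve as a stand-in for whatever Rec.709/ARRI view transform the show actually uses:

```python
def srgb_encode(x):
    """Apply the sRGB display encoding to a scene-linear value.

    A stand-in for a proper view transform, just to show why raw
    linear looks dark: mid-grey sits at 0.18 in scene-linear, far
    below the ~0.46 code value a display expects for mid-grey.
    """
    if x <= 0.0031308:
        return 12.92 * x
    return 1.055 * x ** (1 / 2.4) - 0.055

# Raw linear mid-grey displayed as-is reads as a deep shadow;
# through the display encoding it lands where mid-grey should.
print(0.18)                         # raw linear: looks dark on screen
print(round(srgb_encode(0.18), 3))  # 0.461: correct-looking mid-grey
```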
Yes, they were already EXR, with a nice little .ccc that should match the EXR to what they were seeing on set in terms of a "LUT", which I can't make work to look like their reference QT. So they were not graded.
I am not sure if they were already converted into linear, but I can see them flat when I apply a Colour Mgmt node, like this.
To answer your last question: when I bypass, they become super dark and contrasty, like this,
compared to this look.
Allow me to re-address the main problem behind this post.
I was trying to colour match my EXR output (as a QT) with their reference QT. They provided a .ccc file that gets part of the way, but it doesn't match no matter how I apply it: convert to Rec.709 and apply it, or convert to LogC and apply it; either it doesn't work or it doesn't give satisfactory results.
Since time is short, I used Nuke's MatchGrade between my QT output and their QT, and the results are ridiculously AMAZING. I generated a 3D LUT and a .cc from that node to use in Flame via a Look node. The result was OK, except for the highlights, which became black.
Any idea why this is happening, and is there any technique that does good colour matching between two clips inside Flame?
Do send me the files; happy to have a look.
Yeah, usually from scanners it's ADX10, but I've gotten linear EXRs before.
Actually, from DI places I always get linear EXRs; this is very normal for episodic VFX work. Netflix always sends ACES 2065.
The black spots will be a problem with the LUT not working on data higher than 1.0, or something like that, but that's really not the correct way to do this anyhow.
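That clamping behaviour is easy to demonstrate: a display LUT is only defined over a [0, 1] domain, so HDR highlights are clamped at the lookup stage and everything above 1.0 collapses to a single output value. A toy 1D LUT sketch (the gamma-shaped LUT here is made up for illustration):

```python
# Toy 1D LUT defined over [0, 1] only, like a typical display LUT.
# The 2.2-gamma shape is an arbitrary illustration, not a real show LUT.
LUT_SIZE = 17
lut = [(i / (LUT_SIZE - 1)) ** (1 / 2.2) for i in range(LUT_SIZE)]

def apply_lut(x):
    """Look up x in the LUT, clamping the input to the LUT domain."""
    x = min(max(x, 0.0), 1.0)        # HDR values get clamped here
    pos = x * (LUT_SIZE - 1)
    lo = int(pos)
    hi = min(lo + 1, LUT_SIZE - 1)
    frac = pos - lo
    return lut[lo] * (1 - frac) + lut[hi] * frac

# A bright highlight (4.0) and a much brighter one (10.0) become
# identical: the LUT has destroyed everything above 1.0.
print(apply_lut(4.0) == apply_lut(10.0))  # True
print(apply_lut(4.0) == apply_lut(1.0))   # True
```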
My best guess is still a custom show-grade Rec.709 LUT that was used but not provided.
The screenshots definitely look like linear EXRs, so at least that much is correct.
Agree w Finn.
Definitely look linear. One less variable.
With CDLs and CCCs, the usual workflow for dailies would be: at the end of your comp, add the CDL (.ccc), then add another colorManage (view LUT) that goes from source to Rec.709.
I usually do those on my timeline to keep them separated from the comp work itself,
but you could do it in Batch as well.
A colour match isn't something I'd prefer to do in this case. It becomes another variable.
Plate plus CDL plus Rec.709 should equal the result. The workflow might be:
Linear plate > colorManage (Input Transform) from AlexaWG linear to LogC.
Then put the CDL (.ccc) on that.
Then convert from LogC to Rec.709.
Usually that's the match.
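The CDL step in that chain is just the ASC CDL formula applied per channel: out = (in x slope + offset) ^ power, followed by a saturation adjustment. A sketch with made-up slope/offset/power values (the real numbers live in the client's .ccc file):

```python
# ASC CDL per-channel math: out = (in * slope + offset) ** power,
# followed by a saturation adjustment using Rec.709 luma weights.
# All grade values below are made up for illustration; the real
# ones come from the client's .ccc file.
def apply_cdl(rgb, slope, offset, power, saturation=1.0):
    graded = []
    for v, s, o, p in zip(rgb, slope, offset, power):
        v = v * s + o
        v = max(v, 0.0) ** p       # power is undefined for negatives
        graded.append(v)
    # Saturation: push each channel toward/away from Rec.709 luma.
    luma = 0.2126 * graded[0] + 0.7152 * graded[1] + 0.0722 * graded[2]
    return [luma + saturation * (v - luma) for v in graded]

# An identity CDL leaves the pixel untouched.
print(apply_cdl([0.4, 0.5, 0.6], (1, 1, 1), (0, 0, 0), (1, 1, 1)))
# A hypothetical grade: lift and warm the pixel slightly.
print(apply_cdl([0.4, 0.5, 0.6], (1.1, 1.0, 0.95), (0.02, 0.0, -0.01), (1, 1, 1)))
```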
If you post the files for Finn,
send me a link and I'll verify as well.
Sure, I will send a sample to both you and Finn in a private message.
In the sample I sent, the .ccc file is the one provided by the client; the other two I generated from Nuke to test an alternative solution.
What was the end result? I still think you were missing a show LUT, as the K1S1, ACES 709, and newer ARRI 709 LUTs did not match 100% to the ref.
But yeah, the source was probably actually ACEScg.