Mixed Color Spaces: Invert ODTs or not?

Thanks @randy :call_me_hand:


For what it’s worth, a pair of CM nodes with down/up (down input transform and up inverted ODT) seems to do a decent job if your provided material was converted in a way that would require an untonemapped display (gamma corrected in Flame).
There’s also something interesting on ACES Central about using a down/up through 2084 instead of video or sRGB. I haven’t tried it though.
I haven’t been able to create a LUT good enough to replace the down/up pair and share across platforms yet, but I still like the idea. Lattice gave something that was almost there, but it clamps highlights. The LUT was exported at size 64 and 16 bits, but once imported in Flame it shows as size 64, 12 bits. Everything looks fine … except for the ugly clamp :slight_smile:
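(Aside for anyone who wants to poke at the down/up pair outside Flame: a minimal sketch using OCIO’s Python bindings. The config path and colorspace names are assumptions — they match the ACES 1.0.x-style OCIO configs, where the output transform exists as a colorspace; OCIO v2 configs expose it as a display/view instead — so substitute whatever your config uses.)

```python
import PyOpenColorIO as OCIO

# Placeholder path; colorspace names below are assumptions --
# list yours with config.getColorSpaceNames().
config = OCIO.Config.CreateFromFile("config.ocio")

# "Down" into display-referred Rec.709 and "up" through the inverted
# output transform, back to scene-referred ACEScg, in one processor.
down_up = config.getProcessor("Output - Rec.709", "ACES - ACEScg")
cpu = down_up.getDefaultCPUProcessor()

print(cpu.applyRGB([0.5, 0.5, 0.5]))  # one video-grey pixel through the inverse ODT
```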

So I set up a master grade, working on calibrating patterns and captures (synthetic chart from AMPAS samples, Sony sample …). I’ll use that as a starter if I want custom settings for future jobs, instead of CM pairs, or a LUT if I get there.

This, applied on the untonemapped conversion to ACEScg, gives something very close to the inverted ODT, but with controllable curves, hopefully allowing me to avoid certain artefacts or shifts.


Thanks @allklier.
Not sure that it’s even possible to achieve (‘correct’ math and an accurate visual result within the current state of ACES, and maybe more specifically the ACES 1.0 SDR display tonemap), but a tool that would still apply the needed inversion of the tonemap, BUT … with user controls to help steer away from the most known/common issues related to inverted ODTs … would be handy. Like the gamut compression tools - ish.
Matchbox anyone? @Slabrie & @fredwarren? @doug-walker?
I mean, this or completely rewriting the ACES display thingy … :partying_face:
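In the meantime, here’s a rough numpy sketch of the kind of control I’d imagine: blend between leaving the (display-linear) pixels alone and the fully untonemapped result, with a soft highlight roll-off so the inversion can’t explode. The `inverse_odt` callable and all the constants here are made up purely for illustration:

```python
import numpy as np

def soft_clip(x, knee=0.8, limit=4.0):
    """Roll off values above `knee` toward `limit`, instead of letting them blow up."""
    return np.where(x <= knee, x,
                    knee + (limit - knee) * np.tanh((x - knee) / (limit - knee)))

def controlled_untonemap(rgb, inverse_odt, amount=1.0):
    """amount = 0: plain linearized pixels; amount = 1: full inverse tonemap."""
    rgb = np.asarray(rgb, dtype=float)
    inverted = soft_clip(inverse_odt(rgb))
    return (1.0 - amount) * rgb + amount * inverted

# Toy stand-in for a real inverse tonemap, purely illustrative:
print(controlled_untonemap([0.1, 0.5, 1.0],
                           inverse_odt=lambda x: x / np.maximum(1.0 - 0.8 * x, 0.05),
                           amount=0.7))
```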


Have you tried the Resolve CST node? Yesterday I was playing with it on some 709 drone footage on an ACES timeline. Fiddling with the tone mapping options got me really close to an inverted ODT … but without the artifacts.


Sounds interesting. I’ll try to look tomorrow. Thanks for the tip @milanesa!

Hah, dealing with this is so tiring. I seem to have the same conversation every month with a German supermarket chain. “Our logo and brand colours MUST be exactly like the delivered graphics, but should also look like they belong in the scene, but also MUST look exactly like the delivered graphics, but also…” I have just about given up at this point and accepted that what I deliver looks like dogshit. At least they pay on time.


One question that may be a bit silly: is this a problem specific to Flame and its implementation of color management? I have been looking for similar discussions around Nuke and can hardly find anything. People only ask how to set up input transforms correctly; I don’t see discussions similar to this one.

Reading the information that appears in the color management node, I never understood whether those colour transforms were developed specifically for Flame. I wonder why Flame didn’t simply use OCIO and follow the same guidelines as the rest of the industry.

This is how to fix this issue:

  1. Do a version that matches their technical requirements but looks perceptually wrong.

  2. Do a version that looks perceptually right but doesn’t meet their technical requirements.

  3. Blend version 1 over version 2 at 50% opacity.
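(Step 3 really is just a mix — sketched in Python for the avoidance of all doubt:)

```python
import numpy as np

def client_compromise(v1_technical, v2_perceptual, mix=0.5):
    """Blend version 1 over version 2 at `mix` opacity."""
    return mix * np.asarray(v1_technical) + (1.0 - mix) * np.asarray(v2_perceptual)

print(client_compromise([1.0, 0.0, 0.0], [0.8, 0.1, 0.1]))  # -> [0.9, 0.05, 0.05]
```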


That is definitely on my mind. We also don’t have these debates as much on our colorist Discord, which mostly works in Resolve. Flame has definitely taken a different approach here. I understand the argument for it, but it may not be working so well in practice, as we see in this thread.

@Stefan I meant that offer to investigate sincerely. I’m a member of the IMAGO technical committee, which just last week made its Photon Path work public: "Photon Path" And "Glossary" A New Tool To Understand And Teach How A Digital Image Is Processed. – IMAGO. It’s an effort to properly document the flow of image information from the camera lens all the way through to the display. The committee spans the spectrum of disciplines and includes folks from the Academy, and specifically ACES, as well. So if there is a real issue, that would be the place to raise it and get a more concrete understanding and solution. But before I can bring it up there, I need to find a good example and decipher and frame it properly.

And if it turns out to be more of a Flame color management / workflow thing, maybe we can figure out the right process or add to the toolset to level the playing field with the other apps.

This is an issue that is not unique to Flame.

This is an issue that is more prevalent for Flame artists because of our roles as leads and VFX supervisors, speaking on behalf of the companies we represent to clients.

This is, at its heart, a physics problem. Taking pictures of pictures that must match the original picture is hard.


@allklier or anyone else interested, here are ‘equivalent’ Nuke and Flame setups to play with this thing if needed. There are notes on where to get the source images (AMPAS and ARRI).
The v2 Flame setup has extra GFX with saturated colors that are problematic with inverted ODTs.
Flame: ACES 1.0 color policy.
Nuke: the OCIO config should be the VFX studio ACES 1.3 OCIO 2.1 config.
demonstrate_ODT_tonemap.zip (560.3 KB)

The blur is there to give a sense of how defocus will react (blooming, edges …)

The text that says CMY GFX should be RGB GFX

The following screengrabs are oversaturated and dark, sorry (Teradici, Mac with an Eizo set to Rec709 … meh)

[screengrabs omitted]

Thanks, downloaded and will look at it in detail over the weekend.

I’ll vouch that Stefan knows his stuff.
Also he’s incredibly nice to deal with.
:wink:
A


@Stefan your test case is very helpful, and I’ve made some progress. The Nuke side is a bit simpler than the Flame side because of the different color management approaches, including how Action handles color management (I temporarily changed it to comp nodes to improve transparency).

One issue is that the inputs are not comparable. The AMPAS folder has the still life in both ACEScg and Rec709 versions, as you used them in your test case. Yet when you read them into Nuke (or Flame) and apply the appropriate color transforms, the ACEScg version still has different contrast and saturation (if you wipe between the two read nodes).

I’m assuming that some tone mapping is baked into the Rec709 version of the AMPAS file. The same contrast/saturation difference shows up when wiping the read nodes for the synthetic chart files from the AMPAS folder. On the other hand, if you take the ACEScg file, write it out to Rec709, and read it back, they match. Not surprising, as this is a null test of the Nuke color transforms. But it proves that there is an additional variable in the supplied Rec709 file.
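For anyone who wants to reproduce that null test outside of Nuke, here’s a minimal round-trip sketch with OCIO’s Python bindings. The config path and colorspace names are assumptions (ACES 1.0.x-style naming; OCIO v2 configs expose the output transform as a display/view instead), so adapt them to your config:

```python
import PyOpenColorIO as OCIO

config = OCIO.Config.CreateFromFile("config.ocio")  # placeholder path

# ACEScg -> display-referred Rec.709, and the inverse back again.
fwd = config.getProcessor("ACES - ACEScg", "Output - Rec.709").getDefaultCPUProcessor()
inv = config.getProcessor("Output - Rec.709", "ACES - ACEScg").getDefaultCPUProcessor()

src = [0.18, 0.18, 0.18]                      # mid grey in ACEScg
back = inv.applyRGB(fwd.applyRGB(list(src)))
# If these match (within the range the SDR transform doesn't clip),
# the transform chain itself is transparent.
print(src, back)
```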

So we have to distinguish two problems. One: is this a transparent multi-colorspace workflow (i.e. if we had comparable inputs into your comp, could we combine them truthfully with appropriate transforms)? The other: are you working with comparable material, and is that realistic?

On the first one, I’d like to find (or create) an asset that is free of tone mapping, so we can prove that the comp workflow is transparent (that it does not add its own texture/flavor).

On the second one, that is what we discussed higher up in the thread. If the Rec709 still life has tone mapping baked in that didn’t exist in the camera original, then comping Rec709 material into a bigger colorspace cannot be pixel accurate. It can only be perceptually accurate, which requires the compositor or colorist to modify the tone and color so that it matches visually. And that may not be exactly what the camera saw, because it depends on the context of the composite.

Incidentally, in your Nuke comp, if you uncheck tone mapping on the read node for the reverse ODT, it matches the Rec709 image. Which is a hint that the tone mapping of the Rec709 file is a critical detail here. There is no such checkbox in Flame.

That said, this scenario is somewhat unusual, because we’re comparing a single camera capture that has traveled different paths and examining the results at the end. In most real-life scenarios we’re comping two separate pieces of material.

As for the color bars you have (more an example of graphic elements), I don’t see any issues bringing them in properly so far.

More to come…

@allklier, the Flame and Nuke results should match.
In Nuke, are you using the correct OCIO config? Have you tried just changing the file path on the read node, as opposed to creating a new read node (to make sure it’s set up the same way)? The video Rec709 should also match what you get in your OS preview or whatever viewer app you use.

@Stefan Yes, I just used path replacement in Nuke and clip replacement in Flame, and re-verified that I’m using the correct OCIO configs, as you stated at the top.

Here’s a simplified example (before we even get into the compositing):

Reading three files from the AMPAS folder into four Read nodes in Nuke (a quick scripted version follows the list):

Read 1: ODT/Rec709_100nits_dim
Read 2: ACEScsc/ACEScg
Read 3: ACES/
Read 4: RAW + Inverse Display Rec709 2.4
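If anyone wants to rebuild those reads quickly, here’s a small Nuke Python sketch. The file names are placeholders, and the colorspace knob strings depend on your OCIO config, so treat them as assumptions:

```python
import nuke

# Placeholder file names; point these at the AMPAS sample files.
# Colorspace strings are assumptions -- match them to your config's entries.
reads = [
    ("still_life_rec709.tif", "Utility/Gamma 2.4_Rec709_texture"),  # Read 1
    ("still_life_acescg.exr", "ACEScg"),                            # Read 2
    ("still_life_aces.exr",   "ACES"),                              # Read 3
]
for path, colorspace in reads:
    node = nuke.nodes.Read(file=path)
    node["colorspace"].setValue(colorspace)

# Read 4: the same Rec709 file read raw, with the inverse display
# transform applied downstream instead of an IDT on the Read.
read4 = nuke.nodes.Read(file="still_life_rec709.tif")
read4["raw"].setValue(True)
```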

I would have preferred to start with the camera original, which they also provide, but Nuke no longer includes the slog1 IDT, so we have to use their pre-rendered version in the ACES folder in AP1.

Read 1 (same file you used in your test) uses the Utility/Gamma 2.4_Rec709_texture IDT (same as your example, and the logical choice to read a Rec709 image back in)

Read 2 (the ACEScg file from AMPAS; your test uses a ‘converte_Flame’ file I don’t have) uses the ACEScg IDT

Read 3 uses the ACES/AP1 IDT

In the screenshots, if you wipe between them, you see that Read 2 and Read 3 are identical. Read 1 is darker. I’ve tried different utility IDTs, and when you lower the gamma (2.2 and 1.8) some things get closer, but the saturation is still off. This is the tone mapping I believe has been baked in.

The reason that matters is that without being able to construct this null test, the rest of the comp test is biased: we can’t tell which part of the difference comes from the comp’s colorspace problems and which from biases already present in the inputs.


As a next step, I would like to replace the Rec709 image from your test with one we exported from Nuke with the ODT 2.4, so we have a proper null test. Then from there we can go into the comp test and see how mixed-colorspace assets fare.

As a variation: if you apply the reverse ODT to Read 1, it matches Read 2 & 3 properly. Which I guess goes back to the original question of the post. On this test image it works without artifacts.

The conclusion from this, though, is that the Rec709 2.4 output from an ACES color pipeline is not the same as a Rec709 2.4 texture you receive from a graphic designer or another source, which seems somewhat flawed in principle. In the old days, if someone gave you a Rec709 asset, that is exactly what you would see on the screen, not a variation of it.

That may be the question to pose to the ACES experts.

Nuke script: Dropbox - read_test_v2.nk - Simplify your life


I’m finding some more details that could explain the darker look of the Rec709 100nit dim ODT that was applied to the test image.

Looking through the CTL code, there are the expected transforms to the Rec709 primaries and the 2.4 gamma, which would be part of any color space transform.

However, there are additional changes noted as ‘apply gamma adjustment for dim surround’ as well as ‘apply desaturation to compensate for luminance difference’.

Thus the ODT that was used to generate the Rec709 image isn’t a pure primaries/gamma transfer, as one might have expected. And that is specific to the ACES environment and its math.

Rec709 ODT CTL script
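To get a feel for how much those two steps move the image, here’s a back-of-the-envelope numpy sketch of just the surround and desaturation stages, with everything else (tonescale, primaries, 2.4 encode) omitted. The constants and AP1 luma weights are copied from memory of the ACES 1.x CTL, so double-check them against the file above; the real CTL also does the surround step in xyY rather than this simplified RGB rescale:

```python
import numpy as np

DIM_SURROUND_GAMMA = 0.9811  # 'apply gamma adjustment for dim surround'
ODT_SAT_FACTOR = 0.93        # 'apply desaturation to compensate for luminance difference'
AP1_LUMA = np.array([0.2722287, 0.6740818, 0.0536895])  # AP1 Y weights (check vs. CTL)

def extra_odt_steps(rgb):
    rgb = np.asarray(rgb, dtype=float)
    Y = float(rgb @ AP1_LUMA)
    if Y > 0:
        rgb = rgb * (Y ** DIM_SURROUND_GAMMA / Y)  # rescale so luminance -> Y**gamma
    Y = float(rgb @ AP1_LUMA)
    return ODT_SAT_FACTOR * rgb + (1.0 - ODT_SAT_FACTOR) * Y  # lerp toward luma

print(extra_odt_steps([0.9, 0.1, 0.1]))  # a small luminance shift plus visible desaturation
```

Neither step is part of a plain primaries + gamma 2.4 conversion, which is why a ‘Rec709 texture’ read of this file can never null against the ACEScg original.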

The conclusion from that is that we should update the test case and omit the AMPAS Rec709 file, and instead create a straight Rec709 transcode by other means, which should pass the null test. Then we can return to the question at hand, which was comping elements from various color spaces.

This ODT as written is quite possibly appropriate as a display transform for specific viewing environments. But when applied to material that goes through the whole image pipeline again, it creates unexpected results. It truly is an end-of-chain EOTF and should only be used as such.

@allklier, I couldn’t focus on this during the weekend, but I’ll try to DM you this week and maybe we can find some time to chat. Thanks for looking into this and for your input.

Andy! @imag4media, I showed your comment to my teenage kids, thinking it might help … they said that I must have rigged the whole thing and paid for positive comments … :wink:
Cheers Bud!

Edit: That’s a sweet cheering up thing to say, or to read, thanks Andy @imag4media , hope you’re well dude.


@Stefan Thanks, I look forward to chatting.

Given yesterday’s detour over the input image, this is where I landed. The answer: it works perfectly well in Nuke, and Flame has a whole ton of trip wires in its color management, but if you look in the right places it does work there too. It just requires more attention. Screenshots, Nuke script, and Flame batch attached below.

The three things in Flame:

  1. If you use Action, you have to make sure the node prefs for Action are set properly. There’s a color management tab which defaults to ‘auto convert’ / ‘same as background’, so Action will run in the color space of the background input, which in the original batch was often Rec709. You’re better off choosing ‘user defined’ and then setting the color space to ‘ACEScg’ to match your working color space.

  2. If you use a comp node, there is no color space management. So you have to make sure that both inputs have already been converted to the same color space, or things get really ugly (see the sketch after this list). This gets confusing because of the ‘viewing rules’ in Flame, which make everything look right on the screen even if the patient is very ill. If you use a comp node with divergent colorspaces, your end result will be garbage. (This is where Nuke’s approach helps, because it prevents this from happening in the first place, or at least makes it a lot more obvious.)

  3. The inverted display transform does not seem to be working properly, or at least not as you would expect. By all accounts it appears to be doing a double transform. And again, because of the viewing rules, it’s actually really hard to see what it does. There were also weird things in terms of what the rules-based transform resolves to. Best to override ‘from rules’ and be specific, for both the display and the transform. I had an unexpected D60 SDR in there in one place.
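Here’s the sketch promised in item 2: what a naive over does when the inputs don’t share an encoding, in numpy. The gamma-only decode is a simplification (no primaries conversion or inverse ODT), just to show the scale of the error:

```python
import numpy as np

def over(fg_premult, fg_alpha, bg):
    """Plain premultiplied over -- only valid if fg and bg share one encoding."""
    return fg_premult + (1.0 - fg_alpha) * bg

bg_acescg = np.array([0.18, 0.18, 0.18])  # linear working space
fg_rec709 = np.array([0.50, 0.50, 0.50])  # gamma 2.4-encoded grey element
alpha = 0.5

wrong = over(fg_rec709 * alpha, alpha, bg_acescg)  # mixed encodings
fg_linear = fg_rec709 ** 2.4                       # decode first (simplified)
right = over(fg_linear * alpha, alpha, bg_acescg)

print(wrong, right)  # ~[0.34 ...] vs ~[0.18 ...] -- a very visible difference
```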

The upshot of all of this: working in ACEScg without significantly changing the look of the image is possible, but you need to do all the color management yourself. As long as you do that right at the input, and let the rest flow from there, it’s not too onerous.

Which gets me back to: Flame color management in the age of ACES is unnecessarily complicated and misleading. I get the idea of ‘don’t you dare touch my pixels’, but if it backfires, it’s of little use. It also makes it very hard to debug, because it’s not WYSIWYG unless you disable all the viewing rules.

It’s unclear how well ‘auto convert’ in Action works. But even in the best case it obscures what is happening, and if it does work, it will depend on the color management setting.

Updated Nuke side-by-side. This is with a new Rec709 still life shot, which I created via a manual CST in Nuke from the ACEScg still life file, to bypass the ODT_dim shenanigans.

The wipe in the bottom right shows the null test. There’s still a very slight difference between Rec709 and ACEScg. It seems to have lost some of the specularity.

Flame result from the updated batch.

Top left: Action set to run in Rec709; color bars and still life untouched (using the Rec709 still life file); ARRI footage converted to Rec709. The final result is then converted to ACEScg to work in the overall comp.

Bottom left: inverted display transforms. Apparent double conversions or some other weirdness. Discounting as unnecessary.

Bottom right: still life from the ACEScg test file (native, no CST); ARRI footage CST from LogC4 to ACEScg; color bars CST from Rec709 to ACEScg; all into one Action configured as ‘user defined’ / ‘ACEScg’. Then no more conversions for the side-by-side, since it’s already ACEScg.

This is the money shot, because it shows that you can stay all Rec709 or all ACEScg and comp images without dramatically distorting things. No need for inverted ODTs. The only difference may be some loss of specularity when you take high-DR camera footage through a low-DR color space like Rec709.

If you work with mixed color space material in Flame, the best workflow in my mind would be this (which kind of mimics what Nuke does with the input transform dropdown in the read node, except you have to do it yourself):

  1. Set an appropriate color space preset and working space (ACES 1.1 / ACEScg is a good default)
  2. Make sure all media are properly tagged in terms of color space (defaults are good, but checking is warranted)
  3. Any input that isn’t already in your working color space should be followed by a CST node set to ‘input transform’, ‘from source’, ‘ACEScg’ (or whatever your working space is)

This will convert all materials to your working color space, and then you don’t have to worry about Action preferences, comp nodes going haywire, etc.
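Nuke enforces step 3 for you through the read node, but step 2 is still on the artist in either app. On the Nuke side you can at least script a quick audit of how everything is tagged; a minimal sketch:

```python
import nuke

# List how every Read is tagged, so a mis-tagged clip stands out
# before it pollutes the comp (step 2 of the list above).
for read in nuke.allNodes("Read"):
    tag = "raw" if read["raw"].value() else read["colorspace"].value()
    print(f"{read.name():<12} {nuke.filename(read)}  ->  {tag}")
```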

From there, your viewing rules will fit your monitor setup, and you can export ACEScg if asked for.

Here’s a zip file with the new still life image, the Nuke script, and the Flame batch: Dropbox - Materials.zip - Simplify your life

I would welcome anyone double checking this to make sure I didn’t make any mistakes. I did go over it a few times, but anything is possible.

PS: I disabled the blur nodes, as it’s easier to compare, but I understand why you wanted them.
