Timeline color management bit depths

What dictates which bit depths are accepted? Seems SUPER weird, imho.

ACEScg to different log spaces gives me different output bit depths for the transforms in a 32-bit timeline:

Apple Log → 32-bit float (good)
ARRI LogC3 → 10-bit integer (very bad)
LogC4 → 16-bit float (bad)
RED Log3G10 → 10-bit integer (very bad)
S-Log3 / S-Gamut3.Cine Venice → 10-bit integer
ACEScct → 16-bit float (bad)
ADX16 → 16-bit integer (good)

What kind of crazy nonsense is that? Why are these log formats encoded in… float? That's just wrong. @doug-walker could maybe explain why?

16-bit float log is not acceptable quality, and 10-bit integer is just as bad.
Are the only usable ones Apple Log and ADX16?
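
(Quick back-of-the-envelope on the precision involved, nothing Flame-specific: half float only has a 10-bit mantissa, so for a 0-1 log signal its step size near the top end is roughly the same as 10-bit integer, while 16-bit integer is far finer.)

```python
# Rough code-value spacing for a 0-1 log signal in each container.
# Half float has a 10-bit mantissa, so its spacing near 1.0 is ~2^-10.
import numpy as np

print("10-bit integer step        :", 1 / 1023)                            # ~9.8e-4
print("16-bit integer step        :", 1 / 65535)                           # ~1.5e-5
print("16-bit float step near 1.0 :", float(np.spacing(np.float16(1.0))))  # ~9.8e-4
```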

Seems like the CTF file to convert ACES to ACEScct is hardcoded to 16-bit float… crazy.

I wonder if that is just a function of how this description is assembled.

I looked at the SynColor files. I'm not sure it's exactly this one, so if you can find the one that matches the screenshot, we can find out more.

/Applications/Autodesk/Synergy/SynColor/transforms/camera/Arri/Alexa-v3-LogC-EI800_no_cam-black-to-ACES.ctf

There are descriptions in there, but note that your example inverted a transform, so I wonder whether the description got flipped.

If you look at the contents, it has a 1D LUT for the gamma transform and a 3x3 matrix for the primaries. The matrix identifies as 16f → 16f, and the LUT indicates 10i → 16f, which in that direction is correct. If you invert it, the question is whether it would actually truncate the data to 10i because it thinks it's a camera file, or whether that is just a labeling problem.
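
If you want to check what each node declares without eyeballing the XML, something like this dumps the depths (purely a sketch, assuming the CTF uses CLF-style inBitDepth/outBitDepth attributes on its process nodes, and that the path quoted above is the right file):

```python
# Sketch: list the declared in/out bit depths of each process node in a CTF.
# Assumes CLF-style inBitDepth/outBitDepth attributes on nodes like <LUT1D>
# and <Matrix>; check against your actual file, element names may differ.
import xml.etree.ElementTree as ET

ctf_path = ("/Applications/Autodesk/Synergy/SynColor/transforms/camera/"
            "Arri/Alexa-v3-LogC-EI800_no_cam-black-to-ACES.ctf")

root = ET.parse(ctf_path).getroot()
for node in root.iter():
    in_bd, out_bd = node.get("inBitDepth"), node.get("outBitDepth")
    if in_bd or out_bd:
        tag = node.tag.split("}")[-1]  # drop any XML namespace prefix
        print(f"{tag}: {in_bd} -> {out_bd}")
```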

Also, does the transform actually prescribe the result bit depth? Note that in the CTL in the timeline, the dropdown is grayed out but set to 32f. So it could well be that on a 32f timeline, regardless of the transform details, the actual operation happens at 32f.
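
Another way to sanity-check it outside of Flame: OCIO v2 can read CTF files and always evaluates them in float, so applying the same file to a smooth float ramp shows whether anything actually collapses to 10-bit. The path and the inverse direction (ACES → LogC) are assumptions based on the file above; treat this as a sketch, not a statement about what SynColor does internally:

```python
# Sketch: apply the same SynColor CTF in 32-bit float via OCIO v2
# (PyOpenColorIO) to see whether the 10i/16f depths declared in the file
# actually quantize anything, or are just labels.
import PyOpenColorIO as ocio

ctf_path = ("/Applications/Autodesk/Synergy/SynColor/transforms/camera/"
            "Arri/Alexa-v3-LogC-EI800_no_cam-black-to-ACES.ctf")

config = ocio.Config.CreateRaw()
ft = ocio.FileTransform(src=ctf_path,
                        direction=ocio.TransformDirection.TRANSFORM_DIR_INVERSE)
cpu = config.getProcessor(ft).getDefaultCPUProcessor()

# Feed a smooth float ramp through; if anything truncated to 10-bit integer,
# the output would collapse to roughly 1024 distinct values.
samples = [i / 8191.0 for i in range(8192)]
outputs = {cpu.applyRGB([v, v, v])[0] for v in samples}
print("distinct output values:", len(outputs))
```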

If you do the same transform in the format section of the clip, rather than adding a ColorMgmt TL-FX, that dropdown stays active, so you can select your processing bit depth.

Or replicate it in Batch and turn on the display of bit depths; that way you can see what the transforms do. With that in mind, then check how that translates to the timeline.

I would compare the results and see if there’s a noticeable quality difference. I suspect not. That seems more like a UI problem than a data problem. But I could be too optimistic here.

I will keep trying to find out; super odd, imho.

Looks like the matrix is hardcoded to 16-bit float in/out in the CTF file.

I have a workaround though: I just made ACEScg-to-Log* DRX files with Resolve CST nodes and apply them using the Resolve OFX :smiley:

That is one of the many weird and obscure things in SynColor colour management: unclear nomenclature, unintelligible info, and the certainty of not knowing 100% of what you are doing.


It seems that these numbers actually mean nothing.

Here is a 32-bit Batch with chained ACEScg → LogC3 → ACEScg → LogC3 and so on.

Both 2D histograms show the same result, so it seems completely transparent.

Flipping the nodes to 16-bit or lower shows banding.
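
For what it's worth, here is a quick NumPy version of the same round trip, assuming the published ARRI LogC3 (EI 800) curve constants (worth double-checking against ARRI's white paper). Quantizing the log encoding to 10-bit collapses it to about 1024 code values, which is where the banding comes from, while 32-bit float round-trips essentially losslessly:

```python
# Round-trip test: encode linear values to LogC3, quantize the log encoding
# at different bit depths, decode back, and compare against the original.
# Curve constants are the published ARRI LogC3 (EI 800) parameters.
import numpy as np

CUT, A, B, C, D, E, F = 0.010591, 5.555556, 0.052272, 0.247190, 0.385537, 5.367655, 0.092809

def logc3_encode(x):
    return np.where(x > CUT, C * np.log10(A * x + B) + D, E * x + F)

def logc3_decode(t):
    return np.where(t > E * CUT + F, (np.power(10.0, (t - D) / C) - B) / A, (t - F) / E)

lin = np.linspace(0.0, 1.0, 200000, dtype=np.float64)
log = logc3_encode(lin)

for label, quantized in [
    ("10-bit int", np.round(log * 1023.0) / 1023.0),
    ("16-bit int", np.round(log * 65535.0) / 65535.0),
    ("16-bit float", log.astype(np.float16).astype(np.float64)),
    ("32-bit float", log.astype(np.float32).astype(np.float64)),
]:
    back = logc3_decode(quantized)
    print(f"{label:13s} unique codes: {len(np.unique(quantized)):6d}  "
          f"max round-trip error: {np.max(np.abs(back - lin)):.2e}")
```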

Which leads me back to this thread about timeline bit depths. I guess I will from now on set my timelines to 32-bit?
