Comping linear (I think, maybe rec709) screen insert into Alexa Log TV footage producing heavy banding

Hi guys,
Embarrassingly, I really don’t know a lot about colour management. I need to learn, but I don’t know where to start. Anyway - sorry, I know this is probably hilariously simple, but…

I’m comping some graphics supplied by a designer into a TV shot on an Alexa in Log. Is it LogC? I don’t even know. I assume so. What does the C even mean? :man_facepalming:
I’ve done the comp, converted the graphic using the “Lin to Log” preset, and exported back into Avid. I applied the Alexa LogC to Rec709 LUT and the colours look right, but the graphic insert is bandy as hell. Did I miss a step?

Thanks in advance :slight_smile:

Nah. Don’t use Lin to Log.

Sign up for a free trial and watch this…

It’s 8 minutes.

Then, watch this. It’s 9 minutes.

1 Like

Not hilariously simple at all @Unclextacy

Like all good things it has a steep(ish) learning curve. Not insurmountable, and totally worth the climb.

We were dealing with linear the wrong way for a long time and it can be hard to shake. What turned it for me was an article by @toodee about the difference between display-referred and scene-referred linear.

Display Referred vs. Scene Referred - Brylka - TooDee - PanicPost

Blew my mind once I got it :exploding_head:

Are you simply removing the gamma curve (linear) or are you dealing with the tone mapping (s-curve) that packs in all of those high values which go beyond 1?

1 Like

But to your specific problem @Unclextacy ;

Did you finish the comp in Log C ?

*I don’t know what the C stands for either :smile: Hopefully someone will answer this for us :+1:*

Your graphic for the screen comp is probably sRGB/Rec709 :thinking:

Do you need to deliver back Log C? You probably just need to make your GFX appear as close to LogC footage as possible and make it look right in your comp when using your viewing LUT.

I would cheat and run my GFX through a ColourMgmt node and apply an inverted ViewTransform to make it LogC.

LogC generally means it hasn’t been graded yet, so it will be very hard to make it colour accurate without knowing how the grade is going to go. Let me know if this still creates banding :+1:
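
If it helps to see what that cheat is actually doing, here’s a rough Python sketch (not Flame code, and the function names are just for illustration): decode the graphic’s display gamma back to linear, then encode with ARRI’s published LogC3 (EI 800) curve. A real inverted view transform would also undo tone mapping, which this skips, so treat it as an illustration only.

```python
import math

# ARRI LogC3 (EI 800) constants, as published by ARRI. Illustrative only.
CUT, A, B, C, D, E, F = 0.010591, 5.555556, 0.052272, 0.247190, 0.385537, 5.367655, 0.092809

def linear_to_logc3(x):
    """Scene linear -> LogC3 code value."""
    return C * math.log10(A * x + B) + D if x > CUT else E * x + F

def srgb_to_linear(v):
    """sRGB display value -> linear (close enough for Rec709 graphics too)."""
    return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

def gfx_to_logc(v):
    # Display value -> linear -> LogC. No inverse tone mapping, so bright
    # graphic whites won't roll off the way real camera highlights do.
    return linear_to_logc3(srgb_to_linear(v))

print(round(gfx_to_logc(0.5), 3))   # a 50% grey graphic pixel lands around LogC 0.41
```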

1 Like

“Images encoded with Log C (C is for Cineon; the original Cineon log encoding is based on the density of color film negative) can be identified by their flat and desaturated nature. Whites and blacks are not extended to their maximum values.”

Thanks Arri - Log C

1 Like

Also, what is the bit depth of the supplied graphics? I’m assuming it’s 16-bit, but if it’s 8-bit, that will give you banding.

4 Likes

My thoughts too. It certainly wouldn’t do you any favours if the graphics are supplied at 8-bit, especially if you are going to colour transform them into LogC or linear.
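
To put rough numbers on it: one quantisation step at 8-bit is already about 0.4% of the range, so any part of the chain that locally steepens the curve turns neighbouring source codes into visible jumps. A tiny Python sanity check, assuming a roughly 3x local contrast expansion (the exact factor depends on the conversions, viewing LUT and grade involved):

```python
# Assumed ~3x local contrast expansion somewhere in the chain
# (conversion + viewing LUT + grade); purely illustrative.
GAIN = 3.0

for bits in (8, 16):
    step = 1 / (2 ** bits - 1)      # smallest difference the file can hold
    jump = step * GAIN * 255        # how big that step looks in 8-bit display codes
    print(f"{bits}-bit source: {jump:.3f} display codes per source step")

# 8-bit source:  ~3.000 display codes per step -> visible band
# 16-bit source: ~0.012 display codes per step -> invisible
```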

The C in Log C stands for “cool”.

5 Likes

Ah yes this would explain a lot, cheers!

You already got a lot of good advice here.

Some additional considerations:

Color management, when done correctly, is an exact science.

When you cook a dinner, you can wing the ingredients, and with some minimal experience it will come out alright, as it’s mostly about combining flavors. If, on the other hand, you bake a cake or bread, you must measure all ingredients precisely, as you’re undertaking a chemical reaction which relies on precise ratios. If you wing it, it will likely look wrong or even be inedible.

Color grading is cooking. Color management is baking.

Color management is the process of correctly translating the image from one encoding to another with either no loss of information or perceived appearance, or with calculated degradations due to limitations of the medium.

Where this relates to your description: Log to Lin is not necessarily the same thing in all cases. There is only one linear, but there are many possible log curves, and it has to be the right one. In fact, in many cases the log curve is the precise inverse of an encoding that was applied earlier in the pipeline (usually in the camera).
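
A quick way to see the “many log curves” point is to encode the same scene-linear mid grey (0.18) with two different published curves; the numbers come out nowhere near each other, so a generic “Lin to Log” is only correct if it happens to be the right log. A hedged Python sketch using ARRI’s LogC3 (EI 800) constants and the classic Cineon parameters:

```python
import math

def linear_to_logc3(x):
    """ARRI LogC3, EI 800 (constants from ARRI's published docs)."""
    cut, a, b, c, d, e, f = 0.010591, 5.555556, 0.052272, 0.247190, 0.385537, 5.367655, 0.092809
    return c * math.log10(a * x + b) + d if x > cut else e * x + f

def linear_to_cineon(x):
    """Classic Kodak Cineon log (ref black 95, ref white 685 on the 10-bit scale)."""
    black = 10 ** ((95 - 685) / 300.0)    # ~0.0108
    return (685 + 300 * math.log10(x * (1 - black) + black)) / 1023

print(round(linear_to_logc3(0.18), 3))   # ~0.391
print(round(linear_to_cineon(0.18), 3))  # ~0.457 -- same linear value, very different code
```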

That link about scene referred vs. display referred was a good starting point. But it’s also a very complex topic, and people still disagree about some of the aspects of it.

One way to think about it is that in the past, most material we worked with in an edit was encoded the same way (mostly Rec709), so it was easy to comp and mix and match without side effects. And our displays were mostly Rec709, so once we had a result it looked right without further consideration.

With today’s wide range of advanced cameras, many different display standards and distribution channels, our pipelines are a lot more complex, and almost invariably we end up with materials that don’t match (i.e. aren’t encoded the same way) when we receive them. So we have to take extra steps so that our comps take that into account and apply the appropriate conversions, so that the resulting frame is made up of a single color encoding in which each pixel uses the same formula to display it. Think of it as each input having to make its contribution to arrive at a common answer. And this needs to be done before the comp operators that combine the pixels.
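
As a toy, per-pixel illustration of that idea (plain Python, not any Flame API), assume a LogC plate and an sRGB graphic: both inputs are converted into one working encoding before the merge, and a single display transform is applied at the end. The gamma-2.4 “view” here is a crude stand-in for a real LogC-to-Rec709 viewing LUT:

```python
import math

def logc3_to_linear(t):
    """ARRI LogC3 (EI 800) decode; constants from ARRI's published docs."""
    cut, a, b, c, d, e, f = 0.010591, 5.555556, 0.052272, 0.247190, 0.385537, 5.367655, 0.092809
    return (10 ** ((t - d) / c) - b) / a if t > e * cut + f else (t - f) / e

def srgb_to_linear(v):
    """sRGB display value -> linear."""
    return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

def linear_to_display(x):
    """Crude gamma-2.4 stand-in for a viewing LUT."""
    return min(max(x, 0.0), 1.0) ** (1 / 2.4)

def comp_pixel(plate_logc, gfx_srgb, gfx_alpha):
    plate = logc3_to_linear(plate_logc)                  # plate -> common working space
    gfx = srgb_to_linear(gfx_srgb)                       # graphic -> common working space
    merged = gfx * gfx_alpha + plate * (1 - gfx_alpha)   # comp ops see one encoding
    return linear_to_display(merged)                     # one output transform at the end

print(round(comp_pixel(0.391, 0.46, 0.5), 3))
```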

Scene referred doesn’t have to be ACES, though that is the most popular. Scene referred only means that you work the scene (or your comp) in a specific color space that everything you receive is normalized to while you work on it, and then at the end you translate it once (or multiple times) for the display it will go to.

It’s best to have the scene referred color space be equal to or bigger than the best of the image components, to avoid losing quality in your master. The idea is that you work on a future-proof master, and then shed quality only at the end of the pipeline due to the constraints of specific displays. In the future you can adapt that same master to newer, more capable displays without having to redo anything.

It’s worth watching a few tutorials and reading materials on color management. But it takes time to digest and get the hang of it. Randy already linked the Flame Academy classes. If you have fxphd, Charles Poynton’s color theory class is great, and there are other Flame-specific courses that cover the topic as well.

In the meantime, if it doesn’t look right, something has likely been mis-translated. Chase each image component’s journey and see where things may have gone off the rails.

And understand how the tools work. Flame color management is definitely quirky to say the least.

If you get banding, it is most likely a case where an image with too little detail is getting stretched color-wise. An 8-bit image or 8-bit sequence settings are commonly the issue.

Technically, banding happens when color values that were originally just 1 value apart in the camera file / GFX render suddenly get stretched to being 3 or more values apart, causing visible jumps in the color/exposure. Banding can happen with any material, but it is more likely with 8-bit processing, since the values are coarser to begin with, so there is less room for error.
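
For a back-of-envelope version of that stretch, take two LogC values a single 8-bit code apart and push them through a LogC-to-linear decode plus a plain gamma-2.4 “view” (again, a crude stand-in for a LogC-to-Rec709 LUT); in the midtones they land roughly three display codes apart, which is exactly the jump you see as a band:

```python
import math

def logc3_to_linear(t):
    """ARRI LogC3 (EI 800) decode; constants from ARRI's published docs."""
    cut, a, b, c, d, e, f = 0.010591, 5.555556, 0.052272, 0.247190, 0.385537, 5.367655, 0.092809
    return (10 ** ((t - d) / c) - b) / a if t > e * cut + f else (t - f) / e

def to_display_code(logc):
    lin = logc3_to_linear(logc)
    return round(min(max(lin, 0.0), 1.0) ** (1 / 2.4) * 255)   # 8-bit display code

print(to_display_code(128 / 255))   # ~195
print(to_display_code(129 / 255))   # ~198 -> one input step became a 3-code jump
```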

That said, there is a case to be made that the 8-bit GFX may be better off just being composited as is, rather than being put through scene-referred transforms, which may stress it more than it has leeway for.

That applies, for example, in Flame sequences where you may work in ACES on tracks 1…3, have a color mgmt layer on track 4, and then place the Rec709 / display-referred GFX above it. Of course that will then limit you to Rec709 deliverables, but it may be the best answer in the given circumstances, also considering the nature of the job and its half-life.

Over time, as you get a better handle on color management, it becomes easier to make these tradeoffs.

6 Likes

Thanks so much for all your suggestions on this. Epic replies! I’m coming back to this job in a few days and I’m gonna really get down and study so it’s not such a gosh-darn mystery. Appreciate it!