Matchbox request: Colour Balance from Macbeth


Can somebody with the relevant skillz create a matchbox that balances colours based on a macbeth chart reference please, which can then be applied to other clips from the same scene?

Or point me to the one that already exists.

I’ve been looking at Crok New Balance and White Balance but whilst giving a good white balance they give fairly spurious results across the other colours. Also Crok New Balance appears to resample every frame from what I can tell.



idk if it helps you, but I've learned that colour charts aren't to be trusted :slightly_smiling_face:

I also tried to find a solution for mmColorTarget-type stuff in Flame, but nowadays I just don't bother with it anymore


:joy: What a silly article.

I am surprised there are no colour neutralization tools in Flame…

In Nuke we had to install a plugin to do so, and also NumPy, which is a huge maths library…

A clear gap in the software… Autodesk should put this in place because it is a pain otherwise to do any serious CG.


Color neutralization in its simple form exists: the Matchbox 'White Balance'. Simply set the color sampler and it does it.

[Screenshot: the 'White Balance' matchbox]

In more general terms:

Are you looking primarily for white balance and exposure, or also saturation and detailed color matching for certain chips (e.g. red, skin, etc.)?

The MacBeth chart was omnipresent in stills photography, where I would use it more frequently. But then stills photography is orders of magnitude simpler in terms of color management. I've had very mixed results with color charts in video/film work, and I generally don't use them at all (the simplest reason is that they get chopped off by editorial way too often before material gets to me, but even when they survive I rarely reach for them).

The old Colour Warper has a match function which generates a custom set of curves; I've tried it out but don't use it regularly.

Before we go down the Macbeth road, it would be interesting to see if using MacBeth charts would actually solve the problem. To test, install the free version of Resolve, load the clip, use the color checker tool in the color page to perform a match and then export a LUT from that match. Use that LUT in Flame and see if it produces consistent and usable results. I think that would be a good proof point that we need a Macbeth matchbox/OFX. I have my doubts.

Sample from Resolve:

If you do this Resolve test, be mindful to use the same color management pipeline (e.g. ACES). The generated LUT, while a relative adjustment, may be different based on color pipeline.

That being said, the best way to match colors is in the Image node along with a good scope, primarily the vector scope. A reference monitor helps, though when you’re matching colors, any monitor will do as you’re comparing two colors on the same display.

For neutralization alone, watch the center of the vector scope; you can immediately see the bias and offset. The more advanced version is using an H-M-L vector scope, so you see the center for highlights, mids, and shadows separately. That makes it easy to dial in shadow neutrals (those reddish Canon cameras) or remove a slight tint from white highlights.
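
To make that concrete, here's a minimal pure-Python sketch of the idea (the patch values are invented for illustration): derive per-channel gains from one sampled neutral so it lands on equal R=G=B, which is what centering the trace on the vector scope achieves.

```python
# Sketch: neutralize from one sampled neutral patch (values invented).
# Per-channel gains map the sampled patch to equal R=G=B while keeping
# its average level, i.e. what centering the vector scope trace does.

def neutral_gains(sample):
    """Gains that map a sampled neutral to pure grey of the same level."""
    r, g, b = sample
    target = (r + g + b) / 3.0   # preserve overall brightness of the patch
    return (target / r, target / g, target / b)

def apply_gains(pixel, gains):
    return tuple(c * k for c, k in zip(pixel, gains))

patch = (0.85, 0.80, 0.78)           # a slightly warm 'white'
gains = neutral_gains(patch)
balanced = apply_gains(patch, gains)  # lands on equal R=G=B
```

The same gains would then be applied to every pixel in the shot; anything beyond this one-patch neutralization (per-range H-M-L correction) needs separate samples per tonal range.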

Example of overall + HML vector scope in Omniscope:

Yesterday I had to do some color matching on beauty brand imagery. Client gave me one Toyo and one PMS color code. Made a color target in Illustrator from the libraries, imported into Flame, added Image node and used combination of primary and curves while watching vector scope to dial it in. Client happy.

Try to stay away from hex codes someone gives you. Hex codes are color space dependent. Unless you enter them in the context of the same color space used by the person who gave them to you, results will be unpredictable. A hex code is simply the encoding of a color relative to a color space, not an absolute color value.
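
A small sketch of why that matters, with an arbitrary hex value and two assumed transfer curves (the sRGB piecewise curve vs. plain gamma 2.2): the same hex triplet decodes to different linear-light values depending on which space you assume it was authored in.

```python
# Sketch: a hex code is not an absolute colour. The same triplet decodes
# to different linear-light values under different assumed transfer curves.

def srgb_to_linear(v):
    # IEC 61966-2-1 piecewise sRGB decoding
    return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

def gamma22_to_linear(v):
    return v ** 2.2

def hex_to_channels(code):
    code = code.lstrip('#')
    return [int(code[i:i + 2], 16) / 255.0 for i in (0, 2, 4)]

hex_code = "#B35A3C"                 # arbitrary example value
channels = hex_to_channels(hex_code)
as_srgb = [srgb_to_linear(c) for c in channels]
as_gamma22 = [gamma22_to_linear(c) for c in channels]
# The two interpretations disagree, so a bare hex code is ambiguous
```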

And if you do rely on scopes, it can be a good investment to get a license of OmniScope, which is much more comprehensive than Flame's internal scopes (though they are quite good). The main thing you sometimes need for color matching is to isolate a certain area of the image, which you can do in OmniScope by holding Alt and drawing a mask.

Here's how I have OmniScope configured lately; it runs on a separate monitor next to my Linux Flame for color-critical work. Of course not everyone needs to go that far when you just want to neutralize that shot you received from the client so you can comp it into the batch. For that, existing Flame tools like 'White Balance' and some basic adjustments in Image should be sufficient.


Thanks for your interesting points on colour balancing and scopes.

It's probably worth pointing out that if Resolve has a colour checker node and Nuke has a colour checker node, then Blackmagic & The Foundry seem to think these nodes are useful.

Exporting a LUT from Resolve based on a colour checker in one shot and then using it across multiple shots will not give consistent results as the point of the colour checker is to use it on each shot separately to bring them together. For this to work, you need to run the Resolve colour checker on each macbeth shot and export a LUT for each individual shot.

It’s a process that works to give an initial balance to each shot. Being able to do it inside of Flame without resorting to Resolve or Nuke would be really useful. It’s not a magic bullet and would be used as a tool in a pipe along with the other methods you’ve listed.

You shouldn’t need it on each shot. You should only need it once per setup. Unless lighting conditions or camera settings are changed, the color balance of the scene relative to the camera capture should remain constant. The color balance is mostly a function of light and camera sensor, not object being captured.

Case in point - on yesterday’s job I balanced two setups (one product & one people shot) and then forward copied these settings through the rest of the 8min sequence with some 50+ shots of different people with different skin tones. But it was a singular studio setup. Camera and lighting didn’t change.

Once you move between scenes, the color checker could give you a theoretical starting point, but in my experience you will need human intervention to make these setups/scenes cohesive to the viewer. And having a black box correction at the beginning of your chain which you then fight downstream is bothersome. I'd rather have an Image node where I can see exactly what was done, and where I can tweak it. I've seen color checker application take an image in a weird direction, mostly because not enough time was spent exposing the color checker on site in a consistent fashion (lighting wise). Just having a color checker in the shot somewhere is not enough.

It's true that Resolve has had this feature for as long as I can remember. Not sure if Nuke has a built-in version, or just a gizmo from the community. The question is, how many people are actually using it? I rarely see it come up or talked about. Maybe a false sense of usefulness?

Kind of like every producer preferring an Arri or RED over a Panasonic camera, even if all of them would do just fine. Nobody ever got blamed/fired if an Arri or RED was used. But God help you if you shot Panasonic or Sony and something goes sideways, even if it's not camera related at all… :slight_smile:

I feel like some camera crews religiously film color checkers, which editorial immediately strips out never to be seen again (for valid workflow concerns). By the time the colorist gets the footage it's no longer there, or has to be very tediously chased down.

There's an interesting effort by Pawel Achtel's camera to capture and transport metadata in the image which would allow color balance to be done in post without a chart, based on color balance metadata measurements in the camera. Ultimately that would be a better workflow, but he's fighting an uphill battle. He will tell you endlessly about it :slight_smile:

Use case:

Motion control shoot. Exterior. Day time.

Plate 1: 10am, light cloud in the sky.
Plate 2: 12pm, still overcast but brighter clouds.
Plate 3: 2pm, sunny, but set and talent are in shadow.
Plate 4: 1am, studio. Totally different location and the DP's best attempt at matching the lighting.

(basically the light and colour has changed across each plate)

Each plate has been shot with a macbeth. The editor has not deleted it from the drive. I have the 4 plates and 4 macbeths. The macbeths have been shot correctly.

Post process: Analyse the 4 macbeths and apply the 4 balance results to each relevant plate.

After that, look at the results and manually adjust accordingly.
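
One way to sketch that per-plate step (all chip values below are hypothetical): fit a per-channel gain and offset from each plate's grey-ramp chips against the chart's published reference values, then apply that correction to the whole plate. A full Macbeth match would fit a 3x3 matrix or curves across all 24 chips; the grey-ramp fit just shows the principle.

```python
# Sketch: per-plate balance from a chart's grey ramp (one channel shown).
# Fit y = gain*x + offset by least squares, then apply to the whole plate.

def fit_gain_offset(measured, reference):
    """Closed-form least-squares line fit for one channel."""
    n = len(measured)
    mx = sum(measured) / n
    my = sum(reference) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(measured, reference))
    var = sum((x - mx) ** 2 for x in measured)
    gain = cov / var
    return gain, my - gain * mx

# Grey-ramp chips (black -> white) sampled from one plate, red channel,
# vs the chart's reference values (both invented for illustration)
measured_r = [0.04, 0.12, 0.25, 0.42, 0.63, 0.92]
reference_r = [0.03, 0.09, 0.20, 0.36, 0.59, 0.90]

gain, offset = fit_gain_offset(measured_r, reference_r)
corrected = [gain * x + offset for x in measured_r]
```

Repeat per channel and per plate; the four resulting corrections are then the "4 balance results" applied to their respective plates before any manual adjustment.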

I was with you til Plate 3. It’s always bright, direct sunlight with hard shadows.

see note: set and talent are in shadow - not direct sunlight

…or are you joking?

Half joking?

Wearing my colorist hat, I've dealt with situations like you described a few times a month for the last decade. I used to call it the Vancouver Tax Credit Special. Exteriors all day. In and out of clouds in the morning, rainy at lunch, then sunny in the afternoon. Intercut without a care, and it's my job to make it all feel cohesive. In my experience Plate 3 never happens, but I get that anecdotes aren't data.

I'm with Jan on this one. If you have to adjust by hand then what's the point of the Macbeths? As a vfx supe, I'd have to pick a huge fight with most productions to get them shot. What happens when they shoot one at the beginning of a series of takes in the sun and then clouds move in? You need them on every take to really be safe, and at that point the 1AD has exiled you to the trunk of a production van. If I have to spend credibility like that, I'd rather have lens grids.

I HAVE managed to get Macbeths (well, X-Rite Color Checkers) recently on shots that will have CG in them. I then shot the chart on my pano rig and used it to line up the HDRs with the Alexa. I guess I wouldn't mind a Macbeth auto-matching tool, but it seems like something that would be more dangerous than useful most of the time. As much as I've tried over the years, I'm generally unable to engineer myself a perfect world.


I'm also not sure the Matchbox GUI would support enough to make this happen. Maybe make a general feature request with ADSK and post it here for us to upvote?


Very much agree here. If you move from overcast to mixed to harsh mid-day sun, you have a lot more than just color balance to worry about: shadow quality, specular highlights vs. flat lighting. Making it look like it was all shot at the same time is way more complicated than a Macbeth chart. You're trying to match dynamic range and a range of lighting and texture qualities that have changed between the shots.

I mean, does it hurt to have a chart shot? No; if you can get them, no harm. They can be more useful as a reference. If you dug yourself a hole and would like to go back and check how it looked, or make sure the camera wasn't misconfigured, that's where they come in handy. But not for the 'simple button'.

Next time you’re in that scenario, have them fly a big silk over talent and immediate area and relight it with soft light. That way at least talent will be consistent. As long as the background isn’t like right behind talent but more in the distance, you might be able to even that out more or worst case replace it in some shots.

If they can afford a Bolt for the camera, they can afford a small crane for the fly swatter and the gennie for the 18Ks.


You can see if they can use a slate that has a mini color chart at the top; most of them do now. That way they catch two flies with one swat. And it will always be in focus :slight_smile: It won't be a full Macbeth pattern, so gizmos and Resolve won't work, but you have at least a good reference for black, white, and RGB chips.


For the purposes of this discussion, we’ll have to assume that it is possible to shoot macbeths on set. If you’ve never managed it, you have to trust me that it’s been done. And yes, a colour chart would not be much help going from diffuse lighting to harsh direct sunlight. :upside_down_face:

I’ll attempt to describe another scenario…

shot 1: Backplate for CG.
shot 2: Backplate for CG. For whatever reason (maybe weather, maybe time of day, maybe a different day, maybe a different unit, maybe this plate is in a studio) the light or colour is not the same, but is in the realm where it can be colour adjusted to align.

Macbeths and HDRs have been captured for each shot.

Options in post:

  1. Treat the 2 shots as 2 completely separate CG lighting setups.
  2. Tech grade the 2 backplates and HDRs to match (using the macbeths). Light the scene once in CG.

Now imagine that but with 10 - 20 shots

Option 1 is crazy
Option 2 is made easier with a colour balance tool.

Yes, you might have to adjust the result manually. But saying that negates the need for the tool is like saying there’s no point in a stabilizer tool because sometimes you have to add an offset after it.

Here’s my request on Flame Feedback. Upvote if you agree please:

I’ll differ on that one. An experienced colorist with a good app, a control panel, a scope and a reference monitor can white balance your shot better and faster than you dig up that Macbeth chart and futz with it afterwards.

It’s a matter of experience and practice.

The problem is that we expect people to wear many hats at the same time, and sometimes that hat is a bit more like a sheet of 216 than a solid hat. Not their fault; bad expectations from management.

We all have to get off the 'do more with less' train at some point…

That’s not option 1, that’s somebody else doing option 2.

The colourists I work with that I'd trust to perfectly tech grade each plate are generally pretty busy people. I reckon I could load the macbeth from my library before their client-attended session ended.

View Transforms negatively clamping footage - #21 by jordibares

This mysteriously jumped forum threads, so putting it back here…

Fair enough. Those are more technical applications than what I usually encounter when working on color in film and commercial/beauty work.

I spent some time looking into what it would take to add this to Flame and how the algorithms actually work.

First of all, I don't think this can be accomplished via a Matchbox shader. Matchbox shaders aren't general purpose plugins but front-ends for GLSL shaders that get executed on the GPU. Excellent for some tasks, but not for everything that core Flame doesn't handle. They're a unique niche.

There are UI limitations for Matchbox shaders, though we could probably provide an overlay of the Macbeth patch positions, and it would be up to you to crop and resize the image so that it lands right under the overlay. Not as easy as the other tools which let you just drag a grid around.

But the bigger limitation is that Matchbox shaders execute everything all the time. And you don't want to keep running the matching in real time (see Leo's comment on crok_new_balance, whose code I looked at as well). You want to do it once and then save the result, for performance and stability. I'm also not sure the matching algorithms would be suitable for GLSL, considering that Flame only supports GLSL 1.2/1.3. And I'm not sure how that is affected by the fact that Flame is no longer based on OpenGL but now translates everything to Metal/Vulkan.

Anyway, so no Matchbox.

However, it may be feasible to write a Python script that handles this, not dissimilar to how MLTW works. Run it on a clip and it generates a 3D LUT that is saved in the library, which you can then apply via the CM node. There are UI challenges here too, but there may be ways of auto-detecting the Macbeth markers (see example).
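
For the LUT-generation half, here's a rough sketch of what such a script could write out: a .cube file (the format Resolve exports and Flame can load) baked from whatever chart-derived correction was fit. The `correct` function below is a stand-in placeholder, not a real matching algorithm.

```python
# Sketch: bake a correction function into a .cube 3D LUT.
# `correct` is a placeholder for the chart-derived transform.

def correct(r, g, b):
    # placeholder correction: a simple warm/cool rebalance
    return min(r * 0.95, 1.0), g, min(b * 1.05, 1.0)

def write_cube(path, size=17):
    with open(path, "w") as f:
        f.write('TITLE "macbeth_balance"\n')
        f.write(f"LUT_3D_SIZE {size}\n")
        # .cube convention: red varies fastest, then green, then blue
        for b in range(size):
            for g in range(size):
                for r in range(size):
                    rr, gg, bb = correct(r / (size - 1),
                                         g / (size - 1),
                                         b / (size - 1))
                    f.write(f"{rr:.6f} {gg:.6f} {bb:.6f}\n")

write_cube("macbeth_balance.cube")
```

The resulting file can then be dropped into a LUT directory and applied via the CM node, exactly as the post above describes.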

There is an existing implementation of the histogram matching algorithm in the OpenCV libraries which has a Python binding.

See here: Automatic color correction with OpenCV and Python - PyImageSearch
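
For reference, the core of histogram matching can be sketched in a few lines of pure Python per channel (OpenCV/scikit-image ship optimized versions; the 8-bit sample lists here are made up): remap source values so their cumulative distribution follows the reference's.

```python
# Sketch: per-channel histogram matching on 8-bit values (toy data).

def cdf(values, levels=256):
    """Cumulative distribution over the value range."""
    hist = [0] * levels
    for v in values:
        hist[v] += 1
    total = len(values)
    out, run = [], 0
    for h in hist:
        run += h
        out.append(run / total)
    return out

def match_channel(source, reference, levels=256):
    cs, cr = cdf(source, levels), cdf(reference, levels)
    # For each source level, find the reference level with matching CDF.
    # j never rewinds because both CDFs are non-decreasing.
    lut, j = [], 0
    for i in range(levels):
        while j < levels - 1 and cr[j] < cs[i]:
            j += 1
        lut.append(j)
    return [lut[v] for v in source]

dark = [10, 20, 30, 40, 50]
bright = [100, 120, 140, 160, 180]
matched = match_channel(dark, bright)   # dark values remapped upward
```

The real tool would run this (or a matrix/curve fit) only on the pixels inside the detected chart patches, not on the whole frame.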

Don’t have time to work on this right now, but maybe down the road… Or if someone wants to collaborate on it let’s talk.

On second thought, it seems you guys are doing this as part of the pipeline. So if we're relying on a Python script to generate these LUTs, there's almost no value in doing it inside Flame; you could just run such a script as part of a batch in the pipeline and generate the LUTs, which can then be loaded in Flame or Nuke or wherever else.