"New Colorspace" and "Legacy Colorspace" questions

I’m fond of the color management in Flame, but I do run into trouble with it. The trouble almost always comes when something is being done outside the modern definitions of ACES/scene-linear/LogC.

So, if something has an old Nuke-style linear that’s just a gamma shift of 2.2 or 2.4, or if someone built something that looks good under whatever the Flame Legacy linear modes are, how would you correctly convert it into a true scene-linear file that can zoom through CM nodes with no problems AND look correct under a modern viewing transform like Alexa Rendering or ACES to SDR Video?

I know there are ways I can move the image into ACES, but the goal here is to do so without upsetting how the image looks.

I always end up using view transforms for this, but I believe they are not technically correct.

What is "old-style Nuke linear"? Do you mean the stuff where people squeeze the whole linear range into 0-1 to make their non-tonemapped, horrible view transforms happy?
I haven’t seen that in a LOOONG time, mostly just from people creating motion graphics in Nuke.

I don’t think there is a way to do what you want, except for using the linear file in ACES as you said and then adding a bunch of transforms at the end to get back to the look the old view transform gives you. Basically you create your own RRT/ODT that’s just a simple gamma operation: you would output linear/709 from whatever, then do a gamma operation on those values from linear to 2.4 or sRGB (the Rec.709 EOTF, Nuke’s 709). After that you can add an ACES Rec.709 (or whatever you output before) to ACEScg or LogC transform, and then use the modern view transforms on top.
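That gamma leg of the chain is just pure math; here is a minimal Python sketch (no OCIO involved, the function names are mine, and the final step back to ACEScg would be the inverse ODT in Flame/OCIO on top of this):

```python
# Sketch of the "homemade ODT" gamma step described above:
# take display-linear values and encode them with a simple 2.4 power
# or the piecewise sRGB curve (the Nuke "sRGB/709" viewer idea).

def encode_gamma24(x: float) -> float:
    """Display-linear -> gamma 2.4 encoded (simple power law)."""
    return max(x, 0.0) ** (1.0 / 2.4)

def encode_srgb(x: float) -> float:
    """Display-linear -> sRGB encoded (IEC 61966-2-1 piecewise curve)."""
    x = max(x, 0.0)
    if x <= 0.0031308:
        return 12.92 * x
    return 1.055 * x ** (1.0 / 2.4) - 0.055

# Middle grey under each encode (roughly 0.49 and 0.46):
print(round(encode_gamma24(0.18), 4))
print(round(encode_srgb(0.18), 4))
```

From there, interpreting the result as Rec.709 display and running it through an inverse ODT gets you to ACEScg, which is the part you would leave to the color-managed tools.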

Hope that makes some sense :smiley:

1 Like

Colorspace is just a problem. Most people don’t understand it, and it becomes very difficult to have conversations with the people who created an image but can’t really say what colorspace it’s in. For example, I was recently given a comp from a Nuke artist with no gamma (linear). I’d seen his nightly postings and tried to reproduce the color; it took a while. When I asked, he told me it was linear, ACES, Rec709… which obviously is not possible.

[Also, using ACES as a working colorspace is insane (at least for TVCs), just my opinion. The gamut is too huge, and it was never meant to be a colorspace you work in (I know we now have ACEScg etc.), but I dislike all of it strongly.] #EndRant

But it’s a problem of terminology and education. I think it would be a great benefit to all to have an in-depth Q&A with Doug Walker. I’d pay a fee to be invited to such a thing.

1 Like

I have dedicated some time to getting a better understanding on this.

It is a tricky thing to enthuse others in our office about comping in a high dynamic range. For years we have just been switching gamma on and off, and I am mindful of not forcing them into doing anything they find out of their comfort zone.

But one thing that never fails to impress people is when I show them the difference between scene-referred and display-referred. Not just the exposure range: I show the two images side by side, chuck a defocus blur on, and boom.

So why do I feel like I am pushing water uphill with my CG department? I am still getting renders in the range of 0-1. Surely, of all the departments, isn’t it harder for them to introduce this clamped range?

So anyway, this is what I am getting from them, and I want to push their display-referred linear into scene-referred territory.

I tried colour correcting, increasing the highlight intensity.
I have tried an ugly transform from Legacy Linear to Photomap.
And a strange combination of adding gamma and then using some inverse tone mapping options. But I haven’t stumbled upon a solution that I am happy with.

I made this transition a few years ago, going from the “0-1” style to proper scene-referred linear in a medium-sized studio. Here are some of my learnings/pointers.

- It’s very important to make people understand what scene-referred means. As you said, the comparison with blurs/DoF etc. is very good. I usually also show a picture of someone taking a picture with their phone: the “scene” is completely blown out while the image on the phone screen is nice and tonemapped.

- Make sure people grasp the concept that their viewer is a “virtual camera” that films the scene, just like you would film something in a real scene: you must expose up/down to see all the detail, noise, highlight details etc., because our current monitors are not capable of showing all of it at the same time (that’s where HDR comes into play :slight_smile: )

- Make up some example light values, like: look at this specular highlight that we filmed, it’s at a linear value of “48” or whatever, so they understand the correlation between real-world light intensities and linear values.
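The “example light values” idea is easy to make concrete: exposure in scene-referred linear is just a multiply by 2^stops. A quick Python sketch (the numbers are illustrative, not from any real camera):

```python
import math

def expose(value: float, stops: float) -> float:
    """Scene-referred exposure: each stop doubles or halves the light."""
    return value * 2.0 ** stops

# That "48" specular highlight: how many stops above display white
# (1.0) is it? log2(48) is roughly 5.58 stops.
stops_over = math.log2(48.0)
print(round(stops_over, 2))

# Expose down by that amount and it lands exactly at 1.0:
print(expose(48.0, -stops_over))
```

This is exactly the exposing up/down in the viewer that a clamped 0-1 render can never give you, because everything above 1.0 was thrown away before comp.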

We then also deployed OCIO where we could, in our case V-Ray/Nuke etc., at first using custom tonemapping curves/LUTs based on the “Alexa 709” process. This is necessary so the lighters “see” their light being tonemapped into their display space.

Why they put everything into 0-1 comes mostly down to them lighting with a normal sRGB view transform that isn’t tonemapped.

You can’t just fix it in comp that easily, as the light intensities don’t have anything to do with real-world values relative to each other… one is like filming in a studio with very controlled lighting and one is outside.
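The difference between a naive clamped viewer and a tonemapped one, which is what pushes lighters into the 0-1 habit, can be shown with a toy curve (Reinhard here purely as a stand-in; the Alexa/ACES curves are different, but the idea is the same):

```python
def clamp_view(x: float) -> float:
    """Naive sRGB-style viewer: everything above 1.0 is simply lost."""
    return min(max(x, 0.0), 1.0)

def reinhard_view(x: float) -> float:
    """Toy tonemap: compresses highlights instead of clipping them."""
    return x / (1.0 + x)

for v in (0.18, 1.0, 4.0, 48.0):
    print(v, clamp_view(v), round(reinhard_view(v), 3))
# Under the clamp, 4.0 and 48.0 look identical (both hit 1.0);
# under the tonemap they still read as different highlight levels,
# so there is no incentive to cram everything into 0-1.
```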

We then later switched to full ACES. To be honest it was pretty frightening to commit to a full ACES pipe; thankfully Netflix “forced” us to do it, so it’s not like anyone had a choice.

It’s not really the lighters that are having problems with this; they adapted rather quickly. For them it’s just very nice to pump in more light and get more accurate GI bounces etc. “for free”.

The biggest issues are matte painting, Photoshop and lookdev. Substance Painter doesn’t support ACES or OCIO, so you need to make custom LUTs… all in all very annoying.
Photoshop kinda works with a log workflow and custom ICC profiles, but you are limited to 16-bit because Photoshop is stuck in the 1990s… Affinity is way more modern.

The whole texture-authoring and conversion-to-ACES process is making this very much “not fun” for those departments, and I understand their pain. Just downloading some sRGB textures and slapping them on doesn’t really work anymore (I heard that’s better with Arnold).

- Compositing wasn’t very affected by this. Some pain points with negative color values on chroma screens, but that’s easy to handle. We used client LUTs before anyhow that were mostly based on some kind of lin->log->display transform, mostly Alexa based, so for them the whole tonemapping thing wasn’t new. For full-CG shows, however, it was very different (no more random softclip nodes everywhere… hehe)

It was one of my main projects, pushing the color pipeline further, and it’s a topic very close to my heart. Let me know if you want some more info :slight_smile:

4 Likes

Thanks @finnjaeger good to know.

I have a couple of mountains to climb.

We are a small post house who still work with graded picture and although we have recently dabbled with Arnold we predominantly use redshift.

I often try to promote an ungraded workflow and have completed several longform projects this way, but commercials are still our main revenue, and those clients still like the grade being done upfront (even though they seem increasingly incapable of getting that signed off and insist on regrades or post-grades!).

So we currently use a ViewTransform to invert the graded footage into PhotomapLC linear. Not an ideal process, but I believe it gives us scene-referred material with Rec.709 primaries.

In Flame we comp using Photomap or ACEScg as the working colour space.
A shift in the primaries will take the CG into ACES; the tone mapping in Photomap looks remarkably similar to ACES.

I am currently working my way through a potential grade workflow that uses ACES so that the material we get from grade is ACEScg.

Writing this down now, I know it sounds like a hack, because it is, but that is why I am posting: in the hope that discussion might offer up some better alternatives.

I am yet to complete my ACES grade pipeline, but my hope is that it gives us graded footage with scene-referred linear scope for compositing and potential post-grading.

1 Like

I’d say comping on graded material is not a good idea.

Grading, 99% of the time, is a non-linear and non-invertible color transform, which leads to problems at the comp stage (keying becomes a pain in the ass, etc.). Good news: straight color transforms from a known colorspace to something scene-referred are free from these problems. I did a Logik Live on this topic a couple of weeks ago:

Intro to color management

I showed it in Resolve + AE/Fusion, but the same principles would apply to any other software combination.

Feel free to ask any questions.

Hope this helps

1 Like

I saw your Logik Live, and thanks so much for sharing your knowledge with us @val. So much of what I have learnt about colour management has come from generous contributions like this.

I know comping on graded material is not great. It is not my preferred workflow, but it is what my commercial clients expect. I will continue to try to sway them whenever I can.

You’re welcome.

Do you have grading in-house, or is it an external process?

Anyway, the correct workflow (where everyone is happy in the end) allows you to work on VFX in parallel with grading. The colorist will do their job and your team will do yours, but on raw sources. If everything is done correctly (colourspace, metadata and timecode not lost), the colorist will reapply their magic to the VFX-ed shots in a couple of clicks.

I know that every situation is different, and it’s hard to advise without seeing the whole picture, but I’d try to talk with the color guys first. If they are on Baselight you’re in luck; all other software is a little bit trickier, but still doable.

1 Like

I wish we were using Baselight. We could then use the BLG node and problem solved.
Resolve has really taken over that side of things here in New Zealand, but we do grade in-house, so we can easily run shots through the grade as many times as we like.

1 Like

There will be a sad day for earth when only Resolve remains as a grading system… And I say this as a Resolve user and fan since version 7.

But to your problem: Resolve has a nifty feature called “Flat Pass”. Basically what it does is take your sources, debayer them (if they are RAW) to your preferred settings (one for the whole project, or custom if you changed it on a clip-by-clip basis), apply the color management settings (if you work in an ACES or YRGB Color Managed workflow), and output without the grade applied. Also, Resolve will create an .xml along with the rendered files, so you can easily transfer everything into Flame, do what you need to, and return the VFX-ed shots to your color guys. They reapply grades with a simple middle mouse click or, if there are a lot of them, just reconform to the new sources.

Problem solved))))

Of course you need to test it, but this workflow works perfectly fine on the other side of the earth, here in Ukraine; it should not be that different in NZ)))

2 Likes

I can only second that: you should always comp on ungraded material when CG is involved. Anything else and you are just guessing, and you will not get back anything scene-referred.

3 Likes

I feel like this is the gateway drug to linear workflows, because it makes CG comp so easy.

The extended range and color retention is nice and all, but having CG look correct with zero effort is really something.

3 Likes

haha yes, when I showed compers how well CG can fit when the textures and the plate are in the same colorspace (ACES), their jaws dropped a bit :smiley:

1 Like

@PlaceYourBetts… The simplest solve for you is to render the grade (out of Resolve) as EXR files (with the default float/half-float settings, etc.). This gives you all the range of the original, so you can recover clipped whites/blacks etc. It’s still 709, so you would still need to convert it into ACEScg etc.

Another way (which I just set up for a freelance grader) is to add a simple OFX colourspace transform at the beginning of the node tree (basically going from ACEScg to arri_logc). So Flame did the conform of the ARRI LogC, and I converted to ACEScg and exported all the plates. We then re-conformed those plates in Resolve, and at the same time the Nuke compers started to work as well. We set a first look in Resolve (with no finished comps), then just dropped the comps into Resolve and applied the grade as the shots got done later in the week. Many ways to skin a cat!!!
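For the curious, the lin-to-LogC leg in that kind of setup is just a log curve. Here is a sketch of the ARRI LogC (v3, EI 800) encode using ARRI’s published constants; in practice you would let the Resolve/Flame/OCIO transform do this, this is only to show what it is doing:

```python
import math

# ARRI LogC3 (EI 800) encoding constants from ARRI's white paper.
A, B, C, D = 5.555556, 0.052272, 0.247190, 0.385537
E, F, CUT = 5.367655, 0.092809, 0.010591

def lin_to_logc(x: float) -> float:
    """Scene-linear -> LogC (EI 800) code value. Linear toe below the cut."""
    if x > CUT:
        return C * math.log10(A * x + B) + D
    return E * x + F

# 18% grey lands at the familiar ~0.391 LogC code value.
print(round(lin_to_logc(0.18), 3))
```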

1 Like

This might help you as well :slight_smile:

You need to be careful with developing RAW, and there is no need to export 709 EXRs from Resolve either; you can debayer or convert straight in Resolve and get out pretty ACES EXRs :slight_smile:

1 Like

@finnjaeger The real battle with ACES in Resolve has more to do with freelance colourists having VERY unique ways of working (ie… what’s a LUT??). I’ve not seen one pass through this facility who has embraced an ACES workflow. They all have their own ‘secret sauce’ for going from CameraX to a Rec709 look. That being said, Resolve does feel very, very different when you switch it into ACES mode. The control surface reacts very differently. This is enough to put almost everyone off it. Who wants to be in an intense client session (remember those?) and not be 100% comfortable with what the software is doing? Hopefully Resolve 17 has some new tricks to overcome this, and more colorists embrace ACES.

1 Like

I am actually totally fine with colorists not wanting to grade in ACES; even I don’t do that, and I love me my ACES :slight_smile: The only practical advantage is that you can “just” swap the ODT and have a P3 master, although I believe you should still do a trim pass on a projector, but I guess it can work well.

That said, color-managed workflows will be pretty important going forward with HDR, so there is that.

The new HDR grading tools and the new managed mode in Resolve, with their own secret-sauce intermediate colorspace, are basically the same idea as ACES, so yeah, sure.

Call me crazy if you like, but I am considering converting my footage into ACES and then changing the primaries to Rec.709,
just until my CG department pulls itself out of 2012.

They get full scene-referred linear, but with the Rec.709 colour gamut.
This has to be better than PhotomapLC, since no one other than Flame has that colourspace.

1 Like

You can tag your input clips as linear/709, and then it will convert them to ACES automatically and look proper. Maybe not what the CG dept has seen, but a more sane image. This is the same thing you do when you want to render an old asset with 709-gamut textures but want the ACES tonemapper for lighting.
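Under the hood, that linear/709 to ACEScg step is just a 3x3 gamut matrix. A sketch using the commonly published Bradford-adapted BT.709 -> ACEScg coefficients (values rounded here; check them against your OCIO config before relying on them):

```python
# Commonly published linear BT.709 -> ACEScg (AP1) matrix,
# D65 -> D60 Bradford adapted, rounded to five decimals.
M_709_TO_ACESCG = [
    [0.61310, 0.33953, 0.04737],
    [0.07019, 0.91634, 0.01345],
    [0.02062, 0.10957, 0.86961],
]

def rec709_to_acescg(rgb):
    """Apply the 3x3 gamut conversion to a linear RGB triple."""
    return [sum(m * c for m, c in zip(row, rgb)) for row in M_709_TO_ACESCG]

# White is preserved (each row sums to ~1.0)...
print([round(v, 3) for v in rec709_to_acescg([1.0, 1.0, 1.0])])
# ...while a saturated 709 red lands well inside the wider AP1 gamut.
print([round(v, 3) for v in rec709_to_acescg([1.0, 0.0, 0.0])])
```

This is also why tagging the clips correctly matters: the matrix only changes the primaries, so scene-linear values stay scene-linear and the ACES tonemapper does the rest at view time.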

2 Likes