Stripping those default ACES viewing rules is a must if you want to stay sane. Admittedly, it’s very daunting if you are dipping your toe into ACES for the first time.
I’ll share my Viewing Rules for this “Waka” project I am working on atm:
The only things that might be interesting here are the two custom viewing LUTs I have at the top, “Waka_Linear_LUT” and “Waka_LUT”.
I don’t work with Sony cameras very often; I am most used to the Arri Alexa, which is why I keep the “Alexa_rendering” one around. It makes more sense to use the ACES rendering on the log and linear rules so that you get the same look as you switch around, but “Alexa_rendering” is an older version of the LogCv3 conversion and I keep it handy since I find it a little softer.
Anyway. The Sony Venice camera was shooting SLog3 and the DOP had a viewing LUT.
Since I am working ungraded I could work under any viewing LUT; it doesn’t really matter. But since we are presenting everything under this LUT, it is good to have it available throughout the process. The viewing LUT is designed to give the SLog3 a cool look and convert it to rec709 for viewing. If I want to see it throughout my entire comp pipeline, I need another version that works on linear (ACEScg). I built that in ColourMgt-Custom Colour Transform and exported it as a LUT that I could then import into my ViewingRules.
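In case it’s useful, here is a rough sketch of that same bake done with the OCIO Python bindings instead of the ColourMgt-Custom Colour Transform route: chain an ACEScg to SLog3 conversion in front of the viewing LUT and sample it out to a new cube. The colourspace names, file names and the exact PyOpenColorIO calls are assumptions against a generic OCIO v2 ACES config, so treat it as an illustration of the idea rather than the exact setup I used.

```python
# Sketch only: bake an ACEScg-input version of the SLog3 viewing LUT
# with the OCIO v2 Python bindings. Colourspace names and file names
# are assumptions; check them against your own config.
import PyOpenColorIO as ocio

config = ocio.Config.CreateFromFile("config.ocio")

# Chain: ACEScg -> S-Log3/S-Gamut3.Cine, then the DOP's viewing LUT.
group = ocio.GroupTransform()
group.appendTransform(ocio.ColorSpaceTransform(
    src="ACES - ACEScg",
    dst="Input - Sony - S-Log3 - S-Gamut3.Cine"))
group.appendTransform(ocio.FileTransform(src="Waka_LUT.cube"))

cpu = config.getProcessor(group).getDefaultCPUProcessor()

# Sample a 33x33x33 lattice and write it as "Waka_Linear_LUT.cube".
# (A flat 0-1 lattice is a poor fit for scene-linear input; a proper
# bake would put a shaper/log encoding in front. This just shows the
# shape of the bake.)
size = 33
with open("Waka_Linear_LUT.cube", "w") as f:
    f.write("LUT_3D_SIZE %d\n" % size)
    for b in range(size):
        for g in range(size):
            for r in range(size):  # red varies fastest in the .cube format
                rgb = cpu.applyRGB([r / (size - 1.0),
                                    g / (size - 1.0),
                                    b / (size - 1.0)])
                f.write("%.6f %.6f %.6f\n" % tuple(rgb[:3]))
```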
For my part, I would love to see some other people’s Input Rules.
Tagging is all-important in a Colour Managed Workflow, and I have set my Rules up to apply as many tags automatically as possible. I have just modified the default bunch you get with an ACES project, but I have tried to make them catch as much of our material as possible.
The Pattern and Extension fields can be a little limiting, and I would love to see this expanded to allow some knowledge of the folder structure files are found in; I have created a Feature Request if you agree with me.
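To make that limitation concrete, here is a tiny plain-Python sketch of roughly what a Pattern + Extension rule can express (this is not Nuke’s API, and the patterns and colourspace names are invented for illustration): two files with the same name in different folders always get the same tag, because the rule never sees the path.

```python
# Illustration only: roughly what Pattern + Extension input rules can express.
# Not Nuke's API; the patterns and colourspaces are made up.
import fnmatch
import os

INPUT_RULES = [
    # (filename pattern, extension, colourspace tag)
    ("*_slog3_*", "mxf", "Sony S-Log3 / S-Gamut3.Cine"),
    ("*_lin_*",   "exr", "ACEScg"),
    ("*",         "exr", "ACES - ACES2065-1"),
    ("*",         "*",   "Output - Rec.709"),  # fallback
]

def tag_for(path):
    """Return the colourspace of the first rule matching the file name only."""
    name, ext = os.path.splitext(os.path.basename(path))
    ext = ext.lstrip(".").lower()
    for pattern, rule_ext, colourspace in INPUT_RULES:
        if fnmatch.fnmatch(name.lower(), pattern) and fnmatch.fnmatch(ext, rule_ext):
            return colourspace
    return None

# The limitation: both files get the same tag even though the folder names
# ("plates" vs "ref") tell a human which one is really Rec.709 reference.
print(tag_for("/job/waka/plates/shot010_v001.exr"))  # ACES - ACES2065-1
print(tag_for("/job/waka/ref/shot010_v001.exr"))     # ACES - ACES2065-1
```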
Your two top transforms in that pic are linear (or log), and while it may not be correct (I can’t remember what Nuke considers “raw”), video is what I’ve always considered raw; when I’m in video I can look at any image and tell whether it’s log (it will look washed out) or linear (it will look dark).
Tapping on the LUT name in the GUI to bypass it will also put you into video, but without the exposure controls, so, like you, I like to add a video transform.
The only thing I do differently is that I don’t have ACES to SDR video and Alexa rendering both on. Use whichever one you want (they will both work with all correctly tagged footage regardless of camera origin), but having both live could get confusing since they are similar but different.