I've been using the new Match Grain node a lot. It's quite brilliant - I assume you have to manually extend the analysis to cover the duration of the clip? A T-click doesn't bring in the correct duration. And surely this tech would be good with a degrain too.
Upvote this Improvement Request. Match Grain: Add an auto range for analysis
I can't say enough how brilliant this Match Grain node is. I'm using it in all my comps and it's unbelievable how close the grain is to the original clip. Utterly amazing.
Jon, have you never done the subtract-grain-from-a-denoised-plate trick? This is just doing the same thing.
It’s not the same. It’s analyzing the grain and recreating it. Not just adding the difference.
It’s a lot more nuanced than a simple subtract/add comp, which is why everyone has migrated to variations of Nuke’s DasGrain, which this implementation is related to.
When you just subtract and add, you put the exact grain back pixel for pixel. But what if the grain in the area you were working on looked different? In some scenarios you can see noticeable patterns in the grain that give the comp work away. We wouldn't have wasted countless topics and discussions if add/subtract were the gold standard.
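For anyone who hasn't done the old trick, it's just plate arithmetic; a minimal NumPy sketch, assuming float plates of matching size:

```python
import numpy as np

def extract_grain(original, degrained):
    # The grain is whatever the denoiser removed.
    return original - degrained

def regrain(comp, grain):
    # Put the exact same grain back, pixel for pixel. This is where
    # the pattern problem comes from: the comped region inherits the
    # grain that belonged to whatever used to be underneath it.
    return comp + grain
```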
What MatchGrain does is sample the grain in a different region of the image (you can pick where), and then create a new grain patch based on that sample, employing a Voronoi pattern for the tiling so that there are no obvious repeating seams.
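To make the Voronoi idea concrete (the node's internals aren't public, so this is only a sketch of the general technique, with invented function names): every pixel belongs to the cell of its nearest random seed point, and each cell reads the sampled patch through its own random offset, so no regular tile grid ever lines up.

```python
import numpy as np
from scipy.spatial import cKDTree

def disperse_grain(grain_patch, out_h, out_w, n_seeds=256, seed=None):
    rng = np.random.default_rng(seed)
    ph, pw = grain_patch.shape

    # Random seed points over the output frame; each pixel belongs to
    # the Voronoi cell of its nearest seed.
    seeds = rng.uniform([0, 0], [out_h, out_w], size=(n_seeds, 2))
    # Every cell reads the sample patch through its own random offset.
    offsets = rng.integers(0, [ph, pw], size=(n_seeds, 2))

    ys, xs = np.mgrid[0:out_h, 0:out_w]
    pix = np.stack([ys.ravel(), xs.ravel()], axis=1)
    _, cell = cKDTree(seeds).query(pix)

    # Look up the grain at the per-cell offset, wrapping at the edges,
    # so no regular repeat ever appears.
    oy = (pix[:, 0] + offsets[cell, 0]) % ph
    ox = (pix[:, 1] + offsets[cell, 1]) % pw
    return grain_patch[oy, ox].reshape(out_h, out_w)
```

In practice you would run this per channel and per frame with fresh offsets.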
There are some additional features at work, and a lot more control over the grain it puts back, rather than a straight copy.
You should watch the videos and the specific examples Fred has shown to understand where the differences are.
I stand corrected, however I've always had good results using that method combined with Ivar's ReNoise Matchbox.
Right, there were various tools over time to bridge the gap between a plain add/subtract and DasGrain. Some were better than others, some needed more hand-holding. This new node finally skips all that and goes right for the goal line.
Part of the issue was that the full functionality went beyond what a Matchbox shader could do. It needed to be either a complex batch node tree or some custom code.
This is better than doing that old trick!
Ghosting is a problem with that technique. Dispersion (tile repeating based on a Voronoi pattern) is the main advantage of this node.
Ghosting (different densities of grain due to inconsistent luminance values in the source clip) also occurs with this new tool, so it is not immediately obvious how to extract grain from one shot and apply it everywhere else.
Shoot some black, grey and white cards with your camera.
It will probably cost you about 45 seconds of shoot budget so you may get pushback from production or the camera department.
During beta there was a discussion on how to use grain from one shot in another.
There are two ways: one straightforward, the other the ADSK way. I didn't pay attention to their way, so I'll focus on the straightforward one.
Shot B Comp     → Front ------\
Shot A Degrain  → Degrain -----→ Match Grain (Disperse)
Shot A Original → Original ---/
Simple. This will apply the grain structure from Shot A onto the comp from Shot B.
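In script form, the same wiring would look something like this; match_grain and its parameter names are hypothetical, just to make the input mapping explicit:

```python
# Hypothetical pseudocode; Match Grain is a node, not a Python API.
# The point is only which clip feeds which input.
shot_b_regrained = match_grain(
    front=shot_b_comp,         # the comp that needs grain
    degrain=shot_a_degrained,  # Shot A with its grain removed
    original=shot_a_original,  # Shot A with its grain intact
    disperse=True,             # retile so Shot A's layout doesn't show through
)
```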
I believe the ADSK workflow Alan is referring to is covered in this tutorial at 4:23. The big difference is exporting the extracted grain vs. bringing the Shot A Degrain + Shot A Original clips into the Shot B batch. The only thing missing from my tutorial is turning on Dispersion to disperse the grain in the new shot.
The ghosting will not magically disappear, but you have tools to handle it within that node.
Either by using a grain sample you carry forward from another clip (as Alan described), or by using the sample region to sample it from an area that has even luminance (assuming one exists in your shot) and then using the dispersion.
Also, the ghosting, if it exists, will be different from a standard add/subtract, because the grain is also being normalized based on the source and destination (an additional mult/divide operation), which is not usually present in the quick borrowed-grain workflows.
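As a rough illustration of that mult/divide (the node's actual model isn't public, and the amplitude curve below is invented for the example):

```python
import numpy as np

def grain_amplitude(level):
    # Invented amplitude curve for the example: grain tends to be
    # strongest in the mids and weaker near black and white. A real
    # tool would fit a curve like this from the analyzed plate.
    return np.interp(np.clip(level, 0.0, 1.0),
                     [0.0, 0.5, 1.0], [0.004, 0.012, 0.006])

def normalized_regrain(comp, src_original, src_degrained):
    grain = src_original - src_degrained
    src_amp = grain_amplitude(src_degrained)  # amplitude where grain came from
    dst_amp = grain_amplitude(comp)           # amplitude where it lands
    # Divide out the source response, multiply in the destination's,
    # so a bright comp area doesn't inherit shadow-strength grain.
    return comp + grain / np.maximum(src_amp, 1e-6) * dst_amp
```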
I think the part that gets overlooked is that dispersion needs to be turned on specifically (not a bad thing, and consistent with DasGrain).
The irony of course is that streaming services are moving towards denoised masters, which aids in bit-rate reduction, and using hardware or software noise generators to help with playback debanding.
There are already streaming codecs that do this in the way they compress/decompress.
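AV1's film grain synthesis is the obvious example: the encoder signals a small set of grain parameters and the decoder regenerates grain at playback. A toy sketch of the general idea, not the actual spec:

```python
import numpy as np

def synthesize_grain(h, w, strength=0.01, ar_coeff=0.35, seed=0):
    rng = np.random.default_rng(seed)
    white = rng.standard_normal((h, w))
    grain = np.zeros((h, w))
    for y in range(h):
        for x in range(w):
            # Each sample borrows from its left and top neighbours,
            # which gives the white noise a film-like softness.
            left = grain[y, x - 1] if x else 0.0
            up = grain[y - 1, x] if y else 0.0
            grain[y, x] = white[y, x] + ar_coeff * 0.5 * (left + up)
    return strength * grain
```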
It’s true - vfx pixel polishing asymmetrically contributes to the entropy of the universe, in so many ways…
Curious if this would work for what I'm doing. I've got a 10-minute clip with a small area being blurred; I'd like to add grain back to this area, but I'm not degraining the whole clip to do it.
If I sample, say, 20 frames, would this work? The Autodesk way, the grain stops after the rendered normalized pass.
Loop
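i.e. cycle the rendered grain pass over the clip; grain frames are temporally uncorrelated, so the wrap point is rarely visible. A sketch, assuming the pass is a list of frames:

```python
def looped_grain_frame(grain_frames, frame):
    # Straight modulo loop over the short rendered grain pass.
    return grain_frames[frame % len(grain_frames)]
```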
I thought the obvious wouldn't work. Will try.