First and foremost, to get it out of the way: DasGrain is awesome and makes adding noise back in (who works on film these days?) such a breeze, and I’d love a similar tool in Flame, mainly because it is so quick and easy.

I would like to say, though, that I can still get a decent result with the built-in regrain tool. I go to the View screen, lift the gamma a bit, then solo the red channel. Then, in the regrain tool with proportional turned off across the board, I dial in the level and size of the grain to match the plate. I repeat the same for the green and then the blue channel. I always seem to get a close enough result, and it doesn’t take a great deal of time either.
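For what it’s worth, that per-channel matching has a simple numeric analogue: measure each channel’s noise level in a flat patch. A small NumPy sketch (the plate and the sigma values here are invented purely for illustration):

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical RGB plate: flat grey with a different grain level per channel
# (blue is usually the noisiest); these sigmas are made up.
sigmas = np.array([0.010, 0.012, 0.025])      # R, G, B grain std-dev
plate = 0.5 + rng.normal(0.0, 1.0, (64, 64, 3)) * sigmas

# Pick a flat region and measure each channel's grain level there,
# the numeric analogue of soloing R, G and B in turn and eyeballing it.
patch = plate[10:50, 10:50]
measured = patch.std(axis=(0, 1))
```

Those three measured numbers are essentially what you’re dialing in by eye, channel by channel.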

Now, I understand the whole process: denoise the plate, subtract the “grain” (it’s noise unless it’s film) using the plate and the denoise pass, do your comp, then re-add the subtracted grain. But far more often than not, it brings back unwanted residue hiding in the original plate’s grain in any region you’ve changed. I can’t say I like seeing the ghost of whatever I’ve changed or removed, especially when I’ve spent so long trying to remove it. So why do so many people use and persist with this technique? I get that you want to stay as true to the original plate as you can, but if you comp your changes back over the original plate and just regrain the parts you’ve changed (which I find I need to do in the majority of situations anyway), what is the benefit of the subtract-then-add approach?!
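For anyone who wants to poke at why the ghosting happens, here is a tiny NumPy sketch of the subtract/add round trip. Everything in it is made up for illustration: the fake plate, the blemish, and a crude 3x3 blur standing in for a real denoiser.

```python
import numpy as np

rng = np.random.default_rng(0)

def box3(img):
    """Crude 3x3 box blur standing in for a denoiser (wraps at the edges)."""
    acc = np.zeros_like(img)
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            acc += np.roll(np.roll(img, dy, axis=0), dx, axis=1)
    return acc / 9.0

# Hypothetical single-channel plate: flat grey with a dark blemish, plus noise.
clean = np.full((64, 64), 0.5)
clean[20:30, 20:30] = 0.2                     # the blemish we want to paint out
plate = clean + rng.normal(0.0, 0.02, clean.shape)

# "Denoise" and subtract the grain. Because the denoiser is imperfect,
# the grain pass still carries the blemish's edges.
denoised = box3(plate)
grain = plate - denoised

# Comp: paint the blemish out on the denoised plate, then re-add the grain.
comp = denoised.copy()
comp[20:30, 20:30] = 0.5
regrained = comp + grain
```

Gamma up `grain` and the edges of the blemish are sitting right there in the pass, and that is exactly what gets added back over the paint-out.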

Please discuss…

Because adding the original grain back also adds detail back. If you gamma up the subtracted grain, you’ll see what I mean.
I use crok_renoise to add grain / noise back to areas where the original grain doesn’t work.

But if you comp only what you’ve changed back over the original plate, then you won’t have lost the detail anyway.


I would say it totally depends on what you’re doing.


In a complex show there could be multiple artists doing multiple things to your shot without you necessarily knowing it. Adding the original grain back in is, of course, the best-case scenario for protecting and preserving the plate. If this additive step brings back what you are removing, well, clone or front-source the grain pass to solve it. It always depends: there is a good/better/best situation in complex pipelines, and at some levels nobody cares while at other levels you have conference calls about grain management.

I use the additive way to add the grain/noise back in but when the “ghosting” comes back I then do an additional fix just on the grain/noise layer.
90% of the time I use this method purely for bringing that detail back to the shot.

It’s very fast and requires very little thought, so I love it.

As to why not just comp a regrained patch over the source plate, that’s a fair question. I certainly do that for ST maps and screen comps, but because the add/subtract method is so damn fast I always start there.

I’m rarely happy with either my attempts to match grain, or with the masked area grain soft blending with the plate grain, making a little fucking blurry-grain halo around the work area, so my two primary approaches are the Subtract/Add method and regraining the whole plate.

I hate grain as a VFX concept. I hate that people feel it’s the brown M&M’s of VFX. Nothing any of us make is important enough for the grain to matter. (goes back to his toothpaste commercial, telling the clients “I suggest the Vision 320T stock grain for this” like a sommelier)


Enough likes and I’ll “MasterClass this issue”.


I would like to know how to avoid this without doing grain subtraction/addition. I suspect it somehow involves Divide.

This says it all.

Unless you’re doing film (and even then…), the compression being applied to the final output of your work is going to obliterate the grain/noise.


Is this a major issue for folks? Granted, I’ve worked primarily in episodic TV for the last seven years, but the number of times I’ve had comps kicked back for grain mismatch I can count on my fingers. And when it has happened, there were far larger problems on that show in general than grain; that was just the final, most low-hanging kick in the groin from the client-side supervisors.


We work exclusively in theatrical film and 4K streaming… My god, you commercial/episodic people have it so fucking easy. These supervisors sit there and A/B frame by frame, and examine each R, G and B channel for grain. So yes, it’s a big deal.


That sucks. That sounds like an industry issue not a grain management issue. I’ve never watched a single frame of any moment of any film and thought, “grain seems off…”


The grain ghosting is a real thing and a problem. That said, using the original grain is an important consideration, because it will be the same grain that’s on the next shot that had no vfx done to it. The worst thing you can do is bring in 3rd party grain (e.g. cinegrain) unless you do the same thing to every shot. Now, could we just degrain the whole timeline and leave it that way - sure, that’s a totally valid solution, and nothing wrong with it (if it weren’t for the many people that think that having grain makes it a better shot).

Where DasGrain (or any similar homegrown technique) makes a difference is that it relies on the original grain, but samples it from a region that does not contain a pattern and then repeats that across the whole frame without visible seams (the Voronoi part of it). So you get matching grain, but no ghosting and no headaches. You could do the same with your stolen-grain workflow: simply look at the grain before you comp it back, and if you see a ghosted image in the grain but have another part of the shot where the grain is more even, just copy/paste that other region over where the ghosted image is. That has the same effect.
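To make the “copy a clean grain region around” idea concrete, here is a made-up NumPy sketch. It just tiles a ghost-free patch across the frame; the real DasGrain uses Voronoi blending so the repeats don’t show seams, which plain tiling like this will, so treat it purely as an illustration of the sampling idea.

```python
import numpy as np

rng = np.random.default_rng(1)

h, w = 64, 96
grain = rng.normal(0.0, 0.02, (h, w))    # stand-in for a subtracted grain pass

# Suppose the top-left 32x32 region is known to be free of ghosting.
patch = grain[:32, :32]

# Crude stand-in for the spread step: tile the clean patch across the
# whole frame (ceil division gives enough tiles, then crop to size).
ty = -(-h // patch.shape[0])
tx = -(-w // patch.shape[1])
clean_grain = np.tile(patch, (ty, tx))[:h, :w]
```

`clean_grain` now has the plate’s own grain character everywhere, with no image content hiding in it.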

Or you can just save yourself the hassle, get an Si license and use the Si Regrain for your shots. There are other areas of our pipeline where we can add a whole lot more value.


Depends on the compression algorithm used. Some actually do their own form of subtracting grain/noise, compressing then adding it back in again.

Possibly. But our goal is to create a clean master. What gets compressed after we’re done is a separate story.

I work in both film and episodic, which is why I raised the question in this thread. I find that more often than not the subtract-and-add technique introduces more issues than it fixes, and since the work is looked at closely I end up regraining with the separated-RGB technique I mentioned earlier. I was interested in why so many people use this technique, in case I was missing something.

I’d love something equivalent to DasGrain in Flame. Something as simple as feed in original plate, feed in denoised plate, feed in image you want to apply an analysed and accurately simulated grain/noise to and then be done with it.


That is indeed the holy grail.

The chance of ADSK implementing this is realistically pretty slim, given that solutions already exist, albeit at a bit of a cost. The more realistic path may be BorisFX offering a Continuum unit for DasGrain, which would be the Si implementation as a solo/headless node. That’s in the realm of possibilities if there’s enough interest.

See this thread for details (at the bottom where Marco weighed in on it): You've Been Inform'd Ep10: A Plugin, Within an App, Within an App - DasGrain

I was taught on this very forum that when subtract and add introduce artifacts, you should change the blends to divide and multiply. It has got me out of grain scrapes many times.
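A rough NumPy illustration of why the divide/multiply blends behave differently (the numbers and the multiplicative grain model are assumptions for the sketch, not anything pulled from Flame):

```python
import numpy as np

rng = np.random.default_rng(2)
eps = 1e-6                                 # guard the blacks against /0

denoised = np.full((8, 8), 0.25)
# Model the plate's grain as multiplicative for this demo.
plate = denoised * (1.0 + rng.normal(0.0, 0.05, denoised.shape))

# Additive route: the grain amplitude is frozen at the plate's level.
grain_add = plate - denoised

# Divide/multiply route: the grain becomes a ratio that scales with
# whatever you multiply it back onto.
grain_ratio = plate / np.maximum(denoised, eps)

comp = np.full((8, 8), 0.75)               # the comp is 3x brighter here
regrain_add = comp + grain_add
regrain_mul = comp * grain_ratio
```

The additive grain keeps the original plate’s amplitude no matter what you comp over it, while the ratio version scales with the new values, which is plausibly why switching the blends can dig you out when add/subtract artifacts show up in regions whose brightness you’ve changed.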