Swear to Christ I'm Going to Burn Down a House if I Hear "Sand Screens" Any More

Have we discussed how shitty VFX PR is?

How the big talk about Dune right now is that they used sand-colored screens in lieu of blue or green, and now the fucking discourse is that THIS is why the effects looked good?

Way to shit on the eighty thousand people who had to roto that shit, who had to comp it. “Oh yeah, our team had a clever trick.”

And now all of us will get to hear directors and agencies suggest random colors of screens.


To be honest, if every damned major motion picture is going to roto every single plate for stereo once the offline is done, I don’t see why anyone in the motion picture industry should be doing any keying AT ALL. As for agency people who don’t know how to shoot: if everyone started charging by the hour again, they’d call us before shooting like pigs.


We’ve done it in the past like this for instance:

It works well, but yeah, it does need roto and also the right camera angles. And a bit of difference matting. But hey, just give your friendly vendor a shout and add 20%. No?
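For anyone who hasn’t touched it: difference matting at its crudest is just comparing the plate against a clean plate and thresholding whatever changed. A toy NumPy sketch with made-up pixel values (real plates need grain handling, soft edges, and a lot more care):

```python
import numpy as np

# Toy "plates": 2x2 RGB images, float 0-1. Values are invented for illustration.
clean_plate = np.full((2, 2, 3), [0.76, 0.70, 0.50])  # empty set, sandy tones
plate = clean_plate.copy()
plate[0, 0] = [0.20, 0.30, 0.60]  # one "foreground" pixel differs from the clean plate

# Per-pixel difference, max across channels, thresholded into a hard matte
diff = np.abs(plate - clean_plate).max(axis=-1)
matte = (diff > 0.1).astype(np.float32)

print(matte)  # foreground pixel is 1.0, everything else 0.0
```

In practice the threshold is where this falls apart, which is why it’s only ever “a bit of” difference matting on top of keys and roto.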

I laughed my ass off when I read that. “When you invert the sand color, it turns blue!” You know what else turns blue when you invert it? Human skin, you assholes. Maybe it helped them with the hair? Oy…
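The joke checks out arithmetically: invert any warm, red/yellow-dominant color and you get something blue-dominant, skin included. A quick sketch with rough sample values I picked for illustration:

```python
# RGB inversion: each channel becomes 255 minus its value.
def invert(rgb):
    return tuple(255 - c for c in rgb)

sand = (194, 178, 128)  # a rough desaturated tan
skin = (224, 172, 105)  # a rough light skin tone

print(invert(sand))  # (61, 77, 127) -> blue is the dominant channel
print(invert(skin))  # (31, 83, 150) -> also blue-dominant
```

Which is exactly the problem: inversion makes the talent go blue right along with the screen.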


To be clear: I have no issue with nonstandard screen colors in any situation where the choice is made in a sensible way.

Dune was smart to do what they did and the results speak for themselves. My issue is the PR push around it, like it’s some brilliant thing that saves time or is inherently better.

I had a small novel composed for the comments section of an otherwise quite good YouTube video about the VFX of Dune, because the dude implied Black Widow looked shitty because of green spill and fringing, and the sand screens fixed that, ergo Dune’s effects are great.

Dune’s VFX were lit well, Widow’s less so. It’s a common problem when you get into a giant green volume and aren’t completely clear about what is going to replace the green. Coincidentally, this is also why those LED volumes are useful: someone had to decide what the BG looked like before the camera rolled, so the DP has something to go off of when lighting.


Counterpoint: with a decent greenscreen and nice lighting, I can key dozens of shots in a day, but with roto I often find I have to paint in all the fill for any semi-transparent areas.

Now, GETTING a decent greenscreen and nice lighting is another matter…


From this reddit thread:
“They also mentioned that inverting it to blue only helped in the sense that their keying tools are built for green and blue hues, not to magically turn it into a blue screen.”

Oh I agree. But modern blockbusters aren’t built that way. As soon as the first cut is done, all the sources are sent to roto regardless of whether effects are present in the shot or not. Nobody shoots two eyes; the second eye is extrapolated by a stereo team. If all that outsourced, cheap, high-precision roto is being done anyway, why pay for Western prep, key and despill at Western rates for no really good reason except to jack up the local studio subsidy income?

I mean, I’d argue that’s a fine reason to do something. Haha. As much as I dislike doing roto, it makes me happy to think that because I’m tied up cutting mattes, some other Flame op is getting the work I’d be doing otherwise. A rising tide and boats and all that.

Meh. I’ve been stuck doing six weeks of roto on a shot that was already being roto’d. It was a bit ridiculous. There’s no really good reason to do a shot twice. But since we are paid by the state to do the job twice over… insert cynical glance here.

Good lighting, correct depth of field and the right camera angles. It’s not much to ask.


I once held hope when I heard Lytro were making a light-field camera. When it came out the size of a tank, it was a real shame, because that tech could theoretically have produced some awesome matting abilities: you’d be able to auto-generate mattes and depth maps from it.

I also thought at one stage ARRI were experimenting with something that was going to produce depth maps as an additional channel on capture, which would have been awesome, as you could then do depth-based keying too.
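The appeal is that a depth channel lets you matte by distance instead of by color, so spill and wardrobe stop mattering. A toy sketch of what depth-based keying could look like, assuming you had a per-pixel depth AOV in meters (all values invented):

```python
import numpy as np

# Hypothetical per-pixel depth in meters, delivered as an extra channel.
depth = np.array([[1.8, 2.0, 9.5],
                  [1.9, 8.7, 10.0]])

# Keep everything nearer than 3 m, with a 0.5 m soft falloff instead of a hard cut.
near, soft = 3.0, 0.5
matte = np.clip((near + soft - depth) / soft, 0.0, 1.0)

print(matte)  # foreground (sub-3 m) pixels at 1.0, distant pixels at 0.0
```

Edge softness then comes from real geometry rather than from guessing at a chroma threshold, which is exactly why losing that tech stings.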

Alas… keying or roto is what we seem to be stuck with for a while longer. I’m hoping AI/ML keying improves considerably, and quickly, but I still think that’s a few years away, bar crude roto (which is still good enough for a lot of tasks).

Same. The only project I’m aware of got bought up by Apple in like 2016. I’ve been harping on for a while that an ML keyer would be way more useful than even ML roto, and it would presumably be an easier problem to solve.


I also think that ML could eventually solve spill issues and even fix motion-blur keying issues.