@randy I think this is the video we were just looking for on the Patreon Zoom:
Alfie Vaughn showcase
and follow up https://youtu.be/iBj_k7KcjD8?si=-j0tmQgZe-UpnULX
About the 3rd party lens kernels for realistic bokeh
I like that video because it demystifies what often feels like a blanket “nuke does it better” statement around defocus and PG Bokeh.
We almost have that in flame…
Another DoF plugin is coming to NUKE soon.
https://docs.magicdefocus.com/latest/
Looks amazing, really jealous of NUKE's DoF reproduction methods.
PG Bokeh is the GOAT, it's slow af but once you deal with deep data and stuff… it's just so much better
However, I am always a fan of just rendering with DoF … if you can't afford deep
This is the Dope
oh damn.
I need this in houdini damn
Love that… giving me Lentil vibes
When you say “almost”, do you mean because physical defocus almost works but doesn’t actually work? Or is there something new that works and is almost ready for release?
“Almost” because we have the same tools but there may be specific use cases where Nuke does indeed do it better.
All of those nodes, barring PG Bokeh on some level, are in flame. Blur does both regular and lens blurs, so that's covered; there's a convolve filter in the Sapphire set and in the default Matchbox set; and 3d Blur behaves in a similar way to both PG Bokeh and the Nuke DoF node, with a kernel input and a depth-slicing system. Heck, the flame DoF node has a built-in edge extend/pixel spread (but no kernel input).
I probably said “almost” because we don’t have PG Bokeh specifically, but it was nice to see that it too eats shit on “uncertain edges” in a depth map. Haha.
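For anyone curious, here's a rough sketch of the depth-slicing plus kernel-convolution idea those nodes share. It's purely illustrative (NumPy/SciPy, not any app's actual implementation); the function names, the slicing math, and the circle-of-confusion formula are all assumptions made up for the example.

```python
# Illustrative sketch of depth-sliced kernel convolution for defocus/bokeh.
# Not how 3d Blur, PG Bokeh, or Nuke's defocus nodes are actually implemented.
import numpy as np
from scipy.signal import fftconvolve


def resize_kernel(kernel, size):
    """Nearest-neighbour rescale of the aperture kernel to (size, size); crude but enough here."""
    ys = np.linspace(0, kernel.shape[0] - 1, size).astype(int)
    xs = np.linspace(0, kernel.shape[1] - 1, size).astype(int)
    return kernel[np.ix_(ys, xs)].astype(np.float64)


def defocus_depth_sliced(rgb, depth, kernel, focal_depth, num_slices=8, max_radius=25):
    """Blur each depth slice with a scaled copy of a lens kernel, then merge.

    rgb:    HxWx3 float image
    depth:  HxW float depth map, assumed normalised to roughly 0..1
    kernel: small 2D float array describing the aperture/bokeh shape
    """
    out = np.zeros(rgb.shape, dtype=np.float64)
    weight = np.zeros(depth.shape, dtype=np.float64)

    # Partition the depth range into slices; each pixel lands in exactly one slice.
    edges = np.linspace(depth.min(), depth.max(), num_slices + 1)
    slice_index = np.clip(np.digitize(depth, edges) - 1, 0, num_slices - 1)

    for s in range(num_slices):
        mask = (slice_index == s).astype(np.float64)
        if mask.sum() == 0:
            continue

        # Circle of confusion: blur radius grows with distance from the focal plane.
        slice_depth = 0.5 * (edges[s] + edges[s + 1])
        radius = min(max_radius, abs(slice_depth - focal_depth) * max_radius)
        k = resize_kernel(kernel, max(1, int(2 * radius + 1)))
        k /= k.sum()

        # Convolve the premultiplied slice and its matte with the scaled kernel.
        for c in range(3):
            out[..., c] += fftconvolve(rgb[..., c] * mask, k, mode="same")
        weight += fftconvolve(mask, k, mode="same")

    # Un-premultiply to merge the slices back together.
    return out / np.maximum(weight, 1e-6)[..., None]
```

The un-premultiply at the end is also roughly where the "uncertain edges" problem shows up: a pixel whose depth value straddles two slices gets blurred at the wrong radius, which is why every depth-map-driven defocus struggles at object edges.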