Holy Shit! Adobe MAX Sneaks is CRAZY

1 Like

Well sure, but how are their animation channel views?

3 Likes

Wow, so impressive. These are relatively simple shots with close to no occlusion, but it’s still amazing how the lighting and motion are matched on the tie and the latte.

The camera reconstruction is even more impressive. I wouldn’t want to be a moco operator in the near future… maybe not even a compositor, lol.

1 Like

I really, really hope that Flame will soon introduce some creative imaging innovations, since creative imaging is what Flame was built on.

There have been some welcome interface and tech improvements lately, but that’s not what is going to drive users to a creative tool.

I don’t want to get crazy here; there could be a $100K GPU farm behind the curtain of this demo, or maybe they’ve spent days training a model to replace that guy’s tie, but there is a seismic shift coming in the way we work. Hopefully ADSK will be able to leverage some of this tech, but frankly Adobe probably has way more resources than The Foundry and Autodesk combined. The future seems to be lots of decentralized tools rather than a one-stop-shop application…

Agreed.
This has been mentioned before, but without a creative designer behind Flame, it’s just iterative improvements and not a lot of image creativity.

1 Like

I have now learned that Steve McNeill has retired from ADSK/Flame.

Is there anyone creatively driving the development of Flame at this point? I would be very glad to hear of something.

And who from the ADSK dev team do I tag at this point? We used to have relationships with many devs and it showed in the product. Now I can name maybe two?

@fredwarren
@Slabrie

Damn that’s cool.

Out of curiosity, who are you referring to?

Hey Fred. Philippe and Francis, just to name a couple.

I’m sorry if I come across as disrespectful in any way.

I’m asking the same question, or at least something along the lines of the questions that quite a few of us have been asking in recent years.

Very cool… but I’ll wait until it becomes available for us to play with. I’ve seen quite a few Adobe demos over the years that looked amazing but ended up… nowhere? (deblur?)

Regardless, this will be a thing in the near future. Cool stuff. I’ve been using Photoshop’s generative tool to create cleanplates in a jiffy…

2 Likes

Agreed. Adobe seems laser-focused primarily on the consumer/prosumer market: lots of stuff under the hood, but they don’t want to expose too much to the user base, whereas Autodesk and Foundry seem to sit squarely in the high-end pro VFX market…

Well, Photoshop’s generative fill is pretty nifty… I’ve been using it for matte painting bits and pieces and it’s impressive.

Project Rez-Up looks pretty cool.

Damn their shareholders are gonna love this!

1 Like

I only managed to watch the generative fill stuff. To me it just looked like a still being generated by AI and tracked in using motion vectors all in one button press. Looks slick. But I can’t ever imagine any advert containing such easy shots.
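To illustrate what I mean, here’s a rough sketch of the “generate a still, then track it in with motion vectors” idea, using OpenCV dense optical flow as a stand-in. This is only my guess at the general mechanics, not Adobe’s actual pipeline, and every name in it is made up:

```python
import cv2
import numpy as np

def warp_by_flow(still, flow):
    """Warp the generated still by a dense flow field (H x W x 2, in pixels)."""
    h, w = flow.shape[:2]
    grid_x, grid_y = np.meshgrid(np.arange(w), np.arange(h))
    map_x = (grid_x + flow[..., 0]).astype(np.float32)
    map_y = (grid_y + flow[..., 1]).astype(np.float32)
    return cv2.remap(still, map_x, map_y, interpolation=cv2.INTER_LINEAR)

def propagate(still, frames):
    """Re-track a still (aligned to frames[0]) onto every plate frame."""
    ref_gray = cv2.cvtColor(frames[0], cv2.COLOR_BGR2GRAY)
    for frame in frames:
        cur_gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        # Dense flow from the current frame back to the reference frame,
        # so each output pixel knows where to sample the still.
        flow = cv2.calcOpticalFlowFarneback(
            cur_gray, ref_gray, None, 0.5, 3, 21, 3, 5, 1.2, 0)
        yield warp_by_flow(still, flow)
```

The one button press presumably hides all the flow quality problems: any real occlusion, motion blur or lighting change and this kind of warp falls apart fast.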

1 Like

It wouldn’t be the first time that demos like that are cherry-picked (no ding on that at all; it’s supposed to be an aspirational demo, after all).

While these tools are getting better all the time, the general problem is that they work quite well under ideal conditions, but falter quickly with unforeseen circumstances. Once people start using them, expectations in terms of budget and schedule change because it’s supposed to be ‘easy’. Until it isn’t.

As a consumer, when the tool fails, you can bail and do something else. As a pro, you’ve still got to deliver the shot. We’ve all been on shots where our go-to method suddenly didn’t work and we had to rummage in the proverbial toolbox for a plan B.

I think these are fascinating and inspiring new options. And I’m delighted to see them. But I also keep a ginormous grain of salt at hand. And don’t let any producers watch these videos. You will only suffer if they do.

PS: I’ve been digging a bit into ML tech on the code side to understand it better. Not that this is for everyone. But it does give you a better sense of what might work and what might not, or where it will hit the guardrails hard.
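Something like this is the sort of poking around I mean; a minimal sketch assuming the Hugging Face diffusers library and a publicly available inpainting model (the model ID and file names here are just examples, not any studio tool):

```python
import torch
from diffusers import StableDiffusionInpaintPipeline
from PIL import Image

# Load a pretrained diffusion inpainting pipeline (example model ID).
pipe = StableDiffusionInpaintPipeline.from_pretrained(
    "stabilityai/stable-diffusion-2-inpainting",
    torch_dtype=torch.float16,
).to("cuda")

# Plate and mask are placeholders; white mask pixels mark the area to fill.
plate = Image.open("plate.png").convert("RGB").resize((512, 512))
mask = Image.open("mask.png").convert("RGB").resize((512, 512))

# Iterate on prompt and seed; you quickly learn which plates it handles
# gracefully and where it hits the guardrails hard.
result = pipe(prompt="plain office wall, soft daylight",
              image=plate, mask_image=mask).images[0]
result.save("cleanplate_candidate.png")
```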

3 Likes

I’ve come to think of this as “the myth of 80%” because sales people or evangelists or just folks who are excited about the new toy always say the same thing: “sure, it’s only 80% there, but look how quickly it did it!”

And sure, that’s usually impressive, but unless there are manual controls or some way to get it from 80 to 100, it’s probably not that interesting from a production standpoint. And at this point, after many, many disappointments, if you say 80% to me in the context of a tech pitch I will assume you don’t know what you’re talking about.

Interestingly, the Foundry, DNeg, and the University of Bath recently concluded their attempt to build an ML roto tool with a bit of a shrug:

“The principal learning is that rotoscoping is very hard and that people are going to be involved, certainly for the foreseeable future. There’s a lot of considerations in the rotoscoping process that need to be taken into account before starting out on a project like SmartROTO.”

10 Likes

I think about this all the time with the various “look out, AI’s going to change the world. It’s not quite there yet, but it will be in x years” commentary. I mean, given the murky understanding developers seem to have of how these models actually work, who’s to say the curve of progress on that front won’t slow, or that it will never manage to get out of the uncanny valley that artists have been battling for generations? I’ll tell you who definitely won’t say that, or even entertain the idea: the folks getting massive investments in their companies right now. It’s really hard to tell the hype from the reality at any given moment.

7 Likes