So you think AI isn't going to take your job?

Regardless of AI, in the ever more rapidly changing industry and software landscape, being able to ride more than one donkey is good for everyone.

I kind of sit at the other end of that scale to a fault at times, and I think about it a lot. But I’ve had much less buffeting or sleepless nights over recent industry shifts, strikes, and whatnot.

1 Like

special

1 Like

hahaha. So true. Well if this all kicks off, at least I know I’ll be free to take the “how to build a house” camp that they offer here in VT and I can finally get good at a trade. All those folks getting rich off AI will need mansions built!

3 Likes

I feel like I’ve seen Yojimbo and A Fistful of Dollars enough times to know the best gig in times such as these is undertaker.

3 Likes

So the next frontier is becoming clear:

Marc Andreessen and others are recognizing that they’re running out of quality training data. The world isn’t big enough anymore for their AI ambitions.

The fix? They want to employ all the folks that lost their jobs to AI to create new human-generated training data???

That’s the equivalent of the old tech outsourcing boom to India in the 90s, where you lost your job but had to train your offshore replacement.

I mean, can you be any more tone-deaf and inhuman?

But fear not: if AI takes over Flame, you will still have a job producing content for the Flame learning channel, which will be used to train your replacement operating Flame.

2 Likes

Anecdotally, this has already been going on for a while. I first heard 3 or 4 years ago of Meta, Apple, et al. hiring 3D artists to model, render, and tag content to train ML algorithms.

Yes, all those motorcycles, bridges, and traffic lights you tagged as part of CAPTCHA challenges were doing double duty as ML training data.

But it’s one thing to hire some people for incremental use cases. It’s another to make everyone redundant and then make them train their replacements.

1 Like

That’s like…one of the only ways corporate America can climax. I fully expect it to happen.

That’s fascinating because it seems to suffer from a problem of scale. If all the media created by all the people in the last fifty years is not enough, how much would these hires need to create?

3 Likes

The other problem is that the gain in quality of the more recent LLMs has slowed. They’re throwing ever more data at it, hoping to squeeze more accuracy out of it, but the returns are diminishing rapidly.

So more data means they keep doing the same thing while hoping for a different outcome.
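The diminishing-returns point can be sketched with a toy power-law scaling curve. To be clear, the constants below are invented for illustration, not measured from any real model; the shape of the curve is the point:

```python
# Toy sketch of diminishing returns: LLM scaling behavior is often modeled
# as a power law, loss ~ a * data**(-b) + c. The constants a, b, c here
# are made up purely for illustration.
def toy_loss(data_tokens, a=1.0, b=0.1, c=0.5):
    return a * data_tokens ** (-b) + c

prev = toy_loss(1e9)
for doublings in range(1, 6):
    cur = toy_loss(1e9 * 2**doublings)
    # each doubling of data buys a smaller drop in loss than the last
    print(f"x{2**doublings:>2} data -> loss {cur:.4f} (improvement {prev - cur:.4f})")
    prev = cur
```

With a small exponent, each doubling of training data buys a visibly smaller improvement than the one before, which is the "throwing ever more data at it" problem in miniature.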

3 Likes

So something has ingested nearly all the data in the world and is still kind of dumb. Sounds like it’s just kind of dumb. Could be any number of reasons. I will say, a lot of people over the years have pointed out that the bottom-up model of architecture (what most machine learning is built around) may well seem the most obvious model of the human brain, but in actuality a top-down model may indeed be the counterintuitive correct direction. Who knows? I do love this anecdote though:

“Tell me,” Wittgenstein asked a friend, “why do people always say it was natural for man to assume that the sun went round the earth rather than that the earth was rotating?” His friend replied, “Well, obviously because it just looks as though the sun is going round the earth.” Wittgenstein replied, “Well, what would it have looked like if it had looked as though the earth was rotating?”

3 Likes

This reminds me of the Isaac Asimov short story, The Last Question.

https://users.ece.cmu.edu/~gamvrosi/thelastq.html

1 Like

A very good observation.

Last year we were discussing a family member well up in age and the problems said person is creating for everyone. An old Italian woman we know had the perfect perspective: “God is saying, I’m not quite ready to have her back; she needs to stay there a bit longer and reflect.”

Why does that come to mind? For me it’s the entitlement and hubris with which the current AI crowd is chasing not the advancement of humanity, but pure profits. If nature’s brain is still so superior to even a $3B piece of hardware that needs a dedicated power plant to run, and that has had to consume all available data yet delivers a fraction of what we can accomplish on the power of a steak and some potatoes, you’ve lost the plot somewhere.

I think understanding the human brain and being able to reproduce its function through technology is a worthwhile pursuit, as it can indeed have positive use cases that advance humankind: finding medical cures, answering difficult questions based on oversized data sets, etc.

But instead, the way I think about it, it’s the next version of the industrial revolution. A few hundred years ago most of the workforce was in agriculture. Then came machines that could harvest food faster, and instead of working the fields people worked in factories to make the machines.

Then came the knowledge-worker phase. The machines became better, and some even became robots that could do the work with fewer hands. But we needed people to program these machines; businesses got bigger and global supply chains popped up. Instead of working in factories, people got degrees and sat at computers, writing emails and making PowerPoints.

Now with AI we seem to be on the cusp of the next step function. We no longer need to write emails or reports; the PowerPoints get generated (poorly still, but improving). So all the people that used to write emails (or did beauty cleanup on videos) are no longer needed. Some of them will move up to make training materials for the AI that has taken their place. But where the rest of them go is unclear so far.

The problem to solve is that in order for all these companies to make money, they need consumers (B2C and B2B, which just powers other B2C) to serve, which therefore need to have earned some money. Henry Ford grokked that. But if there is no work of value for people to do, how will they earn money that they can spend on what all these AI powered companies produce?

For the AI companies to be successful, they not only need to improve their LLMs and make a good portion of jobs redundant, they also need to answer the question of what these people are supposed to do instead to earn a living. Because without that, their house of cards will collapse. To pay for AI infrastructure, companies need to have paying customers, which need to be gainfully employed (if you discount the rich people with their passive incomes for a second).

But alas, the Sam Altmans of the world can’t be bothered with that. So here we are.

2 Likes

Ha, love it. As a big Asimov fan I actually had not read that before. Thx for sharing @Sinan

1 Like

Only tangentially related to the original topic but still a fascinating read:

2 Likes

AI is Maxwell’s demon. People just aren’t willing to accept it.

What’s worse, they’ll burn the world down trying to disprove it.

4 Likes

There’s also the idea here that if I set a course due north because my intended destination is due north and a million miles away, straying even a fraction of a degree to the northeast matters more and more over distance. After 50 miles I’m still very much on track. After 1,000 miles I’m certainly much closer to my destination. But after 1,000,000 miles I am not anywhere close to where I want to be and will need to begin traveling westward; and the westward journey could be blocked by mountain ranges and oceans, so I may very well need to head all the way back to where I started and try for north again. Either way, I may never reach my destination.
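The navigation point above is simple arithmetic: for a constant heading error, cross-track drift grows linearly with distance traveled. The half-degree error below is an assumed number, just to make the point concrete:

```python
import math

def cross_track_miles(distance_miles, heading_error_deg=0.5):
    # off-course distance ~ d * sin(error) for a constant heading error
    # (flat-plane approximation; the 0.5 degree figure is illustrative)
    return distance_miles * math.sin(math.radians(heading_error_deg))

for d in (50, 1_000, 1_000_000):
    print(f"after {d:>9,} miles: ~{cross_track_miles(d):,.1f} miles off course")
```

At 50 miles the drift is a fraction of a mile; at a million miles the same half degree puts you thousands of miles off course.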

2 Likes

FXguide’s Mike Seymour was commissioned by NVIDIA to unpack the impact of generative AI on the media and entertainment industries, offering practical applications, ethical considerations, and a roadmap for the future.

Check out the article at the link above.
Just download the PDF of the document.

2 Likes

Thanks for finding and posting. It’s a good read, and a more detailed assessment of where the different capabilities are and near term direction.

In some ways, things are further along than one can see from the outside - we’ve all seen the PR stunts and the inevitable flaws. But what’s interesting is how many of these tools actually do have control points much closer to what we have in VFX pipelines, and how many tools can replace individual steps, rather than one-button results. There’s a lot of focus on speeding up specific steps and building new pipelines, rather than vending machine content production.

It doesn’t answer the question on what that means for the industry and the people who make a living in it long-term, but it does show that artists will still be part of the process, just differently. And way fewer of them.

And those who are left will need, or have access to, deep pockets. We know the history from just over a decade ago, when super-expensive hardware was needed to run Flame. We currently live in a time where you can be a freelancer with hardware and software that fits on most people’s credit card. We may be swinging back into a period where access to the tools will be a lot more costly, but not necessarily optional. That will remove some of the freedom and flexibility we have gotten accustomed to.

These new tools still need to pass the full-scale economic test. While achievable during this high-investment period, if the stable state goes back to expensive and somewhat gated tools, this will inevitably translate into the production cost of content. We will have to see whether the economics of content, which have favored quantity and the insatiable appetite of platforms and consumers, can adjust to that, or whether the math fails to work in the end - at least for the time being.

Russia has become quite adept at “AI”-generated newscasts.