So… Does Gen-AI Work or Not?

The entertainment press doesn't want to tell you!

Happy December, Hollywood tech nerds!

Subscribe to get Hollywood Tech Nerds magically delivered to your inbox every Tuesday!

So… Does Gen-AI Work or Not?

Back in October I wrote a post about the tendency of the industry rags to always mindlessly accept the framing of the tech companies with regard to generative AI. I specifically used the example of “Tilly Norwood,” the PR stunt that Variety, The Hollywood Reporter, Deadline, and others insist on treating as a real creation based solely on the word of its creator.

This week I want to focus on a corollary tendency in the entertainment press: significantly underplaying a theme that is consistent across multiple stories about entertainment companies engaging with generative AI, namely that it never seems to work as well as promised!

This was inspired by a recent article in The Wrap: How Disney’s AI Ambitions Hit a Wall. The article’s subhed gives an immediate indication of this strange framing: “Acting and below the line talent who oppose AI, combined with technical limitations, has stalled Disney’s efforts” [emphasis mine]. Curious that it is these things “combined” which have “stalled Disney’s efforts.” Let’s see what they’re talking about!

For Disney, the problem was twofold. From a cultural perspective, many in the acting and animation communities within the company oppose AI and have been vocal about the issue (“Black Widow” star Scarlett Johansson and director James Cameron are among many who have warned about its dangers), requiring the OTE [Office of Technology Enablement] to tip-toe around their concerns, according to people familiar with the company’s AI work. With leadership not looking to ruffle feathers with talent, there’s been no champion to get buy-in on AI from the disparate parts of the Disney corporate empire.

Then there were the technical hurdles. For all the promises that generative AI offered, the team realized the technology wasn’t quite ready for prime time. After a year of work, Disney found the AI systems didn’t meet its standards to replace the pricier aspects of production, like visual effects, leaving CEO Bob Iger frustrated with the lack of progress, according to the insiders.

I would say that “For all the promises that generative AI offered, the team realized the technology wasn’t quite ready for prime time… Disney found the AI systems didn’t meet its standards to replace the pricier aspects of production” would be the central issue here, most assuredly more significant than people being “vocal.”

Let’s imagine this circumstance outside the realm of magic spells and wizardry which the generative AI peddlers work hard to evoke, and instead picture your friendly author Steve needing to drive to the store. Now, Steve is feeling lazy and doesn’t particularly want to go to the store, but also his car doesn’t work because the timing chain broke. Which would be considered the actually significant factor in Steve not driving to the store?

This dynamic plays out again and again in The Wrap’s piece, which treats cultural opposition to generative AI as if it were as powerful a factor as the tech simply not working as well as promised:

…the OTE had to navigate the core creative class of Disney, which was vocally opposed to the technology. It proved to be too much of a challenge...

“It became more about what they won’t do vs. what they could do,” Wang said.

…Cultural hurdles aside, Disney was beset by the classic problem of generative AI: an overly optimistic perception of what it could achieve vs. the reality that the technology is still limited in many ways.

“There’s a big expectation gap with AI in general,” Ross said.

Indeed, a September report issued by software tools maker Atlassian found that 96% of CEOs have failed to see any meaningful return on investment from the technology.

…“After all these millions of dollars spent, they’re finding out you can’t deliver what we’re used to delivering over the past two to three decades,” the producer said. “It just doesn’t meet the standards.”

If “you can’t deliver what we’re used to delivering” is the classic problem with using generative AI to make the very product the entertainment business sells, why are the cultural considerations even remotely relevant? The creative class’s opposition to generative AI would be an interesting clash in a universe where the technology actually functions and actively reduces costs, but in the real world, where the tech isn’t as good as sold, the cultural protests against it are - it seems to me - beside the point.

This shouldn’t be a surprise to The Wrap, which extensively covered this exact same dynamic in their reporting on the troubles Lionsgate was having with its own generative AI deal with Runway.

An article with a similar stance is Variety’s Hollywood Script Readers Fear They Could be Replaced by AI. They Set Up a Test to See Who Gives Better Feedback. Only 25 paragraphs into the piece do we get to this point:

The more complicated the script, the more likely AI was to get things wrong — to misattribute the action of one character to another and to hallucinate plot points.

The humans won hands down when it came to notes, which require actual analysis rather than just distillation. The AI programs were “an almost total fail across the board,” Hallock says.

That sounds… like it doesn’t actually work? Like, why isn’t this the lede of the piece instead of being buried deep within it? Why is the information that these products don’t work as well as promised treated almost as an afterthought? Would anything else get this kind of gentle handling?

Why do we think CEOs like Bob Iger are “frustrated with the lack of progress”? Could it be because tech companies are bound and determined to worm their products deep into the ecosystem of the movie studios by making fantastical promises and wishcasting while getting the entertainment press to play along? Who can say!

For closing thoughts I’ll quote from Nikhil Suresh via Ed Zitron:

A huge amount of the economy is driven by people who are, simply put, highly suggestible. That is to say that it is very, very easy to get them excited and willing to spend money.

Here’s a round-up of cool and interesting links about Hollywood and technology:

The trickiest effects in Predator: Badlands. (link)

Fake journalists are fooling publications with AI-generated fake stories. (link)

Wired’s top 7 OLED TV recommendations. (link)