- Hollywood Tech Nerds
George Lucas Was Right!
PLUS: Reviewing The Wall Street Journal’s AI Movie
Hola Hollywood tech nerds!
In this week’s post:
- George Lucas Was Right!
- Reviewing The Wall Street Journal’s AI Movie
- Kernels (3 links worth making popcorn for)
Subscribe to get Hollywood Tech Nerds magically delivered to your inbox every Tuesday!
George Lucas Was Right!

Back in April, there was a 20th-anniversary theatrical rerelease of Star Wars: Revenge of the Sith, the final part of George Lucas’s Star Wars prequel trilogy. I don’t need to rehash the history of the prequel trilogy, which has many detractors but an increasing number of revisionists who love the films.
What is uncontroversial about the prequels is that (in part perhaps due to the production difficulties of the original Star Wars) George Lucas was incredibly prescient about virtual environments becoming increasingly important to the production process, particularly of big IP blockbusters like Star Wars.
A few weeks ago we looked at how disparate shows like HBO’s The Rehearsal and Disney+’s Andor (itself a Star Wars show) both used virtual production environments to enhance their storytelling. Variety explores how these production tools may also help reduce overall production costs for many different types of shows and films:
[Virtual production] has quickly become a fiscal lifeline for producers fleeing the rising costs of traditional shoots. Put simply, it reduces the unpredictable: weather delays, location logistics, travel time and schedule slips. In this sense, it can be cheaper — and frequently smarter.
And as tariffs and strains on global supply chains make importing set materials more expensive and less reliable, creating worlds virtually offers a way to sidestep mounting construction costs and delays.
Another exciting angle is being able to use these effects not just in film and television storytelling, but also in live broadcasts and in-person events. VFX Voice describes how these tools are finding use cases in sports, concerts, stage plays and even museums:
“For the past two, two and a half years, you are starting to see the crossroads of everything coming together,” states Jess Marley, Virtual Art Department Supervisor at Halon Entertainment. “That being visual effects heading towards games, AR, VR and broadcast because of USD and all of the programs finding good ways to work together. [The virtual news desk for] ESPN is a good intersection to that because we’re using Jack Morton’s Cinema 4D files to introduce content into Unreal Engine and having to bridge the gap between all of those different departments working together, which is what virtual production does across the board.”
The configuration resembles a LEGO set. “You are putting the pieces together in Unreal Engine, but you are getting pieces from different kits and packs from the client,” remarks Andrew Ritter, Virtual Art Department Producer at Halon Entertainment. “You have to make sure that they plug into the technology and creative vision sides and go directly into the audience’s eyeballs at the end of it.”
Check it out in the SportsCenter clip below:
We’ve come a long way from awkwardly interacting in a completely green room!
Reviewing The Wall Street Journal’s AI Movie

We talked a bit about Google’s Veo 3 last week and so I thought I’d take a looksee at an AI-generated short film from the creative minds at *checks notes* The Wall Street Journal…?
You can watch the short film as well as its making-of below:
First things first: I would like to retire the word “wild” from all online discussion of tech developments. Everything can’t be described as “wild!” Find a new word. I suggest cromulent. “We Tested Google Veo and Runway to Create This AI Film. It Was Cromulent.” Isn’t that better?
Having watched this, I don’t think the non-AI filmmakers out there have to worry about their films getting displaced at the local festivals. I assumed the script was written by AI and was horrified to learn that human beings wrote it. Maybe keep that one to yourselves!
I did like the breakdown on how they maintained character and location consistency across the movie by using a combination of Runway and the two Google Veo models. Pretty interesting!
Of particular interest to me was this statement at the end of the video:
We estimate the total cost would have been around $1000 for Google’s and Runway’s tools. We paid for some of it, and the companies gave us special access to the rest.
So what exactly was the “special access”? Was it access to tools that anyone could utilize? Or were these tools unavailable to the average person, serving instead as another way for these companies to plug their products through earned media hits like this?
It’s this deliberate vagueness that fuels my ongoing skepticism about these generative AI tools. How do we know these costs reflect what these tools will actually cost when they’re used at scale? How do we know it’s not going to be another Uber-style bait and switch, where producing on Veo turns out to be no cheaper than just producing it in real life?
As for the movie… eh. The more I see of this stuff, the less worried I get.
Kernels (3 links worth making popcorn for)

Here’s a round-up of cool and interesting links about Hollywood and technology:
Why MUBI’s head of international distribution is gung-ho about theatrical. (link)
Elden Ring solidifies it: video games are the future of movies. (link)
Ed Zitron on tech’s desperate AI promises. (link)