Any Details??? Tubi AI Edition

PLUS: The Future of Film

Hola Hollywood tech nerds!

In this week’s post:

- Any Details??? Tubi AI Edition
- The Future of Film

Subscribe to get Hollywood Tech Nerds magically delivered to your inbox every Tuesday!

Any Details??? Tubi AI Edition

It’s time for another installment in my long-running series “Any Details???” Longtime readers are familiar with this series, which encapsulates my concerns regarding AI reporting in the industry trades. From Tilly Norwood to fake podcasts to bungled Amazon recaps, the trades consistently write up all claims made about AI integration in entertainment with absolute credulity, rarely asking any follow-up questions about how these things work, or even if they work at all.

It’s not just the trades, though! Even The Wall Street Journal is not immune, as evidenced by their recent article “Why Tubi Is Betting on AI to Win Over Gen Z Viewers”. Why indeed! Let’s take a look:

More recently, Tubi combined large language models, or LLMs, with deep learning techniques—enabling it to better understand “the nature of the user’s intent,” [Tubi’s chief product and technology officer Mike] Bidgoli says. For instance, an LLM can predict a user’s viewing patterns the way it predicts text from a few letters or words. The company uses a variety of commercial LLMs and its own custom model.

That means Tubi can create a level of “hyper-personalization”—suggesting more personalized shows and movies, as well as ads, based on data like a user’s location and time of viewing, as well as watch history and even what’s in the content.

The company says it automatically plays AI-selected scenes from movies, individualized to viewers as they scroll, and it can create a trail of digital breadcrumbs for them to follow as they delve deeper into subgenres.

OK “The company says” it does that, but does it do that? Did you go and see how that worked or did you just print what they said it did? Unasked here is how exactly it selects the scenes from the movies. Just because I watched a horror movie doesn’t mean I want horror movie scenes to play as I scroll looking for something to watch, and this is assuming AI-generated selections actually work, which they often don’t.

“We understand the script, we understand the plot, the subgenre and all of these other things,” Bidgoli said. “We feed that back in the model, and now we can find someone something that’s hyper-personalized for them, but also that foundation helps on the advertising side.”

One of the company’s ad products, called Tubi Moments, uses AI to tag scenes within movies and shows, reflecting the scene’s tone, sentiment and visual cues. In practice, that means a scene featuring a drink gets labeled as a category a beverage maker can buy—allowing the beverage maker to display an ad during the movie’s ad breaks.

I don’t work in advertising, but “that means a scene featuring a drink gets labeled as a category a beverage maker can buy—allowing the beverage maker to display an ad during the movie’s ad breaks” seems like a totally crazy thing to write. Is somebody QCing this feature? What if Coke buys “ad space after drinking scene” and it turns out to be someone drinking blood? Or even worse: Pepsi!

Do advertisers value this type of placement? Aren’t they more interested in the demographics of the viewer watching the content rather than matching the action of the content? You’d think the WSJ might be interested in asking someone besides Tubi!

Bidgoli and [Tubi CEO Anjali] Sud also say that AI tools are going to supercharge creators and people who make video content. A platform like Tubi, they said, isn’t only open to partnering with AI-assisted filmmakers and other creators, but aims to help them earn an income.

“It’s going to lower the barrier,” Bidgoli said. “We want the stories from these sort of creators, and we’re going to let them have a shot.”

How is that going to work exactly? How do you monetize AI-generated content if said content was created with video generation tools trained on someone else’s intellectual property? What are the demographics you can report to advertisers for people interested in consuming other people’s AI-generated content? Is this even content advertisers would actually want to run ads against?

The whole story is well summed-up here:

The company needs to win over Gen Z viewers, whose attention is captured by TikTok or Instagram as much as television or even streaming, said Sud. More than 34% of Tubi viewers are between the ages of 18 and 34, according to the company.

That age group, which is largely part of Gen Z, is where the company thinks AI can help.

How does AI help with this age group specifically? Well, you know… Because! Ugh, why are you asking for actual details??? Quit being a Luddite!

The Future of Film

American Cinematographer is in the midst of “The State of Shooting on Film”, covering the present and future of using film for production. Parts I and II both have tons of great interviews with technicians, projectionists, filmmakers and more. Check out this shocking tidbit from Vanessa Benditti, VP at Kodak Film:

“Last year, Kodak sold more motion-picture film than we have since 2014, so film is definitely growing. In 2025, we also doubled our 65mm finishing capacity to meet the increased demand for large-format film, especially with [the new Imax film cameras coming to market]. So, when people talk about the future of film, I’m not worried about Kodak’s capacity to produce it. Our 35mm sales are stable and strong, and 16mm has grown exponentially for years. And one of the most exciting things happening now is at the base of the pyramid: We’re less reliant on big studio features and television shows because there’s so much growth within the emerging-filmmaker demographic, from student films to music videos to commercials to small indie features.”

Speaking of Kodak, IndieWire has a good writeup on how the production needs of the new season of Euphoria resulted in the creation of an entirely new Kodak film stock:

Today, Kodak is officially making VERITA 200D available to the general public, after collaborating on its creation with “Euphoria” cinematographer Marcell Rév, who used the 35mm and 65mm versions to shoot the upcoming Season 3 premiere set for this Sunday.

The medium-speed, daylight-balanced film stock is being billed as “classically cinematic,” rendering colors and skin tones in a way that more closely mirrors older film stocks, and differs from the pristine look of Kodak’s flagship VISION3 stock.

With The Odyssey and Dune: Part 3 both shot on IMAX film, it’s looking to be a banger year for our old analogue friend!

Here’s a round-up of cool and interesting links about Hollywood and technology:

Movie theaters are having a great 2026! (link)

The saviors of the box office are turning out to be Gens Z and Alpha. (link)

Google and Amazon want in on microdramas. (link)