In Steven Soderbergh’s beguiling new movie The Christophers, a reclusive artist (Ian McKellen) tangles with the quiet art forger (Michaela Coel) whom his greedy children have hired to secretly finish further entries in a well-known painting series. The movie is smart and provocative about the nature of artistry and authorship, exploring what it means to create – and to stop creating. It’s especially fascinating coming from Soderbergh, who has made movies with workhorse dependability (The Christophers is his third theatrical release of the past 18 months) and also spent four years retired from directing features entirely.
That makes it all the more jarring to hear Soderbergh, in interviews promoting the film, voice his interest in something that a lot of great artists have pointedly refused to embrace: using AI in films. Soderbergh mentioned in an interview with Filmmaker Magazine that he used what sounds like generative AI to produce “thematically surreal images that occupy a dream space rather than a literal space” for his upcoming documentary about John Lennon and Yoko Ono. He also said that a movie he’s hoping to make about the Spanish-American war would use “a lot of AI”. In a subsequent conversation with Variety, Soderbergh didn’t sound like an AI evangelist, but nor did he back down: “I don’t think it’s the solution to everything, and I don’t think it’s the death of everything. We’re in the very early stages. Five years from now, we all may be going, ‘That was a fun phase.’ We may end up not using it as much as we thought we were going to.”
Soderbergh described the reaction to his initial comments as “mystifying” – which might be fair, given his past embrace of new tech and an approach that seems more open-minded than gung-ho. For that matter, it’s not especially clear whether Soderbergh’s planned use of AI on his Spanish-American war movie would be purely generative – the tech can be used to augment the work of any number of technicians – or particularly noticeable, given how often the technology is conflated with (and reminiscent of) the use of computer-generated imagery.
Moreover, he’s far from the most pro-AI voice out there. Other and far more powerful names in Hollywood have sounded ready to embrace it, often in language that has all the handcrafted individuality of, well, a generative-AI corporate spokesperson. Just last week, Sandra Bullock offered these words of autocompleted wisdom: “We have to observe it. We have to understand it. We have to lean into it. We have to use it in a really constructive and creative way, make it our friend.” She joins a relentless Reese Witherspoon on the frontlines of girlbossing their way through what they see as a mandatory part of life and art.

There’s plenty of disdain for performers who strike such a magnanimous, even proselytizing tone when discussing tech that can be easily used to plagiarize, mislead or just create stuff that sucks. But for a lot of movie-watchers, it stings more to hear acquiescence, or even full endorsement, from people whose job it is to think more carefully about how movies are made. Like Soderbergh, James Cameron wants to know more about how he might use the tech, even as he’s promised that generative AI on its own (that is, not ultimately controlled by real visual effects artists) will have no place in his Avatar world (he has also joined the board of Stability AI). Ben Affleck has literally invested in AI through a startup company, and his brother Casey is starring in Doug Liman’s new AI-dependent movie, which will use actors and plenty of human crew but also AI-generated sets and lighting. (Adding insult to injury, it’s a movie about bitcoin.) Darren Aronofsky has also lent his name to an AI-generated web series.
It may not be realistic to expect lockstep agreement with Guillermo del Toro’s perspective that he would “rather die” than use AI on his films, or even Steven Spielberg’s gentler affirmation of human creativity over this new tech. But it does raise the question of how much support (or at least agnosticism) anti-AI advocates can tolerate in their creative heroes, and whether those lines will hold. (Del Toro’s, at least, seems likely to. Spielberg, on the other hand, says he hasn’t used AI “yet”, evoking a scene from High Fidelity in which record-store clerks attempt to parse the use of that word in predicting the future.)
It’s inherently easier to hear out Soderbergh, who often works on tight budgets these days, or Cameron, who has insisted on keeping humans beneath even his CG characters, than, say, Liman, who claims that a $300m production (of a non-action, non-fantastical movie that seems to be mostly about people talking?) has been pared down to $70m because of AI (were the sets all due to be constructed out of solid gold? Were they planning to light it exclusively with rubies?). There will probably come a point where at least some technical work in films will use AI not unlike CG: a tool that can look absolutely horrible when used as part of a rush job, or downright invisible when given the proper time and human touch.
Another imperfect analogy may be the introduction of the cinema-ready digital camera, which Soderbergh embraced early on (perhaps even before it qualified as “cinema-ready”) and has since mastered, sometimes shooting terrific movies on a well-appointed iPhone. Years later, digital movie cameras are the standard, and the number of committed celluloid holdouts – Spielberg is one; Wes Anderson and Paul Thomas Anderson are two more – has dwindled. Yet it’s also hard to argue that the visual baseline has improved or even remained the same in this new era. There are directors who use digital brilliantly, by leaning into its unique qualities and even limitations (like Soderbergh and Michael Mann), by using it so exactingly that it doesn’t feel like a poor substitute (like David Fincher and James Cameron), or by switching it up on a project-by-project basis (the usually film-friendly Sofia Coppola was absolutely right to shoot The Bling Ring that way, for example). But on average, the bad stuff looks worse; plenty of movies from the ’90s and ’00s now look practically ravishing in retrospect, simply because they’re well-lit, celluloid-shot studio productions.
In a few years, we may well be saying similar things about AI, and again pointing to film-makers like Soderbergh or Cameron who figured out how to use it “right” – whatever that looks like. The real danger is not that Steven Soderbergh will start lending his name to AI-churn slop; to assume that requires a willful misinterpretation of his words and his past actions. No, the greater danger is that the lower tier, the normal movies that sometimes look ghastly already, will be further degraded, and that discerning audiences will come to depend on an increasingly small group of top-tier purists just to provide something that isn’t kinda crap, while less discerning audiences will be trained to ignore the vast gulfs in quality.
The blue-sky idea behind a lot of digital tools is that they democratize art, giving more people the access and ability to make movies, and when that actually happens, everyone wins. But the corporate-coded push toward AI doesn’t often sound like film-makers enthusiastically embracing or democratizing new tech; it sounds more like Reese Witherspoon running an HR meeting (or a pyramid scheme) that ends with an announcement of mass layoffs. Resisting the kind of degradation that AI makes so easy may require more than a few noble holdouts and thoughtful Soderbergh types. People who claim to speak for the film industry will need to start thinking of movies as a craft worth learning or an art worth making, rather than as an office getting some cool new software.
