On Friday, TriStar Pictures released Here, a $50 million film directed by Robert Zemeckis that uses real-time generative AI face-transformation techniques to portray actors Tom Hanks and Robin Wright across a 60-year span, marking one of the first Hollywood feature films built around AI-powered visuals.
The film adapts a 2014 graphic novel set primarily in a New Jersey living room across multiple time periods. Instead of casting different actors for different ages, the production used AI to alter Hanks's and Wright's appearances throughout.
The age-altering technology comes from Metaphysic, a visual effects company that generates aged and de-aged faces in real time. During filming, the crew watched two monitors simultaneously: one showing the actors' actual appearance and another showing them at whatever age the scene required.
Metaphysic developed the facial modification system by training custom machine learning models on frames from previous Hanks and Wright films. This included a large data set of facial movements, skin textures, and appearances under different lighting conditions and camera angles. The resulting models can generate instant facial transformations without the months of manual post-production work that traditional CGI requires.
Unlike previous aging effects that relied on frame-by-frame manipulation, Metaphysic’s approach generates transformations instantly by analyzing facial landmarks and mapping them to trained age variations.
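Metaphysic has not published details of its system, but the landmark-mapping idea described above can be illustrated with a deliberately simplified sketch. Everything here is hypothetical: the offset table, the single-landmark model, and the linear interpolation stand in for what would actually be a learned neural network operating on full face meshes.

```python
# Conceptual sketch only: the real system is a proprietary neural network.
# Here, a single facial landmark's learned displacement (x, y offsets in
# pixels) is stored per reference age, and intermediate ages are produced
# by linear interpolation -- a toy stand-in for "trained age variations."
AGE_OFFSETS = {
    20: (0.0, 0.0),   # reference appearance, no displacement
    40: (1.5, 2.0),
    60: (3.0, 5.5),
    80: (4.0, 9.0),
}

def age_offset(target_age: float) -> tuple[float, float]:
    """Interpolate a landmark displacement for an arbitrary target age."""
    ages = sorted(AGE_OFFSETS)
    if target_age <= ages[0]:
        return AGE_OFFSETS[ages[0]]
    if target_age >= ages[-1]:
        return AGE_OFFSETS[ages[-1]]
    for lo, hi in zip(ages, ages[1:]):
        if lo <= target_age <= hi:
            t = (target_age - lo) / (hi - lo)
            (x0, y0), (x1, y1) = AGE_OFFSETS[lo], AGE_OFFSETS[hi]
            return (x0 + t * (x1 - x0), y0 + t * (y1 - y0))

def transform_landmark(point: tuple[float, float],
                       target_age: float) -> tuple[float, float]:
    """Shift a detected landmark position toward the target-age appearance."""
    dx, dy = age_offset(target_age)
    return (point[0] + dx, point[1] + dy)
```

Because each frame's transformation is a cheap per-landmark lookup and blend rather than a frame-by-frame manual edit, a pipeline built this way could keep up with a live camera feed, which is the property the on-set dual-monitor setup depends on.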
“This movie couldn’t have been made three years ago,” Zemeckis told The New York Times in a detailed article about the film. Traditional visual effects for this level of facial modification would require hundreds of artists and a substantially larger budget, closer to standard Marvel movie costs.
This is not the first film to use AI techniques to change actors' ages. ILM's approach to de-aging Harrison Ford in 2023's Indiana Jones and the Dial of Destiny used a proprietary system called Flux with infrared cameras to capture facial data during filming, then drew on old footage of Ford to de-age him in post-production. In contrast, Metaphysic's AI models process transformations without additional hardware and display results while filming.
Rumblings in the unions
Here arrives as major studios explore applications of artificial intelligence beyond visual effects. Companies like Runway have been developing text-to-video generation tools, while others create AI systems such as Callaia for script analysis and pre-production planning. However, recent guild contracts impose strict limits on the use of AI in creative processes such as screenwriting.
Meanwhile, as we saw with last year's SAG-AFTRA strike, Hollywood studios and unions continue to hotly debate the role of AI in film. While the Screen Actors Guild and the Writers Guild have secured some limitations on AI in recent contracts, many industry veterans see the technology as inevitable. “Everyone is nervous,” Susan Sprung, executive director of the Producers Guild of America, told The New York Times. “And yet, no one is quite sure what to be nervous about.”
Still, The New York Times says Metaphysic's technology has already been used in two other 2024 releases. Furiosa: A Mad Max Saga used it to recreate a character played by the late actor Richard Carter, while Alien: Romulus brought back Ian Holm's android character from the 1979 original. Both implementations required estate approval under new California legislation that governs AI recreations of performers, often called deepfakes.
Not everyone is happy with the way AI technology is developing in cinema. Robert Downey Jr. recently said in an interview that he would direct his heirs to sue anyone who tried to digitally resurrect him after his death for another film appearance. But even amid the controversies, Hollywood still seems to find a way to put death-defying (and age-defying) visual feats on screen, especially if there's enough money involved.
This story originally appeared on Ars Technica.