Technology Turns Back Time on ‘The Irishman’
By Daron James
LOS ANGELES (Variety.com) – Motion capture in one form or another has been around for decades, and certainly the most recognizable modern use comes from “The Lord of the Rings” films in which Andy Serkis portrayed the beady-eyed creature Gollum. The actor wore a special bodysuit, helmet and strategically placed markers so that each detail of the part computer-generated, part live-action performance appeared on screen.
However, the Pablo Helman-led visual-effects team behind Martin Scorsese’s “The Irishman” didn’t have the luxury of using such technology.
The story chronicles the criminal and union ties among Frank Sheeran (Robert De Niro), notorious crime boss Russell Bufalino (Joe Pesci) and Teamsters leader Jimmy Hoffa (Al Pacino) over four decades. The iconic actors, now in their 70s, needed to appear younger in many scenes. To rejuvenate them, Scorsese didn’t want to use technology that could intrude on or restrict the performances. The ILM team took on the task of developing a camera rig and newly designed software, dubbed Flux, to free the actors from wearing facial markers or helmets.
“Marty felt strongly about us not touching any of the performances,” says Helman. “Another important thing to him was de-aging the actors so they looked like younger versions of the characters and not younger versions of themselves. It’s a subtle difference but it meant we couldn’t simply rewind the clock.”
The technical development started in 2015, when Helman and ILM had De Niro reenact the pink Cadillac scene from “Goodfellas” to prove the approach would work. That test got the film greenlit by Netflix, and the team then spent another two years fine-tuning the complex pipeline.
To pull it off, the visual-effects team recorded a range of each actor’s facial movements during prep under different lighting conditions, then built 3D models of those performances to help extract the on-set performances later. In production, scenes were shot with a unique three-camera rig: one primary camera (a RED Helium) for the director and two witness cameras (Alexa Minis), one on each side, to record visual-effects data. The witness cameras focused on the actors’ faces and were fitted with infrared light rings and filters.
The Flux software then analyzed the infrared lighting and texture to build renderable models, compared them with the models created in prep, and re-targeted the on-set performance onto the younger version of the character.
“We did all of this without adding animation or keyframes to the performance,” notes the visual effects supervisor.
To create the younger versions of each character, ILM collected thousands of images and videos of the actors at the targeted ages and built an AI-based engine to check the facial performance of the younger model. The team would then reattach the digital head to the live-action body. Visual effects also performed work on the hands and bodies and, at times, aged up De Niro when he didn’t look old enough.
Visual effects accounted for 1,750 shots in the three-plus-hour movie. Now, Helman and the team are focusing on giving the system a facelift by reducing the size and weight of the rig. They’re also integrating AI into the code for improved accuracy and faster performance.
“We did this for the actors and how performance can be so powerful,” notes Helman. “The whole thing about this is if we take away the technology from the actors and from the director and get better performances then we have a better movie. That’s the thing that drives what we are doing.”