Machine Learning Saves ‘Avengers’ VFX Artists Time
By Karen Idelson
LOS ANGELES (Variety.com) – For visual-effects artists, time is always in short supply. When the call comes in to create something spectacular, artists and supervisors have to calculate how much runway they have between the initial idea for an effect and the deadline. On “Avengers: Infinity War,” the VFX crew found that a new innovation — machine learning — made it possible to create the character Thanos in a way that would have been simply impossible without it.
The filmmakers envisioned a version of Thanos — played by Josh Brolin — that would be CG but would also incorporate all the subtle facial expressions and delicate hallmarks of a physical performance that could only be done by an actor. They knew the facial-tracking technology existed, but asking VFX artists to manually adjust every inch of Thanos’ CG face once all the tracking and scanning information was in hand would have been a disaster: there would not have been enough time, and the artists would likely have burned out.
If they wanted to get to the promised land, machine learning — a type of artificial intelligence that allows a computer to learn how to move the facial features of a CG character — would be their vehicle.
“There’s just no time for a VFX artist to move every inch of a character’s face when you’re on a project this size and you need to move forward, because it’s not just creating something,” says Dan DeLeeuw, VFX supervisor for Marvel Studios. “With machine learning, we can teach the computer to know the actor’s face by correcting it each time we do something. We can tell it that a smile or a frown doesn’t quite look right, that the corner of the actor’s mouth needs to move a little or the lines in his face need to move with his smile, and the computer learns what looks right for this actor’s face. Then we can get into more of the overall look of what we’re doing.”
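DeLeeuw’s description amounts to a supervised-learning loop: the system maps tracked facial-capture measurements to the corrections artists approve, and every manual fix becomes another training example. A minimal, purely illustrative sketch of that idea follows — all names and data here are hypothetical, and it uses plain least-squares regression where a production pipeline would use far more sophisticated models:

```python
import numpy as np

# Hypothetical setup: each row of `tracked` is a vector of facial-capture
# measurements (e.g. marker offsets around the mouth); each row of
# `corrected` is the artist-approved setting for the corresponding CG
# face controls. The "true" mapping is unknown to the learner; it is
# used only to synthesize example training data.
rng = np.random.default_rng(0)
tracked = rng.normal(size=(200, 6))              # 200 captured expressions
true_map = rng.normal(size=(6, 4))               # hidden capture-to-rig mapping
corrected = tracked @ true_map + 0.01 * rng.normal(size=(200, 4))

# "Teach the computer" from the artists' corrections: fit a linear
# mapping from capture features to approved control values.
learned_map, *_ = np.linalg.lstsq(tracked, corrected, rcond=None)

# When a new expression comes in, the model proposes rig settings, so
# the artist reviews and refines rather than hand-adjusting every control.
new_expression = rng.normal(size=(1, 6))
proposal = new_expression @ learned_map
print(proposal.shape)  # one proposed value per face control
```

The point of the sketch is the workflow, not the model: each round of artist corrections enlarges the training set, so the proposals improve and the artists spend their time on creative judgment instead of rote adjustment.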
Though most of the big VFX houses use their own version of machine learning, human judgment, like human performance, isn’t likely to be replaced anytime soon. The algorithm relies on feedback from the VFX artists to tell it what’s right, and even then the final refinements are judgment calls for the VFX team, in the same way the crew looks to an actor’s performance for the information needed to create a character that doesn’t look artificial or wrong.
“The performance clearly belongs to Josh Brolin,” says Kelly Port, VFX supervisor at Digital Domain. “We’re relying on him to create something that only he could create, and then we’re working with what he gives us to help bring Thanos to life. Machine learning makes it possible for a VFX crew to have the time to create something amazing when a director asks us for that next step up in visual effects.”