
Monday, March 30, 2009

The Magic of the 3D Technology Behind Benjamin Button

Ed Ulbrich, visual effects executive producer at Digital Domain, gave a fascinating demonstration of how a team of 155 artists working for more than two years created the face of Brad Pitt for The Curious Case of Benjamin Button.

Contrary to published reports that Pitt's head was superimposed on the body of other actors for the film, the face of Benjamin Button was entirely digital for the first hour of the film, though it was painstakingly crafted using Pitt's face and a series of technologies that were merged for the first time to create a new technique they're calling Emotion Capture.

Ulbrich explained how they did it. The project was first conceived in the late '90s, when studios first considered making the film. But the idea of creating a lifelike face capable of minutely expressing every human emotion and aging over several decades was deemed impossible, beyond the technology of the day.

"The human form, and particularly the human head, has been considered the Holy Grail of our industry," Ulbrich said.

Then director David Fincher came on the project.

"David won't take no," Ulbrich said. "He thinks anything is possible as long as you have the time, resources and money."

Fincher wanted the main character of the film to be played from cradle to grave by one actor, but the filmmakers quickly ruled out the idea of using prosthetic makeup, which would be distracting in close-ups and wouldn't hold up under everything they wanted the character to do. So they decided to cast a series of people of different sizes to play the different bodies of Benjamin through his life and superimpose a digital version of Pitt's head onto the bodies.

But the studios were worried. They wanted proof that it would work, so in 2004 Ulbrich's team created a screen test showing a raw prototype of an actor with a digital head superimposed on his body to show the studios what they had in mind. He played that initial video for the audience.

When he got the phone call saying a studio was willing to green-light the project, he says he threw up.

"We had a big problem," he said. "We didn't know how we were going to do this. But we believed we had enough time and resources and hopefully had enough money as well. And we had passion to will the process and technology into existence."

They needed to make Pitt age 45 years, and they needed to make sure they could take his tics and the subtleties that make him who he is and translate them through computers. They needed a character who could hold up in various kinds of light, who would be able to sweat, cry, take a bath and do everything an actor would do, but in a seamless way that wouldn't distract the audience or make it obvious that what they were looking at was digital.

But there was a giant chasm between the technology available in 2004 and where they needed to be. They considered marker-based motion-capture technology to capture the motions of the body and apply them to computer characters. But that didn't work. They also looked at facial-marker technology, but the result was artificial.

"We realized that what we needed was the information going on between the markers to see the subtleties of the skin and muscle," Ulbrich said. "We were now well out of our comfort zone."

In the end, they created a "technology stew" that reappropriated technologies from other fields and wrote code that would allow the disparate pieces of software to come together.

They used the research of former psychology professor Paul Ekman, who devised the Facial Action Coding System. Ekman determined that there are 70 basic poses of the human face that can be combined into every expression the human face is capable of making.
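The idea of combining a small library of basic poses into arbitrary expressions is essentially blendshape animation: each pose is stored as an offset from a neutral face, and an expression is a weighted sum of those offsets. Here is a minimal sketch of that idea with an invented toy mesh; it is not Digital Domain's actual pipeline, and the vertex data and weights are made up for illustration.

```python
import numpy as np

def combine_poses(neutral, pose_deltas, weights):
    """Blend facial poses into one expression.

    neutral     : (V, 3) array of neutral-face vertex positions
    pose_deltas : list of (V, 3) arrays, each a pose minus the neutral
    weights     : activation strength of each pose, typically in [0, 1]
    """
    expression = neutral.copy()
    for delta, w in zip(pose_deltas, weights):
        expression += w * delta
    return expression

# Toy "face" with 4 vertices (hypothetical data)
neutral = np.zeros((4, 3))
smile   = np.array([[0, 1, 0], [0, 1, 0], [0, 0, 0], [0, 0, 0]], float)
brow_up = np.array([[0, 0, 0], [0, 0, 0], [0, 2, 0], [0, 2, 0]], float)

# A half-strength smile plus a full brow raise
face = combine_poses(neutral, [smile, brow_up], [0.5, 1.0])
print(face[:, 1])  # y offsets: [0.5 0.5 2.  2. ]
```

Scaling this up, a library of 70 such poses can be mixed at varying weights to cover the full expressive range of a face.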

Then they decided to use a then-unproven technique called phosphorescent capture, which involves applying phosphorescent powder to a face, exposing it to black light, and imaging it to create 3D data of the facial expressions. It worked.

They created a database of everything that Pitt's face was capable of doing. Then they had to age him from 44 to 87. They brought in artists to take a life cast of Pitt and make several busts of him as "Benjamin Button" at 60, 70 and 80 years of age. They scanned the faces into a computer at very high resolution. Then they began the process of transposing the 3D data of Pitt's expressions onto each of the scanned faces.
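Transposing the captured expression data onto each scanned bust amounts to delta transfer: take the offset of an expression relative to the actor's neutral scan and add it to a different (aged) neutral scan with matching vertex layout. The sketch below illustrates that idea with invented 2D points; the real process was far more involved.

```python
import numpy as np

def retarget_expression(src_neutral, src_expression, dst_neutral):
    """Apply the source face's expression offsets to a different neutral face.

    All arrays must share the same vertex layout (same shape, same order).
    """
    delta = src_expression - src_neutral
    return dst_neutral + delta

# Hypothetical 2-vertex faces (x, y per vertex)
pitt_neutral = np.array([[0.0, 0.0], [1.0, 0.0]])
pitt_smile   = np.array([[0.0, 0.3], [1.0, 0.3]])   # smile raises both points
aged_neutral = np.array([[0.0, -0.2], [1.0, -0.2]]) # sagged, older geometry

aged_smile = retarget_expression(pitt_neutral, pitt_smile, aged_neutral)
print(aged_smile)  # the aged face shifted up by the smile offsets
```

Because only the offsets travel, the aged bust keeps its own geometry while inheriting the movement of Pitt's performance.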

They still had to create teeth and a tongue. One person worked solely on the tongue for nine months.

Although the finished product had a bit of what Ulbrich called a digital botox effect -- "it kind of sandblasted some of the edges off the performance" -- they had succeeded in creating a new visualization technique.

(Courtesy of Wired)
