The Lion King opened on July 17, giving audiences a look at the classic Disney animated film through more realistic eyes. Back in December 2017, we got a chance to visit the virtual set of the film and interview the filmmakers. You can check out our chat with director Jon Favreau by clicking here. On set we also spoke with VFX Supervisor Rob Legato, Virtual Production Supervisor Ben Grossman, MPC VFX Supervisor Adam Valdez, Director of Photography Caleb Deschanel, Production Designer James Chinlund, and Animation Supervisor Andy Jones.
Legato spoke to us about the challenge of the film that drew him in. “Well, the first one is, part of it was the Jungle Book thing, that you can get to that portion of photorealism with creatures that you have seen in documentaries and zoos and various other things, that you can see them up close and then kind of confuse the eye that it was fake. And we also tried not to make it in that particular movie have any kind of human characteristics. It could only do what it really is built to do. And so that test was like, we can do this, and now we want to improve the art form.”
On the set, we got a chance to check out some of the virtual reality cameras, which were set up for us on an island near Pride Rock. We could “film” the creatures that were there, using the cameras as though this were a real set. Legato spoke about finding shots on the actual day using this technology. He said, “In various different versions of that, we first go to James, the art director, production designer. He builds the set with the idea of obviously what we’re going to stage in there, what the scene is about. Picks, just like you would on a live action set, picks the location or the set or whatever you’re going to do, and he explores that. So he has a couple of versions for just him. And then he brings it to the stage where myself and Jon and Caleb and whoever else, Andy Jones, Adam Valdez, we’ll all go in there and say, this is what we’re thinking. Any thoughts before we start committing, start going down the food chain and committing it? So all that happens. And then on the day, you know, Caleb will be behind the camera, Jon will be there, I’ll be there, or we’re all doing it. And so we’ll even try this shot.” He explained that it gave them a chance to set up shots in different ways before anything was committed to film.
He also spoke about incorporating actors’ performances in different ways with this technology. He explained, “It is Jon’s belief in it, and it does work obviously. It’s the same sort of idea that when you have a bunch of actors in a room they start to rub off on each other. The chemistry of you forgetting your line, remembering at the last second, creates a kind of timing that I take advantage of, especially from a comedian, and I could, you know, play on your flub. And then that flub causes –. You know, so that chemistry now creates a much more lifelike thing. And then our belief system when we do this is unlike doing things that are motion captured, where you’re trying to make Andy Serkis’ arm be the arm of the ape or whatever; we don’t believe that works that well.
“And so you have a very talented animator who takes the timing and all the rhythm that an unencumbered actor has. When I look up, when I check my watch, when I do whatever I’m doing, all those tics and things, they’re not driving an arm that isn’t my arm. They’re not doing all that. They’re inspiring timing. And so to make it actually look more authentic, you use less of motion capture and you just use motion, and the ability to capture somebody in a –, you know, in a movie way is very free-form and improv-like. And use it as your spirit to guide you in how to do it.”
Grossman spoke about the technology as well. He said, “If you go back to Avatar, Avatar solved the problem of how do you film a movie that usually gets created with computer graphics in a computer, and so we put computer graphics into the cinematographer’s monitor so that they could use more traditional equipment to see the movie. Fast forward to Lion King, and what we’re doing is we’re putting the filmmakers inside the monitor. So now they can put on a VR headset and be in Africa or on the Empire State Building or on the surface of the Moon, so that they can walk around and see and feel the filmmaking process with all the equipment as though they were there.”
He continued, “The next thing you want to do is a step towards making the filmmaking process more like the real world, but with superpowers. So the next step you want to do is make this world interactive, like the world is. So in order to put characters and people into it, we need to start driving more intelligent systems, so that in a sense, since we have these artificial worlds that we’re creating, then there’s no reason you wouldn’t want artificial life. And so to drive that, you would need artificial intelligence. And you see a lot in the news, but it’s hard to think of a practical application for artificial intelligence. But just imagine you were standing on an island with a dumb rhinoceros. Imagine if that was powered by artificial intelligence. Imagine if, you know, there was a sense of God for this world, that you could just say, Hey Siri, move that tree over here, paint the sun over there, and start having voice control over things. I want this scene to be more depressing or happy. Artificial intelligence helps translate what you want into this virtual world.”
We were shown test footage of Rafiki, which gave us a look at the eyes of the character. The group asked about getting rid of the uncanny valley effect, and Valdez told us how it was done. “I’d say there’s two things. One is the nonverbal language, right? It’s the main instrument of empathy. And it’s incredibly subtle. So tiny, tiny things make a huge difference in whether or not you understand basically what’s going on inside the head of someone else. It’s really easy, if you get it just wrong, for that message to come across as kind of creepy, which is the definition of the uncanny valley: so close yet so far. And actually, it’s not really that it’s not accurate. It’s that you’re saying the wrong thing. And when you say the wrong thing with the face, instead of me looking at you lovingly, I’m looking at you drunk or sick or crazy. And that’s an instant weird reaction in the audience member or any other human being. I mean, that’s how powerful the face can be. Eyes, doubly so.”
He explained further. “The classic example is the top lid. If the top lid gets near the pupil, you look sleepy or drunk, or you might look sexy, you know…So it’s like there’s these little tiny, tiny things. And so, animators and the people who create the puppets (there’s an entire puppet just in the eye, of all the moving parts that the animator controls), all of them just get better and better and better at understanding the language and executing it more precisely. That’s one big thing. The second thing is the look of the eye, getting the eye to not look like a doll, because we all think dolls are creepy. So you have to avoid a doll eye. And so, the light in the eye, actually, we simulate it more accurately. You know, you literally think of, like you’ve seen in a video game, three dimensional characters running around. So you guys will all understand 3D. So you just think of a 3D model of the eye, how the light penetrates, how it scatters inside the eye and comes back out, similar to the way light moves through water, through the caustics and the physics of that. You have to get it just right. And when you get it right, the eye is engaging and it’s inviting and it communicates – it helps the communication, too.”
Jones spoke about taking a trip to see real animals and how they interact in their environments. He said, “It was quite fascinating to me ‘cause I’d only seen these animals in zoos, and to really see them in their own environment and get a feeling for how they do roam around each other and how they react around each other, especially the big cats versus the prey animals. We learned a lot about behaviors and different things that we’re gonna try and bring to this film.”
Deschanel spoke about the emotion of filming even virtual creatures. He said, “The one thing I have to say, because the film was made before and it was animation before, but these creatures, I don’t know how much you’ve seen, but when they’re done, you believe them so much that, you know, I mean, I feel like with these characters, when they’re in danger or when they’re, you know, joyful or whatever, you feel the emotion really a lot. And I found myself getting emotional as we were filming things. So it’s pretty great, you know. It’s really, it’s been fun.”
He also talked about visual references from the animated film and what he used from it. “Well, it’s funny because in the D23 [footage where we saw the first scene at Pride Rock]…there’s a pullback at the end on the one side of, you know, Pride Rock, and as we were putting it together we actually flipped that shot and looked at it the other way. And, you know, we all really liked it so much better that way. And then somebody reminded us that no, that screen direction is the iconic image, you know? So there are certain things that obviously kind of lock you into that. But in reality, we certainly looked at that and studied it, and there’s certainly elements of the film that imitate it. But I think it’s going in a whole other direction because of the reality of the nature of these characters, how much you feel the sort of emotional tie to them that I think you don’t get in an animated film as much, you know?”
Chinlund spoke about designing sets for a movie with no actual sets. He said, “I think the daunting challenge coming into this, coming from live action, is the fact that there was no world at all when we started. So I think the initial job was laying in a representation of Africa that we could explore. Certainly I’m used to being able to problem solve locations by going out in the world and scouting and looking at things and finding inspiration, so the process on this movie has been really interesting. We obviously started with Pride Rock and built out from that, but my ambition was to build a world that was entirely cohesive, so that at any given moment the audience is gonna feel like they know where they are. They’re in true geography, so that when you’re at Rafiki’s tree you can see Pride Rock in the distance. You know that that’s gonna be a true relationship that you can count on. So basically we built the world map to start, so we placed the cloud forest and the elephant graveyard and all these things on the map, and started to build a piece of topography that would contain them, and then just up-resed and up-resed that world as we went.”
The Lion King is now playing in theaters. Pick up your tickets by clicking here!