
How 'The Jungle Book' Made Its Animals Look So Real With Groundbreaking VFX

"I do not want to embellish with the computer; I want to simulate real life."


A small, colorful bird flutters about on screen in the opening scene of Disney’s new live-action adaptation of The Jungle Book, inviting the audience into the mythical wilds with an adorable chirp and clear message on behalf of the filmmakers, which amounts to, “Look what we can do now!”

Directed by Jon Favreau, who made the first two Iron Man films, this version of The Jungle Book, which borrows from both Disney’s 1967 cartoon and the original Rudyard Kipling novel, sets a new standard for lifelike CGI animals. Shot entirely on a soundstage in downtown Los Angeles, it is sort of a hybrid of Avatar and Who Framed Roger Rabbit, with one human performer surrounded by animated creatures — the difference being that every effort was made to trick the audience into believing the animals were real. Many people were surprised to learn that the tiger in Ang Lee’s hit film Life of Pi was CGI, and The Jungle Book multiplies that visual wizardry many times over.

Led by Oscar-winning visual effects director Rob Legato, who oversaw the groundbreaking work on Avatar, Disney and several VFX houses raised the bar for realistic CGI by both pushing technological limits and purposefully restricting themselves creatively.

“Everyone wants to know if there’s some invention, like, ‘Buy this box and it’ll spit out photo-real pictures,’ but that’s not really the case,” Legato told Inverse. “It’s really taste and skill and desire to do it that way.”

Rather than relying on a single piece of software, Disney and its partners — including MPC and Peter Jackson’s WETA — built the images in several stages. Everything was shot on a soundstage, with Neel Sethi, the 12-year-old actor who played Mowgli, interacting with a few sparse props and blue-screened bumps that stood in for land masses and other jungle obstacles.

“We motion captured Neel, and we had humans kind of mimicking the animals he talked to so we could see a live digital composite of Neel with the dialogue with the bear or other animals, so we created eyelines for him to look at,” Legato explained. “That was essentially the shot, and then we took that apart on a blueprint basis and replicated it on stage, because we liked that shot, liked the composition.”

As you can see in the video above, Sethi had to do a lot of acting opposite very little, because the more they added to the physical setting, the more they had to remove during the very lengthy post-production process.

“He had small props to interact with, but because the animals were interacting with the environment he was in, and would cast shadows on it, a fully built physical set would have taken away from the shot and would have had to be replaced anyway once the real jungle set engulfed it,” Legato said.

They were a bit hamstrung when it came to replicating the animals’ movements. Disney no longer allows exotic animals into studios to help animators’ research, ending a decades-long tradition that spanned from the production of Dumbo to The Lion King. That meant the animators had to work off of reference photos and videos, which created a hurdle in their effort to totally replicate real life.

“You have to have the will or desire to say, OK, I do not want to embellish with the computer, I want to simulate real life, how things move and why they move,” Legato explained, noting that everything down to the movement of an animal’s jaw was restricted by natural limits. “As soon as it’s realistic, you also notice that if the animal isn’t just right, if someone thought it would be funnier to move a little faster than the animal can move, you can pick up on it right away, because you’re believing everything else, and the one thing you’re not believing is an animal that big can move that fast.”

The one time they did bring in animals, giving Sethi an opportunity to play and cuddle with puppies for an early scene, they wound up taking them out of the shot and replacing them with CGI wolves anyway. The wolf cubs were roughly the same size as the puppies, but some of the animals in the film were up to 50 percent larger than their real-life counterparts, a nod to the child protagonist.

The hard work came in post-production, which in many ways ran alongside the production of the film. The teams used Maya animation software to create the complicated animal rigs, and Pixar’s RenderMan software for the shading and lighting, which was immensely complicated for the immersive jungle scenes. One of the most crucial parts of creating realistic CGI animals is getting the fur just right, which includes creating natural movement in real time.

“They call it ‘grooming,’ and it’s not just the ability to have a million individual hairs that react to light, it also has to be groomed, like the way an animal has patches of hair,” Legato explained. “If you put hair on someone’s head, it has a wave pattern that your brain sees as real. The grooming was designed to allow you to control it to some degree, but also to naturally replicate the wave patterns of hair.”
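To make the idea of “grooming” a little more concrete, here is a minimal Python sketch of procedurally grown guide hairs with a controllable wave pattern and a shared clump direction. The function groom_strand and all of its parameters are hypothetical stand-ins for illustration, not MPC’s actual tools or values.

```python
import numpy as np

def groom_strand(root, normal, length=0.04, segments=8,
                 wave_amp=0.004, wave_freq=3.0,
                 clump_dir=None, clump=0.3, rng=None):
    """Build one guide hair as a polyline: grow along the surface normal,
    add a sinusoidal wave so the strand is not perfectly straight, and pull
    it toward a shared clump direction so neighboring hairs form patches."""
    rng = rng or np.random.default_rng()
    # Pick a random axis perpendicular to the normal to carry the wave.
    side = np.cross(normal, rng.normal(size=3))
    side /= np.linalg.norm(side)

    root = np.asarray(root, dtype=float)
    points = [root]
    for i in range(1, segments + 1):
        t = i / segments
        p = root + normal * (length * t)                          # growth along the normal
        p = p + side * wave_amp * np.sin(wave_freq * np.pi * t)   # wave pattern
        if clump_dir is not None:
            p = p + clump_dir * (clump * length * t)              # blend toward the clump
        points.append(p)
    return np.stack(points)

# Groom a small patch: all strands share one clump direction, so they read
# as a tuft rather than a uniform lawn of identical hairs.
rng = np.random.default_rng(0)
clump_dir = np.array([0.2, 0.0, 0.1])
patch = [groom_strand(root=np.array([x, y, 0.0]),
                      normal=np.array([0.0, 0.0, 1.0]),
                      clump_dir=clump_dir, rng=rng)
         for x in np.linspace(0, 0.01, 5)
         for y in np.linspace(0, 0.01, 5)]
print(len(patch), "guide hairs,", patch[0].shape[0], "points each")
```

In a real pipeline these guide hairs would then be interpolated into millions of render hairs, which is where the numbers Legato describes come from.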

When Disney’s Zootopia hit theaters, there was a lot of buzz around just how many individual hairs the animators were able to place on the many creatures that populate the hit cartoon. The Jungle Book far surpasses it in both the sheer number of animals (there were 70 species) and, because it was a live-action film, the amount of realistic detail required in each of them.

“MPC revamped its pipeline to allow more ray tracing, which is expensive, because it takes a lot of computational power to figure out every pixel of light, how it bounces around and affects the area around it,” he said, noting that WETA focused its work on the scenes with King Louie, the gigantic ape voiced by Christopher Walken. “They used the new RenderMan ray tracer, which is very computationally heavy, and very expensive to use, because of how much computer power it takes to actually simulate. When you have that much hair on an animal, and you have 5 to 15 animals in a scene, and then you have every blade of grass and piece of floating dust, you’re into a tremendous amount of computing power.”
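For a sense of why ray tracing scales so brutally, here is a back-of-envelope sketch in Python. The resolution, sample count, and bounce count are illustrative assumptions, not figures from MPC or WETA; the point is simply how fast the ray counts pile up before the hair geometry is even considered.

```python
# Back-of-envelope estimate of why path-traced fur is so expensive.
# All numbers below are illustrative assumptions, not production figures.
width, height = 2048, 1080      # roughly a 2K frame
samples_per_pixel = 256         # camera rays fired per pixel to reduce noise
bounces_per_ray = 4             # average number of times each path scatters

primary_rays = width * height * samples_per_pixel
total_ray_segments = primary_rays * bounces_per_ray

print(f"primary rays per frame: {primary_rays:,}")
print(f"ray segments per frame: {total_ray_segments:,}")
# Roughly 0.57 billion primary rays and 2.3 billion segments per frame,
# and every segment has to be tested against geometry that includes
# millions of individual hairs, grass blades, and dust particles.
```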

For the most complicated scenes, the computational power required was astounding.

“It would take 30-40 hours per frame, and since it’s stereo [or 3D], it requires two frames to produce one frame of the movie — at 2K, not even 4K,” Legato said. “So you can tell how much the computer has to figure out, exactly what it’s doing, how it’s bouncing, how much of the light is absorbed, because when it hits an object, some gets absorbed and some gets reflected.”

The math there is mind-boggling; it takes a full 24 frames to make up a single second of the movie, and most shots are between five and ten seconds. That required “literally thousands of computers,” Legato said, and eventually, some creative solutions.
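To make that arithmetic concrete, here is a small Python sketch using only the figures quoted above (30-40 hours per frame, two frames per stereo frame, 24 frames per second, shots of five to ten seconds). The per-shot totals it prints are rough estimates, not production numbers.

```python
# Rough arithmetic behind "literally thousands of computers."
fps = 24                      # movie frames per second
eyes = 2                      # stereo: one render per eye
hours_per_frame = (30, 40)    # quoted render time per frame
shot_seconds = (5, 10)        # typical shot lengths from the article

for secs in shot_seconds:
    frames_to_render = fps * eyes * secs
    low = frames_to_render * hours_per_frame[0]
    high = frames_to_render * hours_per_frame[1]
    print(f"{secs:>2}s shot: {frames_to_render} renders, "
          f"{low:,}-{high:,} machine-hours "
          f"(~{low // 24:,}-{high // 24:,} machine-days)")

# A 10-second shot lands around 14,000-19,000 machine-hours, which is why a
# single shot can still take days even when it is spread across a render farm.
```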

“I think they started using the Google cloud, which has tens of thousands of computers, and sometimes it would take two or three days to render a shot,” he said, exasperated at the mere thought of the process. As powerful as the computers were, they were ultimately just taking cues from the human innovators who spent years on the film.

“In all this,” Legato said, “there’s no real computer that replaces the skill of the operator, of the person who is pushing the buttons.”
