It would have been hard to blame the filmmakers behind the new adaptation of Ghost in the Shell if they made a movie that was “live-action” in name only. Based on a classic anime set in a high-tech and semi-dystopian future, the story focuses on a cyborg cop named Major (played by Scarlett Johansson) who helps patrol a city of boundless skyscrapers, infinite digital displays, and robot denizens. It practically screams out for CGI, and while director Rupert Sanders and his team certainly relied on computer graphics, they also mixed in a significant percentage of physical props, animatronics, and on-location elements.
“The goal was to blur the lines. To be able to say, ‘Here’s a CG shot, there’s a real shot. Here’s a CG character, there’s a real character,’” Guillaume Rocheron — who works for effects studio MPC and served as the film’s VFX supervisor — tells Inverse. “You want to mix up techniques as much as you can, so as a viewer, you’d never know which is which. And we were fortunate to shoot in New Zealand, which gave us the opportunity to work with Weta Workshop. They can manufacture miniatures and they can make animatronics. It’s like a one-stop shop.”
Rocheron spent years working on the project, with design work alone beginning a year before cameras rolled and lasting well into post-production. MPC was involved in just about every scene, and worked with new technologies, developed by a company called Digital Air, to bring some of the more complicated concepts to screen.
In some ways, Major’s journey is only half the story being told on screen. There was a concerted effort by the filmmakers to present a scary, endlessly stimulating version of the future, expressed in the design of the fictional Japanese metropolis of New Port City, circa the year 2029. “The world is overfilled with information that can affect the machines, and technology is overtaking nature and humans,” Rocheron says. “So we really wanted to overpopulate the world with holograms, information, and advertising everywhere.”
Featured prominently in all the promotional materials for this version of Ghost in the Shell are giant, living holograms — a geisha the size of a skyscraper, koi fish swimming through the air at eye-level — known as solidgrams. A new creation never seen in the 1995 anime, they required the invention of a new and very involved photography apparatus and digitizing process to properly manifest.
“We call them solidgrams because they are solid holograms, basically a projection of something that, when you see it in your environment, looks real,” Rocheron explains. “We had to have actors perform and had to have a way to capture the performance of those actors and then be able to create a volumetric living version of those actors. So we created a new camera system that allowed us to record those performances.”
Weta helped pioneer the motion capture technology that has injected life into dragons, superheroes, and cartoon characters for the last decade and a half. For Ghost in the Shell, Digital Air developed a next-level process for recording far more than what was required to render Gollum or warrior apes.
“It’s a system that has 80 cameras around an actor. It’s what we call motion photogrammetry. We create a volumetric version of those performances. Once we had those in the computer, then we could integrate them into the cityscapes. It’s not just motion capture, where you just try to capture an actor’s movement. We are trying to capture the skin, the hair, the cloth, the performance. So it’s literally a moving 3D scan.”
They recorded a whopping 200 different actors with the new system, creating a massive amount of data. “There are 24 3D scans per second, and in the end, we ended up creating 30,000 3D scans, which is absolutely ginormous,” Rocheron marvels. And all that data was put to good use, creating a fully realized character hologram unlike any that had come before it.
“We were able to capture a full actor performance, not just a face or just a performance. Then we could put them into cityscapes and really manipulate them instead of just being the traditional holograms that you see in Star Wars,” he explains. “It will look like it is really sitting in my room. If I have a window, it’s going to be lit by my window. If I have a bunch of Christmas lights, it’s going to be lit by my Christmas lights.”
One of the most iconic sequences in the original 1995 anime is the transmutation of Major from fallen human to seemingly invincible robotic stealth hero. Much of that transformation involves the creation and application of a high-tech endoskeleton, which is then covered with a human-like exterior. Unlike the solidgrams, a lot of this scene was made with physical props.
“We designed it in the computer with the concept artist and we made a 3D model of the skeleton,” Rocheron says. “Then Weta Workshop 3D-printed the whole thing, a one-to-one scale skeleton. There were 1,400 parts. Then, we made a one-and-a-half scale head. We animated panels and things [on the body] so we could really get close up and add some good details. We filmed that on a blue screen so we could just create the wider environment.”
The next part of the sequence, which included wide shots, was all digital, with computer-generated skeletons in a CGI environment. Then, it was back to the real world and a tactile skeleton.
“When we go back, when she is pulled out of the white liquid, we built a version of her body that was filled with concrete. It was very, very heavy, so it could sink into a pool of white goop,” he explains. “Then, the special effects guys created a rig that could pull her up. We shot it on the Phantom camera, and there are a lot of these shots where she’s coming out of the white liquid on camera. Then, we transition to the moment where all the petals are breaking off, and that’s where we transition back to all digital.”
Not all of the 3D printing made it into the film. While Sanders had several crews running around Hong Kong — the inspiration for New Port City — shooting on location for nine days, even more of the movie was made on huge soundstages in Wellington. And while the cityscapes and some of the more complicated exteriors were rendered in CGI, they were planned using eight-foot-tall models printed out by Weta.
“We made miniatures but we never shot them with the film camera, because it’s difficult to film them. But, it’s a great design tool. Workers could come in and we could walk around the miniatures. Talk about the features, remove some features, and all this stuff.”
The models were arranged around the basement studio, creating an ever-changing, fictional miniature version of Hong Kong buried deep inside New Zealand.
“Once Rupert was happy with the design, we basically created a 3D scan of the skyscrapers,” he says. “Then we added more details to them. Then we adjusted to enhance the miniatures to make them look better and put them in authentic cityscapes. The miniature was used more as a design tool than a finishing tool.”
And so, not only does the spirit of the anime live on in the film, but so does the spirit of production in the old Japanese kaiju films of yesteryear, with their cities of miniature buildings spread across studio floors. It’s just another way the film was more tactile than audiences might expect.
Correction: A previous version of this story said that MPC developed “motion photogrammetry.” It was created by Digital Air for “Ghost in the Shell” and used by MPC on the film.