
The 5 VFX Secrets That Made 'Spider-Man: Homecoming' So Realistic  

There's good reason why this Spidey looked more realistic than ever.

When Sam Raimi’s blockbuster Spider-Man hit theaters in 2002, it blew away audiences with cutting-edge visual effects. It was the early days of the modern CGI boom, and new technology meant that Spidey could sling between skyscrapers with an unprecedented degree of believability.

Fifteen years later, those effects, while impressive for their time (the sequel won the VFX Oscar in 2005), look a bit quaint and even a little cartoonish. VFX have improved by leaps and bounds over the last decade and a half, to the point that it is sometimes impossible to distinguish practical images from computer-generated ones. There is no better showcase for this leap in effects technology than the web-slinger’s latest adventure, Spider-Man: Homecoming.

The Marvel and Sony Pictures co-production — the sixth Spider-Man film since 2002 — features some of the best character animation and location renderings seen on the big screen. And with that in mind, Inverse spoke with the movie’s VFX supervisor, Sony Pictures Imageworks’ Theo Bialek, who has been involved in the post-production of every single Spider-Man movie. The effects vet explained how technology has advanced over the years and made increasingly realistic wall-crawler movies possible.

1. Rendering

There are always small updates and improvements being made to CGI tools, but in many cases, it’s the dramatic increase in the speed at which they operate that has made the biggest difference.

“Our ability to iterate and interactively work with the shots has improved by an order of magnitude since the first Spider-Man,” Bialek said. “You hit the render button to visualize what you’re trying to create as an image, and the time it takes to see it is so much shorter than it used to be. The fundamentals of creating the imagery are essentially the same. But to get from Point A to Point B is just so much faster.”

Why does that matter? Imagine painting a picture without being able to see your brush strokes until the paint completely dries; the task would be almost impossible. Similarly, the guesswork and the revisiting of shots to compensate for delayed images was both exhausting and incredibly expensive. Being able to see what things look like in real time makes it much easier to get work done, and to get it done well.
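To make the difference concrete: a path tracer builds each pixel by averaging many noisy light samples, and a faster pipeline means artists can preview a rough average long before it fully converges. The toy Python sketch below is just an illustration with a made-up pixel value, not anything from Imageworks’ software.

```python
import random

# Toy illustration only: a path-traced pixel is the average of many noisy
# light samples, and the running average can be previewed long before the
# estimate fully converges. The "true" value here is made up.
TRUE_VALUE = 0.42
random.seed(1)

def noisy_sample():
    # One Monte Carlo sample: correct on average, noisy individually.
    return TRUE_VALUE + random.uniform(-0.3, 0.3)

running_sum = 0.0
for i in range(1, 10_001):
    running_sum += noisy_sample()
    if i in (10, 100, 1_000, 10_000):
        print(f"after {i:>6} samples: estimate = {running_sum / i:.4f}")
```

The faster each of those previews comes back, the sooner an artist can judge a shot and move on, which is exactly the iteration speed Bialek is describing.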

2. Room Scanning

One of the great secrets of big blockbusters is that even when characters are in what seems like normal everyday settings — think classrooms, apartments, and offices — the locations are often at least partially computer generated. Bialek’s team worked on a warehouse scene in Homecoming, and the background was done almost entirely in CGI.

“A big improvement that led to things being photo-real was better set acquisition,” the VFX supervisor explained. “We have the ability to scan the practical sets that we’re at with a LiDAR system, and we generate, in real time, models you can use.”

It was that kind of advance that helped Bialek and his co-workers produce the startlingly real high school hallway fight scene in The Amazing Spider-Man 2.
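Imageworks’ in-house tools aren’t public, but the general recipe Bialek describes — scan the practical set, then turn the points into usable geometry — can be sketched with the open-source Open3D library. The file names and parameters below are hypothetical; this is a rough illustration of set acquisition, not the studio’s actual pipeline.

```python
import open3d as o3d

# Hypothetical LiDAR scan of a practical set, stored as a point cloud.
pcd = o3d.io.read_point_cloud("warehouse_scan.ply")

# Remove stray points and estimate surface normals so the points can be meshed.
pcd, _ = pcd.remove_statistical_outlier(nb_neighbors=20, std_ratio=2.0)
pcd.estimate_normals(
    search_param=o3d.geometry.KDTreeSearchParamHybrid(radius=0.2, max_nn=30)
)

# Reconstruct a rough proxy surface that artists could use as a modeling reference.
mesh, _ = o3d.geometry.TriangleMesh.create_from_point_cloud_poisson(pcd, depth=9)
o3d.io.write_triangle_mesh("warehouse_proxy.obj", mesh)
```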

3. Light Scanning

Just as important as capturing the physical details of props and locations is making sure that their computer-generated counterparts interact with light convincingly. And because you can’t bring physical lights into a digitally created environment, you need to be able to anticipate how the on-set lighting would fall on the computer-generated elements.

“We have the ability to scan the lighting; we set up these systems that capture all the light sources as well,” Bialek said. “And then that generates data that we can pop right into our renderer, and it recreates, very closely, the lights that were on set in our CG world.”

But generating realistic light isn’t enough — the key is to figure out how the light will change as people and objects move around in the scene.

“They model the physical attributes of light and how it bounces around,” he added. “So you don’t have to do all these cheats and things that you used to have to do in the past. The lighter just drops something in there, and it will automatically balance.”

Spidey in a CGI warehouse

Sony Pictures
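Imageworks hasn’t published its lighting tools, but what Bialek describes is broadly the image-based-lighting approach: panoramic captures of the set’s light become an environment map, and a physically based renderer integrates that map to light the CG elements. Below is a minimal NumPy sketch of that integration step, using random placeholder data in place of a real on-set capture.

```python
import numpy as np

rng = np.random.default_rng(0)

# Placeholder equirectangular HDR environment map (height x width x RGB);
# in production this would come from panoramic captures of the set's lighting.
env = rng.random((64, 128, 3)).astype(np.float32)

def lookup(direction):
    """Sample the environment map in a given world direction (y-up convention)."""
    x, y, z = direction
    u = 0.5 + np.arctan2(x, -z) / (2 * np.pi)     # longitude -> horizontal pixel
    v = np.arccos(np.clip(y, -1.0, 1.0)) / np.pi  # latitude  -> vertical pixel
    h, w, _ = env.shape
    return env[min(int(v * h), h - 1), min(int(u * w), w - 1)]

def diffuse_irradiance(normal, samples=2048):
    """Monte Carlo estimate of the light arriving at a surface facing `normal`."""
    total = np.zeros(3)
    count = 0
    while count < samples:
        d = rng.normal(size=3)
        d /= np.linalg.norm(d)          # uniform random direction on the sphere
        cos_theta = d @ normal
        if cos_theta <= 0:              # keep only the hemisphere above the surface
            continue
        total += lookup(d) * cos_theta  # weight by angle of incidence
        count += 1
    # Uniform hemisphere sampling has pdf 1 / (2*pi).
    return total * (2 * np.pi) / samples

print(diffuse_irradiance(np.array([0.0, 1.0, 0.0])))
```

Because the renderer is doing this integration physically, a CG object dropped into the scene picks up the set’s lighting automatically, which is what Bialek means when he says the lighter no longer has to cheat.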

4. Blending Performances and Inspirations

Tom Holland made for the perfect new Peter Parker, and not just because of his boyish charm. As Bialek noted, Holland’s incredible athleticism allowed the VFX team to capture his flips, twists, and leaps, and to base fully CGI versions of the character on his mannerisms.

Holland wasn’t the only model, either. The team also used stuntmen, CGI artists (in a pinch), and even people who had absolutely no connection to the movie.

“There’s a shot in the warehouse where a stuntman jumps up in the air, and kind of steps off of the balsa wings,” Bialek said. “It’s an opportunity where we could do something really outlandish, right? But we wanted to keep the performance more muted — he’s not that cocky or in control.”

Spider-Man making a jump for it

Sony Pictures Imageworks

Their ultimate inspiration came from real daredevils, who do insane things without superpowers.

“There’s this whole subculture of people who jump over cars. The cars drag super fast, and these people jump straight up in the air and try to leap over them,” Bialek said. “I don’t condone it, but that reference is interesting. You see the anticipation: how they start to raise their shoulders, how the posture changes ever so slightly as they brace for the quick motion they’re about to make. And then how they jump, how their feet move, is just not what you expect. In places, the performance is really subtle and subdued.”

5. Compositing

When creating environments, it’s also important to be able to assemble the pieces, see how they look from every angle, and move them around in real time. That wasn’t always possible.

“Before, when a compositor used to work on a shot, everything was 2D, which is almost like working in an old version of Photoshop, just 2D images on top of each other,” Bialek recalled. “Nowadays, our compositing software has advanced to where it does a lot of the 3D or 2.5D work directly.”

Imageworks made this beach entirely from CGI.

What does that mean, exactly? Flexibility, which is important when you’re constantly being asked for tiny tweaks and changes.

“For example, for the beach scene, they can create a tile world, which allows them to have the boardwalk,” he said. “That can all be done in 2D, but they can manipulate it, even on just a revision of a shot. It just means that you can always go back and manipulate things, even at the 3D stage, even that late in the game.”
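The specifics of Imageworks’ compositing package aren’t public, but the 2.5D idea can be sketched simply: each flat layer keeps a depth value, so a late camera tweak can be re-composited with depth-correct parallax instead of going all the way back to a full 3D re-render. The NumPy sketch below uses random placeholder images and a hypothetical camera nudge.

```python
import numpy as np

H, W = 90, 160
rng = np.random.default_rng(2)

# Flat "cards" in back-to-front order: (image, alpha, depth in scene units).
# In a 2.5D comp, each 2D layer keeps a depth so it can react to camera moves.
layers = [
    (rng.random((H, W, 3)), np.ones((H, W)), 200.0),    # distant sky card
    (rng.random((H, W, 3)), rng.random((H, W)), 40.0),  # boardwalk mid-ground
    (rng.random((H, W, 3)), rng.random((H, W)), 8.0),   # foreground element
]

def composite(camera_dx=0.0, focal_px=800.0):
    """Re-composite the layers for a small sideways camera move (in scene units)."""
    out = np.zeros((H, W, 3))
    for image, alpha, depth in layers:           # back to front
        # Closer layers shift more: parallax is proportional to focal / depth.
        # (Wrap-around at the frame edge is ignored for simplicity.)
        shift = int(round(focal_px * camera_dx / depth))
        img = np.roll(image, shift, axis=1)
        a = np.roll(alpha, shift, axis=1)[..., None]
        out = img * a + out * (1.0 - a)          # standard "over" blend
    return out

original = composite(camera_dx=0.0)
revised = composite(camera_dx=0.5)               # hypothetical late-revision nudge
changed = np.count_nonzero(np.any(original != revised, axis=-1))
print("pixels changed by the camera nudge:", int(changed))
```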
