
Ninja Theory Brings Real-Time CG to 'Hellblade' for Cheap

CG films don't even do this.


Ninja Theory is bringing some all but unprecedented tech to their upcoming video game Hellblade: Senua’s Sacrifice: the ability to shoot, render, and edit scenes for the game in minutes instead of days.

As is tradition with Ninja Theory, Hellblade is not a typical blockbuster — it follows the journey of Senua, a Celtic warrior fighting mental illness. Given the studio’s interest in making smaller, character-led stories in games like Enslaved and DmC (even on a scale akin to triple-A), using technology to further an emotional performance seems like a natural progression.

The tech itself, which Ninja Theory calls “real-time cinematography” following a recent demo at the VFX conference SIGGRAPH, essentially captures a mo-capped actor’s performance in real time, as it would be on a film set, then transfers the rendered data into Sequencer, a new Adobe Premiere-like tool for Unreal Engine 4.

From there, the just-captured performance data — recorded facial animation, voice, and movement — can be manipulated much like a linear segment of film, except the developers are free to adjust, tweak, or edit things like lighting or camera angles on the fly. (Since the data is rendered in real time, it’s even possible to change existing cameras or add entirely new ones to a scene.)
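For the technically curious, here is a minimal sketch of the underlying idea: performance data living on an editable timeline, where a camera is just another track of data. The names (Sequence, Track, and so on) are hypothetical illustrations, not Unreal's actual Sequencer API.

```python
from dataclasses import dataclass, field
from typing import Any

# Hypothetical sketch of performance data on an editable timeline.
# Names and shapes are illustrative, not Unreal's Sequencer API.

@dataclass
class Keyframe:
    time: float   # seconds into the scene
    value: Any    # e.g. blendshape weights, a transform, a camera setting

@dataclass
class Track:
    name: str
    keys: list = field(default_factory=list)

    def add_key(self, time: float, value: Any) -> None:
        self.keys.append(Keyframe(time, value))

@dataclass
class Sequence:
    tracks: dict = field(default_factory=dict)

    def track(self, name: str) -> Track:
        return self.tracks.setdefault(name, Track(name))

# The live shoot streams captured keys into performance tracks...
scene = Sequence()
scene.track("senua_face").add_key(0.0, {"jaw_open": 0.1, "brow_raise": 0.4})
scene.track("senua_voice").add_key(0.0, "audio_take_01.wav")

# ...and because everything stays as data, a camera can be reframed
# or even added after the fact without recapturing the actor.
scene.track("camera_A").add_key(0.0, {"pos": (2.0, 0.0, 1.6), "fov": 35})
scene.track("camera_B").add_key(0.0, {"pos": (-1.0, 2.0, 1.4), "fov": 50})  # added later
```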

For Ninja Theory’s co-founder and chief creative director Tameem Antoniades, shooting everything in-house using the tech makes all the difference.


“We can jump in and shoot scenes as we need them. We don’t have to do it all in one four-week shoot,” Antoniades says of the studio’s old method, which involved booking a block of time at a motion-capture studio to record a game’s performances all at once. “[We had to] shoot everything for the game there, and then build the game around what we shot.”

The technical nuts and bolts are unsurprisingly complex and involve a lot of hands. Aside from Epic, which provides support for Sequencer and Unreal itself, the facial capture tech in use comes from a separate mo-cap software company, Cubic Motion, while the Serbia-based facial performance studio 3Lateral was brought on to help create realistic virtual heads — and those are just a couple of Ninja Theory’s outside collaborators.

Cubic Motion’s facial capture middleware and 3Lateral’s scan data — which required actress Melina Juergens to travel to Novi Sad to have her face scanned in a variety of ways — serve as a digital foundation for accurately capturing any real-time performance. (Luckily, Juergens is Ninja Theory’s video editor, so her availability for shooting scenes isn’t an issue.)
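As a rough illustration of how scan data can anchor a live solve, here is a hedged sketch: assume the scans yield a basis of example expressions, and a least-squares fit maps each frame of tracked landmarks onto rig controls. The matrices and the solver are assumptions for illustration, not Cubic Motion's actual middleware.

```python
import numpy as np

# Hedged sketch: the basis matrix stands in for information derived
# from head scans; the solver is a plain least-squares fit.

rng = np.random.default_rng(0)

N_LANDMARKS = 60   # points tracked on the actor's face each frame
N_CONTROLS = 20    # rig controls derived from the scanned expressions

# Each column: landmark offsets for one scanned example expression.
expression_basis = rng.normal(size=(N_LANDMARKS * 3, N_CONTROLS))
neutral_pose = rng.normal(size=N_LANDMARKS * 3)

def solve_frame(tracked_landmarks):
    """Fit rig control weights to one frame of tracked landmarks,
    measured relative to the scanned neutral pose."""
    delta = tracked_landmarks - neutral_pose
    weights, *_ = np.linalg.lstsq(expression_basis, delta, rcond=None)
    return np.clip(weights, 0.0, 1.0)  # keep controls in a sane range

# Simulate one incoming frame and solve it, as a per-frame loop
# running at 30-60 fps during a live shoot would.
frame = neutral_pose + expression_basis @ rng.uniform(0.0, 1.0, N_CONTROLS)
print(solve_frame(frame))
```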


But Hellblade was born from a desire to make something completely independent on a triple-A scale, done as efficiently and cheaply as possible. Thus far the work speaks for itself, and, rather amazingly, it’s being built by a mere 16-person team.

Antoniades says it’s that comparatively compact production, with a tight focus on a single character, that makes it possible at all.

“We built a performance capture space that could do the kind of things we did [before], but we built it for probably like $20,000 to $30,000. Which is super, super cheap compared to the kind of setups we were working with,” he says. “Even though it feels like it’s ultra high-tech, ultra expensive, it’s not — this is just a more efficient way to shoot, in the spirit of everything else [we’re doing].”

To keep the studio from breaking the bank, some partners lent equipment and provided technical expertise, while Antoniades and the team made a conscious effort to keep equipment costs as low as possible. Using nine mo-cap cameras in a space about three by three meters, much smaller than a major studio’s twenty-by-twenty-meter setups, was a big help.


“There’s a lot of compromises we made,” Antoniades says. “Our rigging to mount the cameras we bought from IKEA. They’re wardrobe poles. The lighting we needed — they’re office lights we bought off Amazon. We just took the cheapest possible solution at every turn to be able to capture one character.”

The studio’s openness about development is what led Epic’s chief technology officer Kim Libreri to ask Antoniades if Ninja Theory wanted to do a live showcase with Senua’s actress at GDC in March. When that proved a success, the team worked with their partners to shoot and render an entire cutscene in seven minutes for a SIGGRAPH presentation in July, using Sequencer to transfer mo-capped data into Unreal Engine.

Unlike the more stripped-down GDC demo, the SIGGRAPH presentation (performed in front of an audience of 2,500 VFX industry professionals) showed Senua talking to a doppelganger, with Antoniades shooting both of Juergens’ mo-capped parts as they happened. After capturing the doppelganger, Ninja Theory had help from Epic, Cubic Motion, and 3Lateral on stage to make sure data was being captured properly and fed into Sequencer’s video-editing-style interface.

Finally, the doppelganger data was played back as Antoniades filmed Juergens a second time, now playing the real Senua, against her first performance — with the motion and focus of the camera itself captured at the same time. For the demonstration, Ninja Theory received SIGGRAPH’s best real-time graphics and interactivity award.
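A minimal sketch of that two-pass idea follows, with every name and data shape assumed for illustration (this is not Ninja Theory's or Epic's actual code): take one is stored as timestamped pose data, then played back on the same clock that records take two and the physical camera.

```python
# Hedged sketch of a two-pass shoot on a shared timeline; all names
# and data shapes here are assumptions for illustration.

FRAMES = 5
take_one = {f: f"doppelganger pose {f}" for f in range(FRAMES)}  # pass one, recorded

take_two = {}      # Senua's second performance, captured live
camera_track = {}  # the physical camera's motion and focus, captured too

for frame in range(FRAMES):
    # Playback and capture share one frame counter, so both
    # performances and the camera line up on a single timeline.
    shown_to_actor = take_one[frame]                    # pass one plays back
    take_two[frame] = f"senua pose {frame}"             # pass two records
    camera_track[frame] = {"pos": (0.1 * frame, 0.0, 1.6), "focus_m": 2.0}

# All three tracks now sit on the same timeline in the editor and can
# be re-cut together like ordinary footage.
print(len(take_one), len(take_two), len(camera_track))
```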

What makes the tech so head-turning is that not even the best studios in filmmaking currently produce visuals this close to the final image in real time. Instead, they use a process called pre-visualization, which works a bit like a real-time animated outline, using basic rendered geometry to plan the blocking and action of a scene. Going from rough pre-vis to final image usually takes a matter of months, and pre-vis itself lacks detail, particularly since it has no facial capture.


“If you could see a character live, imagine all the things that you could do with that,” Antoniades says. “[With SIGGRAPH] we thought, ‘we’ve got a chance here to actually do something that you can’t even do on film sets.’ Film studios with millions of dollars don’t have this technology.”

It’s a benefit to the overall speed of development as well.

“Because we have our own setup here, we can jump in and shoot scenes as we need them,” he says. “We’re doing it more organically — as we’re building levels, we can jump in and shoot the scenes for those levels, or if we need extra scenes, jump in and shoot them. It’s still in line with the narrative and the story, it’s just made the scene a little bit cooler, so we can pull tricks with the multiple versions of Senua that we otherwise wouldn’t have.”

While Antoniades says the real-time nature of the tech probably won’t have a noticeable effect on either the story or the gameplay (apart from maybe more experimental cutscenes), he is optimistic it can be applied to new types of experiences and entertainment in the near future, particularly in virtual reality.

“[Imagine] the characters turn and face you or walk with you or interact with you,” he says. “I think this will lead to a different type of entertainment that we don’t have a name for, we don’t even have the grammar like we do with film and video games.”

Epic's Sequencer for Unreal Engine 4 works much like video editing software, except the underlying data can be manipulated at will.


Something that could happen sooner is a Twitch-style broadcast where viewers could talk with Senua herself in real time — something fans could potentially see in the coming months. VR experiences, concerts, and other non-gaming applications are also possible.

“Being able to set up locations, actors, shoot things live, and then, because it’s all real time, you can make it all interactive on the spot, try out different scenarios,” Antoniades says. “There’s possibilities in the near future that just wouldn’t be possible even last year.”

With Hellblade just one of several projects, in and out of gaming proper, that Ninja Theory is working on, it will hopefully prove a good jumping-off point for seeing how things can evolve from here.

“I think it’s going to be good for gamers, because they’ll get more interesting projects,” he says. “And it’ll be good for developers as well. I think the industry’s changing for the better, as long as we all manage to hang in there and survive for just a little bit longer.”
