New iOS app for Unreal Engine gives avatars the ability to mimic your facial expressions

It's becoming a lot easier to create facial expressions for in-game characters.

Epic Games

Epic Games, the company behind Unreal Engine, has released a new iOS app that lets developers record facial expressions with an iPhone and map them onto characters in real time. The Live Link Face app is supposed to make it a lot easier to animate avatars in the engine that powers some of the most popular games out there, from Fortnite to Borderlands.

Thanks, Apple — If you've ever used Apple's Memoji, then you've seen how the iPhone is already capable of mapping facial expressions onto cartoon avatars. The iPhone's TrueDepth camera is capable of recognizing and distinguishing between more than 50 specific facial muscle movements. Apps like Snapchat and Instagram have taken advantage of the tech to create funky filters that alter a person's face. Live Link Face builds on that by making it easy for developers to stream this complex facial data to their computers and directly onto characters in Unreal Engine.
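For context, apps get that TrueDepth expression data through Apple's ARKit framework, which reports each tracked expression as a "blend shape" coefficient between 0.0 (neutral) and 1.0 (fully expressed). A minimal sketch of reading a few of those coefficients in a face-tracking session might look like this; the class and its wiring are illustrative, not code from Live Link Face:

```swift
import ARKit

// Minimal face-tracking session that reads a few blend-shape
// coefficients each time the face anchor updates. Requires a
// device with a TrueDepth camera; illustrative only, not code
// from Epic's Live Link Face app.
final class FaceCaptureDelegate: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Face tracking is only available on TrueDepth-equipped devices.
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let face as ARFaceAnchor in anchors {
            // Each coefficient ranges from 0.0 (neutral) to 1.0 (fully expressed).
            let jawOpen = face.blendShapes[.jawOpen]?.floatValue ?? 0
            let smileLeft = face.blendShapes[.mouthSmileLeft]?.floatValue ?? 0
            print("jawOpen: \(jawOpen), mouthSmileLeft: \(smileLeft)")
        }
    }
}
```

An app like Live Link Face would serialize per-frame values like these and stream them over the network into Unreal Engine, where they drive the corresponding controls on a character rig.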

There are a bunch of advanced tools too, like synchronization, so multiple iPhones can start recording at the same time. Timestamps will help developers line up their audio and other recordings for complex productions. The app can also be used wirelessly, so developers can capture expressions sitting at their computer or with the phone mounted to their heads if they're also recording corresponding body movements. Facial animations can be adjusted after they've been imported into Unreal Engine.

Millions of games are powered by Unreal Engine, and a recent demo of Unreal Engine 5, due in 2021, showed the kind of incredible performance to expect from the next generation of games. Live Link Face could make it a lot easier for developers to do their work, especially those without significant resources.

Live Link Face is available in the App Store now, and Epic has documentation on its site explaining how it can be used within a full-scale production shoot.