In sync

Movie dubbing sucks. One filmmaker is using AI to fix that.

Director Scott Mann, cofounder of Flawless, says the company’s tech “creates a fully three-dimensional performance.”

During the filming of the 2015 action thriller Heist, star Robert De Niro called director Scott Mann one evening to ask which cufflinks he thought his character should wear.

“He’d read a scene and analyzed it right down to the minute details,” Mann recalls. “There’s a level of consideration that goes into every minor detail on screen.”

This attention to the small things, Mann says, proved all for naught when the movie was released in non-English-speaking territories. The culprit: bad dubbing. “The dubs effectively removed the way [De Niro] made character decisions, by changing dialogue to fit his facial movements, instead of adapting the script and staying true to his performance,” the British director says. “It became apparent that the process doesn’t work. At best, it’s a compromise.”

So Mann set out to find a solution. After three years working together, he and cofounder Nick Lynes, who has a background in software development, are today formally unveiling their company Flawless, which employs AI to dub films into any language. The pair believe that the London-based “neural net film lab” will create new opportunities for filmmakers and studios to attract audiences at home and abroad.

Flawless showreel trailer

This isn’t the first attempt to solve the dubbing problem with artificial intelligence. Last year, Amazon developed AI to automatically dub movies into other languages, though the tech produces a voice that doesn’t sound remotely human. The Flawless system is very different: instead of leaving dubbing to post-production, the audio can be recorded earlier, because scripts no longer need to be rewritten so that voice actors match the mouth movements in the footage.

Flawless’ tech was inspired by a collaboration with the Max Planck Institute for Informatics in Germany, whose research on digitally recreating and editing faces impressed Mann. The company uses a neural network to study millions of visual data points within the film rushes — the unedited footage from a day’s filming — which are then entered into a system that can recreate an actor’s performance in another language.

“We designed the system so that it doesn’t interrupt the filmmaking process,” Mann explains. “The extraction of facial data” — a time-consuming computational process — “runs parallel with the production itself.” The technology digitally lifts actors’ faces from the footage, converting each into a 3D model, according to Lynes. “This creates millions of 3D models, which the AI uses as reference points,” he says.

“And then, using an existing foreign-language recording of the dialogue, it studies the actor and generates a new 3D model per frame,” he adds. Finally, the imagery is converted back to 2D. Digital effects artists can then manually fix anything that seems off.
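The pipeline Lynes describes — extract a reference 3D face model per frame, generate new models driven by the foreign-language audio, then render back to 2D for artists to touch up — can be sketched in outline. This is purely illustrative: the function names, data structures, and the single “mouth openness” parameter are assumptions for the sketch, not Flawless’ actual system.

```python
from dataclasses import dataclass

@dataclass
class FaceModel3D:
    """Toy stand-in for a per-frame 3D face reconstruction."""
    frame_index: int
    mouth_openness: float  # 0.0 (closed) to 1.0 (fully open)

def extract_face_models(num_frames: int) -> list[FaceModel3D]:
    """Stage 1: build a reference 3D model for each frame of the rushes.
    (Here: a fixed neutral pose stands in for the learned reconstruction.)"""
    return [FaceModel3D(i, 0.2) for i in range(num_frames)]

def retarget_to_audio(models: list[FaceModel3D],
                      phoneme_openness: list[float]) -> list[FaceModel3D]:
    """Stage 2: generate a new 3D model per frame, driven by the
    foreign-language dialogue recording (here reduced to one value per frame)."""
    return [FaceModel3D(m.frame_index, o)
            for m, o in zip(models, phoneme_openness)]

def render_to_2d(models: list[FaceModel3D]) -> list[str]:
    """Stage 3: project each 3D model back into 2D frames, which effects
    artists can then fix by hand. (Here: a text summary per frame.)"""
    return [f"frame {m.frame_index}: mouth={m.mouth_openness:.1f}"
            for m in models]

reference = extract_face_models(num_frames=3)
dubbed = retarget_to_audio(reference, phoneme_openness=[0.1, 0.8, 0.4])
frames = render_to_2d(dubbed)
```

The point of the structure is that the expensive stage (extraction) depends only on the original footage, so — as Mann notes — it can run in parallel with production, long before any foreign-language audio exists.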

Whatever you do, don’t call this deepfakery, says Mann. “Deepfakes use two-dimensional face-replacement to hijack people’s identity,” he says, “which is the opposite of what we do, which is responsibly utilizing neural networks to enhance a performance. We also take into account the geometry and geography of the image, which is the biggest difference — this creates a fully three-dimensional performance.”

Flawless has tested the technique on clips from classic movies, including Forrest Gump and A Few Good Men, and screened the results extensively for non-English-speaking audiences to reduce the “uncanny valley” effect of its earliest attempts. These tests proved useful: Mann and Lynes say native Spanish speakers noticed dialogue-syncing problems in Flawless’ Spanish dub of Forrest Gump that the pair had initially missed.

Forrest Gump en español

Over the past year, the pair have been showing off their tech around Hollywood, speaking to actors and producers to gauge their thoughts. “I won’t name names, but I’ve shown some of our clips to the actors involved, and they’re coming at this from a similar place to me,” Mann says. “They’ve seen their performances suffer through bad dubbing before, and the most common response I’ve got when I’ve sent these trial clips to them is, ‘Holy shit! Wow!’ It’s strange for somebody to see that they can speak Japanese all of a sudden.”

Mann says that Flawless is in talks with film studios and already working with a streamer, a deal that they hope to officially announce soon. The company expects the first film utilizing this technology to be released in the next 12 months. As far as cost goes, he says, “Our research suggests this will differ on a case-by-case basis. But when Hollywood is likely to pay $60 million to remake Another Round” — the 2021 Academy Award winner for Best International Feature Film — “in English, I can safely say this would cost nearly $60 million less.”


Although Mann prefers subtitles over bad dubbing, he’s still not a fan of the former. “I’m always thinking about how we frame a shot, and that gets missed if people are busy reading the subtitles at the bottom of the screen. That’s not watching the film as it was intended to be watched. This is the solution to ensure audiences around the world get to experience a film as a director and editor cut it — how it was written, how it was performed.”

Mann anticipates some “natural nervousness” from audiences, especially subtitle lovers, who, he points out, are a small portion of the overall filmgoing public. (Of course, there are those who use them out of necessity.) “There will be people who will refuse to watch our dubs and say subtitles are the purest form, but this helps share great international stories with those who may have never otherwise seen them,” Mann says. “We’ll soon reach the level where you won’t realize we’ve done it, and the debate will solve itself.”