AI dubbing can make actors appear to speak any language

TrueSync can make De Niro and Hanks look like they’re speaking German and Japanese.

A new AI dubbing technology syncs actors’ mouths with recorded dialogue to make the experience of watching a movie in an unknown language less jarring.

The challenge: If you want to watch a movie in a language you don’t understand, you have two options: read subtitles, which can be distracting, or watch a dubbed version of the film.

During the dubbing process, voice actors who do speak your language record all of the film’s dialogue in a sound booth. The original dialogue is then replaced with that audio.

This allows you to listen to the dialogue, just like you normally would, but the lips of the actors will appear out of sync with the words you’re hearing.

That can be just as distracting as subtitles, and it can even force changes to the movie itself: if a line takes much longer to say in one language than in another, for example, the script might be altered to produce a better fit.

The idea: For director Scott Mann, the experience of watching a dubbed version of one of his own movies, “Heist,” was a wake-up call that something needed to change.

“I remember just being devastated,” he told Wired. “You make a small change in a word or a performance, it can have a large change on a character in the story beat, and in turn on the film.”

That inspired him to co-found Flawless, the U.K.-based company behind the new AI dubbing technology, TrueSync.

AI dubbing: TrueSync is essentially a very sophisticated form of copying and pasting.

The AI determines what an actor’s mouth would look like if they were making each sound in the translated dialogue. It then finds a point in the film where they were making that sound and extracts what it looks like.

Smoothly weave all those movements together, and it looks (more or less) like the actor is speaking the new language.

“It’s able to essentially take an ‘ooh’ sound from (Robert) De Niro 20 minutes earlier and place that in a different moment in the film,” Mann told Reuters. “It measures at the same time and blends it so that the performance is the same, but it’s a different mouth movement.”
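Flawless hasn’t published how TrueSync works under the hood, so the sketch below is purely illustrative: every name and data structure in it is hypothetical. It shows the general idea Mann describes, though: index the mouth shapes (“visemes”) an actor makes across the original footage, then reuse and blend them to match the translated dialogue.

```python
# Hypothetical sketch of viseme reuse, not Flawless's actual method.
from dataclasses import dataclass


@dataclass
class MouthFrame:
    timestamp: float    # seconds into the original film
    viseme: str         # mouth-shape class, e.g. "OO", "AA", "MM"
    image_patch: bytes  # cropped pixels of the actor's mouth (placeholder)


def build_viseme_index(frames: list[MouthFrame]) -> dict[str, list[MouthFrame]]:
    """Group every frame of the original performance by mouth shape."""
    index: dict[str, list[MouthFrame]] = {}
    for frame in frames:
        index.setdefault(frame.viseme, []).append(frame)
    return index


def retarget_dialogue(new_visemes: list[str],
                      index: dict[str, list[MouthFrame]]) -> list[MouthFrame]:
    """For each mouth shape the translated line requires, borrow a frame where
    the actor already made that shape (e.g. an 'OO' from 20 minutes earlier).
    A real system would also blend neighboring frames so the motion and the
    rest of the performance stay continuous."""
    output = []
    for viseme in new_visemes:
        candidates = index.get(viseme)
        if candidates:
            output.append(candidates[0])  # naive pick; real blending is far subtler
    return output
```

The hard part, which this toy version skips entirely, is the blending Mann mentions: making the borrowed mouth movements match the lighting, head pose, and emotional register of the surrounding shot so the performance still reads as the original one.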

Getting there: In demo videos, the match isn’t flawless (despite the company’s name), but the original actors’ mouths do match the new dialogue enough to be far less distracting than standard dubs.

There are still uncanny valley moments, where the deepfake tech falters, but Mann believes the AI dubbing will only get better.

“It’s going to be invisible pretty soon,” he told Wired. “People will be watching something and they won’t realize it was originally shot in French or whatever.”


Why it matters: China, the U.S., and Canada are the world’s biggest film markets, accounting for nearly 60% of ticket sales, and the vast majority of movies watched in those nations are recorded in Mandarin or English.

However, there are talented filmmakers across the globe creating movies in other languages, and smoother dubbing might inspire audiences in larger markets to give those films a chance, opening the door for new perspectives in the film industry.

“From a filmmaking point of view, you’re going to see the rise of a much more diverse range of stars,” Mann told Reuters.

