AI dubbing can make actors appear to speak any language

TrueSync can make De Niro and Hanks look like they’re speaking German and Japanese.

A new AI dubbing technology syncs actors’ mouths with recorded dialogue to make the experience of watching a movie in an unknown language less jarring.

The challenge: If you want to watch a movie in a language you don’t understand, you have two choices: you can either read subtitles, which can be distracting, or you can watch a dubbed version of the film.

During the dubbing process, voice actors who do speak your language record all of the film’s dialogue in a sound booth. The original dialogue is then replaced with that audio.

This allows you to listen to the dialogue, just like you normally would, but the lips of the actors will appear out of sync with the words you’re hearing.

That can be just as distracting, and it can even force changes to the movie itself: if the time it takes to say something in one language differs too much from the time it takes in another, the script might be altered to produce a better fit.

The idea: For director Scott Mann, the experience of watching a dubbed version of one of his own movies, “Heist,” was a wake-up call that something needed to change.

“I remember just being devastated,” he told Wired. “You make a small change in a word or a performance, it can have a large change on a character in the story beat, and in turn on the film.”

That inspired him to co-found Flawless, the U.K.-based company behind the new AI dubbing technology, TrueSync.

AI dubbing: TrueSync is essentially a very sophisticated form of copying and pasting.

The AI determines what an actor’s mouth would look like if they were making each sound in the translated dialogue. It then finds a point in the film where they were making that sound and extracts what it looks like.

Smoothly weave all those movements together, and it looks (more or less) like the actor is speaking the new language.

“It’s able to essentially take an ‘ooh’ sound from (Robert) De Niro 20 minutes earlier and place that in a different moment in the film,” Mann told Reuters. “It measures at the same time and blends it so that the performance is the same, but it’s a different mouth movement.”
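The "copy and paste" idea Mann describes can be sketched in a few lines of Python. This is purely illustrative: Flawless has not published TrueSync's internals, and every name below (the viseme bank, the frame lookup) is a hypothetical stand-in for what is actually a deep-learning system with sophisticated blending.

```python
# Illustrative sketch only: index an actor's mouth frames by the sound
# ("viseme") they were making, then reuse those frames for new dialogue.

def build_viseme_bank(frames, phoneme_timeline):
    """Map each phoneme to the mouth frames where the actor produced it."""
    bank = {}
    for phoneme, frame_idx in phoneme_timeline:
        bank.setdefault(phoneme, []).append(frames[frame_idx])
    return bank

def dub_mouth_track(translated_phonemes, bank):
    """For each sound in the translated dialogue, pull a stored mouth frame."""
    track = []
    for phoneme in translated_phonemes:
        candidates = bank.get(phoneme)
        # A real system would pick the best-matching frame and blend it
        # smoothly with its neighbors; here we just take the first one.
        track.append(candidates[0] if candidates else None)
    return track

# Toy example: four frames of footage, each tagged with the sound being made.
frames = ["frame0", "frame1", "frame2", "frame3"]
timeline = [("oo", 0), ("ah", 1), ("oo", 2), ("ee", 3)]
bank = build_viseme_bank(frames, timeline)

# New dialogue needs "ee", then "oo", then "ah" - reuse the matching frames.
print(dub_mouth_track(["ee", "oo", "ah"], bank))  # ['frame3', 'frame0', 'frame1']
```

The hard part, which this sketch omits entirely, is the "measures at the same time and blends it" step: making the borrowed mouth movement sit naturally on the face across lighting, head pose, and expression changes.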

Getting there: In demo videos, the match isn’t flawless (despite the company’s name), but the original actors’ mouths do match the new dialogue enough to be far less distracting than standard dubs.

There are still uncanny valley moments, where the deepfake tech falters, but Mann believes the AI dubbing will only get better.

“It’s going to be invisible pretty soon,” he told Wired. “People will be watching something and they won’t realize it was originally shot in French or whatever.”

Why it matters: China, the U.S., and Canada are the world’s biggest film markets, accounting for nearly 60% of ticket sales, and the vast majority of movies watched in those nations are recorded in Mandarin and English.

However, there are talented filmmakers across the globe creating movies in other languages, and smoother dubbing might inspire audiences in larger markets to give those films a chance, opening the door for new perspectives in the film industry.

“From a filmmaking point of view, you’re going to see the rise of a much more diverse range of stars,” Mann told Reuters.

