A Google computer scientist says his new lip-reading technology has terrifying implications
In an era of fake news, we can still rely on our own eyes to spot the truth, right? Wrong.
Source: Hilary Brueck
Computer scientist Supasorn Suwajanakorn, now a research resident at Google Brain, has created a new kind of lip-reading technology that can make a fake video of just about anyone that looks and sounds almost exactly like them.
All the programme needs is a few photos and videos to learn from. Suwajanakorn demonstrated the tool onstage in Vancouver at the 2018 TED Conference on Wednesday.
He said the new system, which he created as a PhD thesis project at the University of Washington, uses a neural network to mimic a person's mouth and teeth movements from video footage. In essence, the algorithm teaches itself to imitate exactly how a person talks by watching them over and over again.
Suwajanakorn has already successfully created fake videos of celebrities like Tom Hanks and former president Barack Obama using only images and videos that are readily available online.
He said it doesn’t really matter what kinds of facial expressions a person makes or which words they say. All that matters is that the system has enough data to pick up on a speaker’s mannerisms by studying the subject’s teeth, lip movements, and jaw shape.
From there, the possibilities for fake videos become endless.
In fact, Suwajanakorn’s lip-reading technology — which he created with two professors while working in a graphics vision lab at UW — is so convincing that Google hired him to work on its vision and graphics systems.
Suwajanakorn said the technique still has a long way to go before it will be able to “fully model individual people” from head to toe. But he’s already concerned about what the technique’s consequences might be for fake news.
“We don’t want it to be in the wrong hands,” Suwajanakorn told the crowd at TED. “So we have to be very careful about it.”
There are signs that some similar technology may already be in the “wrong hands,” though. Recently, people have started making fake sex videos online by swapping out the faces of porn stars for celebrities like Taylor Swift and Gal Gadot, as Vice News reported in January.
The face-imitating technology isn't inherently bad, of course. Suwajanakorn said this kind of video-modelling could be a powerful way to do positive, educational things, like telling historical stories in vivid ways. As an example, he showed the audience a video of a Holocaust survivor telling his own story.