
Fake video could break your trust in the news entirely

Deep Video Portraits are already beating out deepfakes for creepy cultural dominance.
Jordan Peele creates a fake video of Barack Obama to show how scarily good this technology is. (Credit: Buzzfeed)

You’re looking at a screen right now. Imagine, if you will, that a FaceTime or Skype call from a loved one pops up in the right-hand corner. You see their familiar face, and instantly your heart races just a little because you know it’s them.


But they don’t look happy. They look sad. They’re in trouble, they say, and they want you to wire them some money. They need it right now, they say, so you grab your credit card and fire off a money transfer through Western Union. Then they smile, thank you, and the call ends. A little later, you get a call from your loved one asking what you want for dinner. You ask them about the trouble they were in. “What trouble?” they say. And then it hits you: you’ve been duped by what’s being called a deepfake, a video likeness designed to mimic them exactly.

They’re becoming more and more common, and the technology is improving rapidly year after year. In 2017, a deepfake video of Barack Obama made waves when it was revealed to have been made by A.I. In April of this year, actor Jordan Peele and Buzzfeed created an even more realistic deepfake video in which Obama (as voiced by Peele) calls President Trump a ‘dipshit’. The whole thing was framed as a warning against fake news.

But that’s perhaps not the scariest part. The videos are getting better and better. 

Deep Video Portraits, a system developed by researchers at Stanford, the Technical University of Munich, the University of Bath, and elsewhere, needs just a single minute-long video clip (or about 2,000 photographs) of a subject to create an almost indistinguishable fake video. It wouldn’t be hard at all to pair a couple of voice actors with Deep Video Portraits technology and create a video of Donald Trump and/or Vladimir Putin arguing for the mass extermination of a race of people. Deep Video Portraits are the much, much scarier older brother of deepfakes: harder to distinguish and easier to make. Even Michael Zollhöfer, the visiting Stanford professor who helped create Deep Video Portraits, argues for better digital forensics once this technology becomes more mainstream:

“For example, the combination of photo-real synthesis of facial imagery with a voice impersonator or a voice synthesis system, would enable the generation of made-up video content that could potentially be used to defame people or to spread so-called ‘fake-news’. Currently, the modified videos still exhibit many artifacts, which makes most forgeries easy to spot. It is hard to predict at what point in time such ‘fake’ videos will be indistinguishable from real content for our human eyes.

“The recently presented systems demonstrate the need for sophisticated fraud detection and watermarking algorithms. We believe that the field of digital forensics will receive a lot of attention in the future. We believe that more funding for research projects that aim at forgery detection is a first good step to tackle these challenges. In my personal opinion, most important is that the general public has to be aware of the capabilities of modern technology for video generation and editing. This will enable them to think more critically about the video content they consume every day, especially if there is no proof of origin.”

So, as you can see, even the people who made the technology are aware of its dangers. The full paper is here, should you want to read the whole thing.

And I hate to point it out, or even give it credence, but deepfakes are already rampant in pornography. Whole websites are dedicated to fake celebrity pornography (all easily googleable, but absolutely 100% NSFW), and the results are uncannily accurate. It’s easy to imagine the same thing being done to anyone’s spouse and used for blackmail. Not that I’m giving anyone ideas that haven’t been actualized already; even Pornhub has blocked deepfakes.

What does this mean for you? Perhaps invest in a digital video forensics lab. And, for what it’s worth, perhaps trust more mainstream news sources, even if it means reaching across the aisle and dabbling in news from different bubbles. Live in a liberal bubble? Maybe check out the Daily Caller once in a while. Love Fox News? Watch CNN. Somewhere there’s a middle ground that everyone is fighting to control. And, it might sound crazy, but fringe elements have far less to lose and more to gain from these fakes. 
