Hao Li:
I like vodka.
Miles O’Brien:
The ever-increasing speed of computers, along with the advancement of the artificial intelligence technique called machine learning, is making these composites harder and harder to detect with the naked eye.
Hao Li:
We all assume that there will be a point where there’s no way to tell the difference. I mean, for visual effects, I think you can get pretty close already. It’s just the question of how much effort you put into it.
But in terms of content that can be created by anyone, I think it’s getting very close to that point.
Miles O’Brien:
One technique is the face swap, which put Steve Buscemi’s face on Jennifer Lawrence’s body, Nicolas Cage onto a series of marquee stars in iconic roles, or Jimmy Kimmel’s mug on mine.
I have had to relearn very simple things.
But there is a deep, dark side as well. Indeed, the technology has been used to paste the faces of celebrities onto the bodies of porn stars.
Computer scientist Hany Farid is a professor at Dartmouth College:
Hany Farid:
I am worried about the weaponization and I’m worried about how it’s impacting us as a society. So, we are working as hard as possible to detect these things.
Jordan Peele:
Killmonger was right.
Miles O’Brien:
This video crystallized much of the deep concern. What seems to be President Barack Obama making a speech…
Jordan Peele:
You see, I would never say these things.
Miles O’Brien:
… is actually comedian and filmmaker Jordan Peele doing his excellent Obama impersonation synched with software created with artificial intelligence, or A.I.
Hany Farid:
The A.I. system synthesized the mouth of President Obama to be consistent with the audio stream, and it made it look like President Obama was saying things that he never said. That’s called a lip synch deepfake.
Miles O’Brien:
Just this week, the technique was used to put some pretty outrageous and comical words into the mouth of Facebook founder Mark Zuckerberg.
Man:
Spectre showed me that whoever controls the data controls the future.
Miles O’Brien:
It’s a potent technology that is ripening at a time of deep polarization and suspicion fueled by social media.
Rep. Nancy Pelosi, D-Calif.:
So it’s really sad. And here’s the thing.
Miles O’Brien:
Just last month, something much less sophisticated than a deepfake, a doctored video of House Speaker Nancy Pelosi that made her seem drunk, went viral.
Rep. Nancy Pelosi, D-Calif.:
We want to get this president the opportunity to do something historic.
Miles O’Brien:
Deepfakes ratchet up the risks.
Hany Farid:
The nightmare situation is that there’s a video of President Trump saying, “I have launched nuclear weapons against North Korea.” And somebody hacks his Twitter account, and that goes viral, and, in 30 seconds, we have global nuclear meltdown.
Do I think it’s likely? No. But it’s not a zero probability, and that should scare the bejesus out of you, right? Because the fact that that is not impossible is really worrisome.
Miles O’Brien:
Farid is most worried about deepfakes rearing their ugly head during the 2020 election. So he and his team are carefully learning the candidates’ patterns of speech and how they correlate with gestures as a way to spot deepfakes.
Hany Farid:
We do that, of course, by analyzing hundreds of hours of video of individuals.
We’re focused on building models for all of the major party candidates, so that we can upload a video to our system. We can analyze it by comparing it to previous interviews, and then ask, what is the probability that this is consistent with everything we have seen before?
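The approach Farid describes can be sketched very simply: measure some mannerism in the questioned clip, and ask how far it falls from the same measurement taken over the speaker's archived interviews. This is a toy illustration only; the measurement ("head-nod rate") and the numbers are hypothetical, and the real system models many correlated features at once.

```python
import statistics

def consistency_score(new_sample, history):
    """How consistent a measured mannerism is with a speaker's history.

    new_sample -- the measurement taken from the questioned video
    history    -- the same measurement from many prior interviews
    Returns a z-score: large values mean "unlike anything seen before."
    """
    mean = statistics.fmean(history)
    stdev = statistics.stdev(history)
    return abs(new_sample - mean) / stdev

# Hypothetical head-nod rates from archived interviews of one candidate.
archive = [3.1, 2.8, 3.3, 3.0, 2.9, 3.2, 3.1, 2.7]
print(consistency_score(3.0, archive) < 2)   # True: in character
print(consistency_score(6.5, archive) < 2)   # False: flags the clip
```

A single z-score is of course far cruder than comparing full speech-and-gesture patterns, but the question it answers is the same one Farid poses: is this consistent with everything seen before?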
Miles O’Brien:
Computer scientists have pushed this technology using generative adversarial networks, or GANs.
A GAN pits two artificial intelligence algorithms against each other. One strives to create realistic fake images, while the other grades the effort.
Hany Farid:
So, the synthesis engine says, I’m going to create a fake image, I give it to this A.I. system that says, this looks fake to me. So it goes back and changes it. And you do that a few billion times in rapid succession, and the computers are teaching each other how to make better fakes. And that’s what has democratized access.
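The adversarial loop Farid describes can be shown in miniature. In this toy sketch, "real data" is just numbers drawn around 4, the generator is a single shifted random number, and the discriminator is a one-parameter logistic classifier; everything here is a simplified stand-in for the deep networks a real GAN uses. Each round, the discriminator learns to tell real from fake, and the generator updates itself to fool the discriminator, so the fakes drift toward the real data.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Discriminator D(x) = sigmoid(w*x + b): probability that x is "real".
w, b = 0.0, 0.0
# Generator G(z) = mu + z: fakes are noise shifted by a learnable mu.
mu = 0.0
REAL_MEAN = 4.0
lr_d, lr_g, batch, steps = 0.05, 0.05, 64, 2000

for _ in range(steps):
    real = REAL_MEAN + rng.standard_normal(batch)
    fake = mu + rng.standard_normal(batch)

    # Discriminator step: push D(real) toward 1 and D(fake) toward 0.
    d_real, d_fake = sigmoid(w * real + b), sigmoid(w * fake + b)
    grad_w = np.mean(-(1 - d_real) * real) + np.mean(d_fake * fake)
    grad_b = np.mean(-(1 - d_real)) + np.mean(d_fake)
    w -= lr_d * grad_w
    b -= lr_d * grad_b

    # Generator step: change mu so the discriminator scores fakes as real.
    fake = mu + rng.standard_normal(batch)
    d_fake = sigmoid(w * fake + b)
    grad_mu = np.mean(-(1 - d_fake) * w)   # gradient of -log D(G(z))
    mu -= lr_g * grad_mu

print(round(mu, 2))   # mu drifts from 0 toward REAL_MEAN
```

The "few billion times" Farid mentions is the same loop at vastly larger scale, with images in place of numbers.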
Miles O’Brien:
And that’s why the Pentagon is interested in deepfakes.
Its research enterprise, the Defense Advanced Research Projects Agency, or DARPA, is exploring ways to defend against the threat of deepfakes.
Computer scientist Matt Turek runs DARPA’s media forensics, or MediFor, project.
Matt Turek:
So, there’s an opportunity here for us to essentially lose all trust in images and video.
Miles O’Brien:
Turek showed me some of the 70 counter-deepfake techniques DARPA is helping nurture.
Woman:
Necessary for one people to dissolve the political bands which have connected them with another.
Miles O’Brien:
This software is designed to characterize lip movement and compare it to the audio.
Matt Turek:
And so, when you see these red bars, that means that the sounds of the speaker are not actually consistent with the movement of the lips.
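The idea behind those red bars can be sketched as a simple consistency check between two time series. This toy version assumes you already have a per-frame mouth-opening measurement and a per-frame audio level (both hypothetical inputs here; real forensic tools use far richer audio and visual features), and it flags windows where the two stop tracking each other.

```python
import numpy as np

def flag_mismatches(mouth_open, audio_level, win=10, min_corr=0.3):
    """Return start frames of windows where lips and audio disagree.

    mouth_open  -- per-frame mouth-opening measurement (e.g. lip distance)
    audio_level -- per-frame loudness of the speech track
    Windows whose correlation falls below min_corr get a "red bar."
    """
    flags = []
    for start in range(0, len(mouth_open) - win + 1, win):
        m = mouth_open[start:start + win]
        a = audio_level[start:start + win]
        if np.corrcoef(m, a)[0, 1] < min_corr:
            flags.append(start)
    return flags

# Toy check: first half in sync, second half "dubbed" (audio unrelated).
rng = np.random.default_rng(1)
lips = np.abs(np.sin(np.linspace(0, 12, 120)))
audio = lips.copy()
audio[60:] = rng.random(60)            # replace the synced speech
print(flag_mismatches(lips, audio))    # flagged windows fall at frame 60+
```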
Miles O’Brien:
Take a look at this video, supposedly two people sitting together. But software that determines the lighting angle on faces concludes it is a composite.
Matt Turek:
So, it estimates a 3-D model for the face. Along with that 3-D model, it estimates the reflectance properties of the face, and also the lighting angles.
And so here we’re primarily using the lighting angles to see whether those are consistent or not.
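Once the 3-D fitting Turek describes has produced a lighting direction for each face, the consistency check itself is plain geometry: compute the angle between the two estimated directions and flag the frame if they disagree. The direction vectors below are hypothetical stand-ins for what a face-model fit might report.

```python
import math

def lighting_angle_deg(u, v):
    """Angle, in degrees, between two estimated lighting directions."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return math.degrees(math.acos(dot / (norm_u * norm_v)))

# Hypothetical lighting directions fit for the two people in the shot.
# Faces lit by the same scene should yield a small angle between them.
person_a = (0.2, 0.8, 0.6)
person_b = (-0.7, 0.1, 0.7)
angle = lighting_angle_deg(person_a, person_b)
print(angle > 30)   # True: large disagreement suggests a composite
```

Real systems also compare reflectance and shadows, but the lighting-angle test alone is often enough to expose a pasted-together shot.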
Miles O’Brien:
In this example, video apparently gathered by a security camera shows only one car. This artificial intelligence algorithm is designed to predict how things should move.
Matt Turek:
What that is triggering off of is discontinuities in the motion. And so that gives us a signal to look at an image or a video and say, well, perhaps frames were removed here.
Miles O’Brien:
And it flags the video as altered: another vehicle was edited out.
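The principle behind that detector can be sketched with a tracked position: objects move smoothly, so a frame-to-frame jump much larger than the typical step suggests frames were removed. This toy version assumes a per-frame x-coordinate for the tracked car (a hypothetical input; real tools predict full motion fields).

```python
def find_motion_breaks(positions, tolerance=3.0):
    """Flag frames where tracked motion jumps, hinting frames were cut.

    positions -- tracked x-coordinate of the object in each frame.
    Any step much larger than the typical (median) step is reported.
    """
    steps = [b - a for a, b in zip(positions, positions[1:])]
    typical = sorted(abs(s) for s in steps)[len(steps) // 2]
    return [i + 1 for i, s in enumerate(steps)
            if abs(s) > tolerance * max(typical, 1e-9)]

# A car moving ~2 px per frame, with several frames deleted mid-clip.
track = [0, 2, 4, 6, 8, 20, 22, 24, 26]
print(find_motion_breaks(track))   # [5]: the jump from 8 to 20 px
```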
Matt Turek:
There’s a cat-and-mouse game. The more aspects that you can use to debunk an image or video, the more burden that you put on the manipulator.
Miles O’Brien:
But none of these ideas will work without the cooperation of the big social media platforms, like YouTube and Facebook, which would need to deploy the software and delete the fakes, something Facebook refused to do when the Pelosi video emerged.
Hany Farid:
And the platforms have been, for the most part, very cavalier about how they deal with this type of illegal content, harmful content, misinformation, fake news, election tampering, non-consensual pornography, and the list goes on and on, because it gets eyes on the platform, and that’s good for business.
Miles O’Brien:
A fake video amplified in an echo chamber can go an awfully long way before the facts even enter the picture.
For the “PBS NewsHour,” I’m Miles O’Brien in Los Angeles.