That is not a grainy picture of Katy Perry — the real one is apparently into cryptocurrencies now — doing porn. It’s a still from an AI-generated clip with Perry’s face edited over the body of another woman shooting porn. But if I hadn’t told you that, and somebody said, “Oh, yeah, that’s Katy Perry,” you might believe it. Which is frightening, because this technology is only going to get better.
Motherboard has been chronicling the AI-generated porn world since last December, when it reported on a Reddit user, “deepfakes,” who spends a lot of time creating these videos using machine-learning algorithms. Since then, the practice has taken off. “Deepfake” has become a noun in certain circles, Motherboard reports. There’s even an app now, FakeApp, which allows users who aren’t as computer savvy to create similar videos. “Eventually, I want to improve it to the point where prospective users can simply select a video on their computer, download a neural network correlated to a certain face from a publicly available library, and swap the video with a different face with the press of one button,” the app’s creator, “deepfakeapp” — note that’s different from “deepfake” — told Motherboard in a recent interview.
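The workflow described above — learn a model of one face, then swap it onto another person's video — rests on a shared-encoder autoencoder trick. The toy sketch below shows only the swap step of that idea, with made-up shapes and untrained random weights standing in for a real network; it is an illustration of the concept, not the actual code behind deepfakes or FakeApp.

```python
# Toy sketch of the core face-swap idea: one encoder shared by both
# identities, plus a separate decoder per identity. A swap encodes a
# frame of identity A and decodes it with identity B's decoder.
# All dimensions and weights here are hypothetical toy values.
import numpy as np

rng = np.random.default_rng(0)

DIM, LATENT = 64, 8  # flattened "frame" size, latent code size

# Shared encoder used for both identities.
encoder = rng.standard_normal((DIM, LATENT)) * 0.1

# One decoder per identity; in a real system each is trained to
# reconstruct that identity's face from the shared latent space.
decoder_a = rng.standard_normal((LATENT, DIM)) * 0.1
decoder_b = rng.standard_normal((LATENT, DIM)) * 0.1

def encode(frame):
    return frame @ encoder

def swap_face(frame_a):
    """Encode a frame of identity A, decode with B's decoder."""
    return encode(frame_a) @ decoder_b

frame = rng.standard_normal(DIM)  # stand-in for one video frame
swapped = swap_face(frame)
print(swapped.shape)
```

Run per-frame over a whole clip, this is the "press of one button" swap the app's creator describes; the hard part in practice is training the encoder and decoders on many images of each face.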
And as if the thought of having your face convincingly shopped onto a naked body — to the point where it could be hard to prove it isn’t really you — weren’t scary enough, the implications of AI-generated video go way beyond porn. If somebody with a little free time can turn Katy Perry into a porn star, there’s nothing to stop that same somebody from making Donald Trump — or any other world leader or key political figure — appear to say or do whatever they want. As if Fake News weren’t a big enough problem already.