FBI warns about deepfake porn scams

The FBI said it has received a growing number of complaints about criminals who use deepfake tools to create fake pornographic videos of people and then demand payment from them.
Image: The seal of the Federal Bureau of Investigation hangs on a wall at the FBI headquarters in Washington on June 14, 2018. Al Drago / Bloomberg via Getty Images file

Scammers are using artificial intelligence to create fake pornographic videos of victims and demanding payment to keep the videos from being disseminated, the FBI warned Monday.

Deepfake technology, which can create convincing artificial media, like videos of one person’s face on another’s body, has rapidly advanced and become widespread in recent years.

In a public service announcement, the FBI said it had received an increasing number of complaints about criminals who use deepfake tools to create fake pornographic videos from images and clips commonly found on victims’ social media accounts. The scammers sometimes “circulate them on social media, public forums, or pornographic websites,” the agency said.

Criminals will often insist that victims, some of them children, pay them with money, gift cards or even authentic sexual imagery, the FBI said, and will threaten to publish the deepfakes on the open internet or send them to friends and family if they refuse.

While manipulated pornographic images have circulated for years, the rapid advance of deepfake technology has led to an explosion in its use to alter people’s images without their consent. An NBC News investigation in March found that nonconsensual deepfake porn was easily accessible through online search and chat platforms.

Eva Casey Velasquez, the president of the Identity Theft Resource Center, a nonprofit that helps scam victims, said that perpetrators sometimes try to shock victims with particularly intimate material.

“If people are actually seeing videos that appear to be themselves, and they’re not aware that it’s a deepfake — thinking ‘I wasn’t there, what is happening, am I losing my mind, that’s not me, I didn’t do that’ — that would be very effective,” Velasquez said. “You would just want that to go away.”

The FBI recommends being cautious when accepting friend requests from strangers and warns that complying with a scammer does not guarantee the deepfakes will not be shared. The National Center for Missing & Exploited Children provides a free service, called Take It Down, to help stop the online sharing of explicit imagery of children under 18.