
Illustration by Security Management

Deepfake Bot Creates Pornographic Images of Thousands of Women

A new report from visual threat intelligence firm Sensity found that an artificial intelligence-enabled bot was used to create deepfake nude images of more than 100,000 women. The bot, which used photos shared online and on social media, digitally removed the clothes of private individuals in the photographs, including individuals who appear to be minors.

Individuals could message a photo to the bot, run by a person known as "P," on the messaging app Telegram; after receiving the photo, the bot would digitally alter it to make the individuals in it appear nude, according to the BBC.

As part of this service, the bot sends a gallery of new images every day via an associated channel on Telegram, which has roughly 25,000 members, according to Wired. The article also noted that while some of the images are “glitchy…many could pass for genuine.”

Sensity’s report found that the bot allows “users to photo-realistically ‘strip naked’ clothed images of women,” and that it works only on images of women. Telegram’s free and easy-to-navigate user interface further lowered the barrier to abuse, demanding little to no technical savvy to create the images.

At least 104,852 women had been the subject of a “stripped” image that was publicly shared by the end of July 2020, and the report said that between May and July the number of these kinds of images increased by 198 percent.

Given that all someone needs to generate a stripped deepfake through this service is a photo of the person, Sensity also found that 70 percent of the targets were “private individuals” whose images were lifted from social media accounts or private material.

Other key findings of the report included that 70 percent of the bot’s users were from Russia and former USSR countries—with “significant advertising” for the service found on Russian social media site VK—and that the broader threat from the images include the potential for attacks using public shaming or extortion.

The report’s findings, including details that were not publicly disclosed for privacy reasons, were shared with Telegram, VK, and law enforcement organizations.

U.S. President Donald Trump signed into law the 2020 National Defense Authorization Act in December 2019, which was the first U.S. federal law concerning deepfakes. The law requires the director of national intelligence to create an annual report on the potential national security implications of deepfakes and their ability to be used by foreign governments.

At the U.S. state level, Virginia, Texas, and California had enacted legislation by October 2019 criminalizing deepfakes that aim to influence voters or damage a candidate in an election. Virginia was one of the first states to criminalize deepfakes used in content like revenge porn; as of the beginning of 2020, only California had explicitly enabled victims to seek damages from the person who created a deepfake used in pornographic images or videos, according to Government Technology.

While U.S. state laws have begun laying the groundwork for how to address deepfakes, criminalizing the use of these images does not prevent deepfakes from being created or distributed. First Amendment activists have also noted the potential for such legislation to censor or even erode the right to free speech.

Nina Schick, author of books on deepfakes, told the BBC that deepfake software is evolving and improving so quickly that legal systems are outdated in addressing and regulating such privacy issues and are left to play “catch-up.”

The United Kingdom’s law concerning deepfaked nudes was criticized for its limited scope in a July 2019 report from professors of Durham University and the University of Kent. The report found that the current law is “failing victim-survivors” of image-based sexual abuse. The authors described the law as “inconsistent, outdated, and confusing….Current law requires a focus on the perpetrator’s motivation, rather than on the fact of non-consent.”
