Researchers: More Accessible Deepfake Generators Fuel Rapid Increase in AI Explicit Images
It’s not surprising that artificial intelligence (AI) tools were swiftly adapted to create pornographic material. What is shocking is just how broadly this technology has been perverted and how accessible it has become.
Researchers from the Oxford Internet Institute (OII) at the University of Oxford uncovered a dramatic rise in easily accessible AI tools designed to create deepfake images of identifiable people, including celebrities. On just one online platform, nearly 35,000 tools were available for public download. Since 2022, these deepfake generators have been downloaded almost 15 million times, fueling a rapid increase in AI-generated non-consensual intimate imagery (NCII). Each downloaded variant could generate limitless deepfakes.
Detailed analysis found that 96 percent of the deepfake models targeted identifiable women, from globally recognized celebrities to social media users with smaller followings. Many models targeted individuals from China, Korea, Japan, the UK, and the United States, and they carried tags like “porn,” “sexy,” or “nude” to signal intent to generate NCII.
Deepfakes are also becoming easier to create: the researchers found that generating one can require as few as 20 images of the target individual, a consumer-grade computer, and just 15 minutes of processing time.
“There is an urgent need for more robust technical safeguards, clearer and more proactively enforced platform policies, and new regulatory approaches to address the creation and distribution of these harmful AI models,” said Will Hawkins, lead author of the study.
Governments have been trying to curtail the use of AI to make sexually explicit deepfakes. In April 2023, an amendment to the Online Safety Act made the sharing of sexually explicit deepfakes a criminal offense in England and Wales. The UK government is also moving to create a new offense covering the generation of non-consensual sexually explicit deepfakes and the taking of intimate images without consent.
“While it is already an offence to share—or threaten to share—an intimate image without consent, it is only an offence to take an image without consent in certain circumstances, such as upskirting,” according to a press release. “Under the new offences, anyone who takes an intimate image without consent faces up to two years’ custody. Those who install equipment so that they, or someone else, can take intimate images without consent also face up to two years behind bars.”
In April 2025, the U.S. Congress overwhelmingly passed bipartisan legislation to enact stricter penalties for the distribution of NCII and revenge porn. Known as the Take It Down Act, the bill addresses both real and AI-generated imagery, and it makes it illegal to “knowingly publish” or threaten to publish intimate images without a person’s consent. It also requires websites and social media companies to remove such material within 48 hours of notice from a victim.
The Take It Down Act awaits a signature from U.S. President Donald Trump, who has signaled support for the bill and is expected to sign it.
The OII research will be formally published in June in the peer-reviewed proceedings of the ACM Conference on Fairness, Accountability, and Transparency (FAccT).