When Open-Source Deepfake Tools Fall into the Wrong Hands
“When we look at intimate image abuse, the vast majority of tools and weaponized use have come from the open-source space,” says Ajder. But they often start with well-meaning developers, he says. “Someone creates something they think is interesting or cool and someone with bad intentions recognizes its malicious potential and weaponizes it.”
Malicious Potential
Some tools, like the repository GitHub disabled in August, have purpose-built communities around them for explicit uses. The model was positioned as a tool for deepfake porn, claims Ajder, becoming a “funnel” for abuse that predominantly targets women.
Deepfake Porn and Abuse
Other videos uploaded to the porn-streaming site by an account crediting AI models downloaded from GitHub featured the faces of popular deepfake targets, celebrities Emma Watson, Taylor Swift, and Anya Taylor-Joy, as well as other less famous but very much real women, superimposed into sexual situations.
The creators freely described the tools they used, including two that GitHub has scrubbed but whose code survives in other repositories.
The Difficulty in Policing Open-Source Deepfake Software
Perpetrators on the prowl for deepfakes congregate in many places online, from covert community forums on Discord to, in plain sight, Reddit, complicating efforts to prevent abuse. One Redditor offered their services using the archived repository’s software on September 29. “Could someone do my cousin,” another asked.
Torrents of the main repository banned by GitHub in August are also available in other corners of the web, showing how difficult it is to police open-source deepfake software across the board. Other deepfake porn tools, such as the app DeepNude, have been similarly taken down before new versions popped up.
The Challenges in Tracking Down Deepfake Models
“There’s so many models, so many different forks in the models, so many different versions, it can be difficult to track down all of them,” says Elizabeth Seger, director of digital policy at cross-party UK think tank Demos. “Once a model is made open source, publicly available for download, there’s no way to do a public rollback of that,” she adds.
Criminalizing Deepfake Porn
At least 30 US states also have some legislation addressing deepfake porn, including outright bans, according to nonprofit Public Citizen’s legislation tracker, though definitions and policies vary and some laws cover only minors. Deepfake creators in the UK will also soon feel the force of the law: on January 7, the government announced it would criminalize the creation of sexually explicit deepfakes, as well as the sharing of them.
Conclusion
Reining in deepfake porn made with open source models also relies on policymakers, tech companies, developers, and, of course, creators of abusive content themselves. While it may be difficult to track down all existing deepfake models, it is not too late to get the problem under control, and platforms like GitHub have options, such as intervening at the point of upload.
FAQs
Q: What is the main source of deepfake tools?
A: According to Ajder, the vast majority of weaponized deepfake tools used for intimate image abuse originate in the open-source space.
Q: How do open-source deepfake tools get weaponized?
A: Open-source deepfake tools often start with well-meaning developers who create something they think is interesting or cool, and someone with bad intentions recognizes its malicious potential and weaponizes it.
Q: What is the impact of deepfake porn?
A: The potential for harm of deepfake porn is not just psychological. Its knock-on effects include the intimidation and manipulation of women, minorities, and politicians, as has been seen with political deepfakes targeting female politicians globally.
Q: How can deepfake porn be controlled?
A: Deepfake porn can be controlled by policymakers, tech companies, developers, and creators of abusive content themselves. Platforms like GitHub can also intervene at the point of upload, and lawmakers can pass legislation to criminalize the creation and sharing of sexually explicit deepfakes.

