AI-Generated CSAM and the Dark Side of Technology
The Rise of AI-Generated CSAM
A recent investigation has uncovered a disturbing trend: the use of artificial intelligence (AI) to generate child sexual abuse material (CSAM). An exposed database belonging to the AI image-generation website GenNomis contained a vast array of AI-generated explicit images, including pornographic content and potential "face-swap" images.
The Website’s Content
The GenNomis website, which has since gone offline, allowed users to generate unrestricted AI adult imagery. Its tagline promised users could "generate unrestricted" images and videos, and a 2024 version of the site stated that "uncensored images" could be created. At the same time, the website's user policies claimed that only "respectful content" was allowed and that explicit violence and hate speech were prohibited.
Moderation and Regulation
It is unclear whether, or to what extent, GenNomis used moderation tools or systems to prevent the creation of AI-generated CSAM. Some users reported that they could not generate images of people having sex and that their prompts were blocked even for non-sexual "dark humor." Another account posted on the community page that the "NSFW" content should be addressed, as it "might be looked upon by the feds."
Expert Analysis
Henry Ajder, a deepfake expert and founder of the consultancy Latent Space Advisory, says that even if the company did not permit the creation of harmful and illegal content, the website's branding, with its references to "unrestricted" image creation and an "NSFW" section, indicated a "clear association with intimate content without safety measures." Ajder believes more pressure needs to be put on every part of the ecosystem that allows nonconsensual imagery to be generated using AI.
The Dark Side of Technology
Fowler, the researcher who uncovered the database, says it also exposed files that appeared to include AI prompts. No user data, such as logins or usernames, was included in the exposed data, he says. Screenshots of prompts show the use of words such as "tiny," "girl," and references to sexual acts between family members, as well as sexual acts between celebrities.
Conclusion
The rise of AI-generated CSAM is a disturbing trend that highlights the need for stricter regulation and moderation of these types of websites. The technology has outpaced existing guidelines and controls, and it is essential that lawmakers, tech platforms, and web hosting companies take immediate action to prevent the creation and distribution of this content.
FAQs
Q: What is AI-generated CSAM?
A: AI-generated CSAM is child sexual abuse material that has been created or altered using artificial intelligence tools, such as image generators.
Q: What is GenNomis?
A: GenNomis is a now-offline website that allowed users to generate unrestricted AI adult imagery, including pornographic content and potential "face-swap" images.
Q: Is CSAM illegal?
A: Yes, CSAM is illegal and considered a serious crime.
Q: What is being done to combat AI-generated CSAM?
A: Organizations such as the Internet Watch Foundation (IWF) are working to combat AI-generated CSAM by documenting how criminals are creating and distributing this content. Additionally, lawmakers and tech platforms are being urged to take immediate action to prevent the creation and distribution of this content.