An AI Image Generator's Exposed Database Reveals What People Really Used It For
Briefly

The article discusses GenNomis, a website that allowed users to create and share explicit AI-generated adult images, including potential child sexual abuse material (CSAM). Security researcher Fowler observed images suggesting real people's faces had been swapped onto explicit content, raising serious ethical concerns about consent. The site's policies claimed to prohibit illegal content, yet the effectiveness of its moderation tools remains questionable; some users reported difficulty generating certain images, suggesting at least partial filtering. The platform featured various sections for adult content, indicating a broad range of explicit material that may not have been adequately monitored.
As well as CSAM, Fowler says, there were AI-generated pornographic images of adults in the database plus potential "face-swap" images. Among the files, he observed what appeared to be photographs of real people, which were likely used to create "explicit nude or sexual AI-generated images," he says.
When it was live, the GenNomis website allowed explicit AI adult imagery, featuring sexualized images of women and a marketplace for sharing and selling content.
GenNomis' user policies claimed to restrict "explicit violence" and hate speech, but it is unclear to what extent the site used any moderation tools or systems to prevent or prohibit the creation of AI-generated CSAM.
Read at WIRED