GenAI website goes dark after explicit fakes exposed
Briefly

Jeremiah Fowler, a cybersecurity researcher, discovered an unprotected AWS S3 bucket belonging to South Korean AI company AI-NOMIS that contained nearly 93,500 sexually explicit AI-generated images. The bucket included disturbing depictions of children and of celebrities rendered as children, along with the user prompts used to generate such content. Shortly after Fowler reported the exposure, AI-NOMIS's websites went offline. His findings raise serious ethical concerns about the misuse of AI to create non-consensual explicit imagery and illustrate the dangers of misconfigured cloud storage.
Jeremiah Fowler discovered a misconfigured AWS S3 bucket containing 93,485 explicit AI-generated images, including child sexual abuse material and celebrities depicted as children; the content was taken down after his report.
Fowler described finding a trove of lurid AI-generated content, including disturbing prompts and ordinary photos that had been face-swapped into explicit images without the subjects' consent, raising serious ethical concerns.
Read at The Register