
"Like a rag doll brought to life through the dark arts, this [AI-generated] child can be manipulated into any pose, however sick, however fetishized, however unlawful. To the viewer, the resulting video appears entirely real. For the child, her identifying features will now forever be attached to a video depicting her own child sexual abuse."
"In this way, xAI could attempt to outsource the liability of their incredibly dangerous tool. The plaintiffs accused xAI of deliberately licensing its technology to app makers, often outside the U.S., suggesting the company strategically distances itself from responsibility for harmful applications of its AI algorithm."
Three underage Tennessee residents filed a class action lawsuit against xAI, alleging that an unnamed app powered by the company's AI algorithm was used to generate nonconsensual sexually explicit images and videos of them. The perpetrator, who had a close relationship with one plaintiff, used photos from social media and yearbooks to create the deepfakes. The lawsuit claims xAI deliberately licenses its technology to app developers, often based outside the U.S., in order to outsource liability for the dangerous tool. This is the first case in which minors depicted in child sexual abuse material have sued xAI. The company's image generation tools have been linked to millions of sexualized images over the past year; influencer Ashley St. Clair previously filed a separate lawsuit against the company.
#ai-generated-child-sexual-abuse-material #xai-deepfake-litigation #nonconsensual-intimate-imagery #ai-liability-and-regulation
Read at www.npr.org