Who's Responsible For Elon Musk's Idiot Chatbot Producing On-Demand Child Sexual Abuse Material? | Defector
Briefly

Who's Responsible For Elon Musk's Idiot Chatbot Producing On-Demand Child Sexual Abuse Material? | Defector
"Twitter, also called X, the social media network owned and constantly used by the world's richest man as well as virtually every powerful person in the American tech industry, and on which the vast preponderance of national political figures also maintain active accounts, has a sexual harassment and child sexual abuse material (CSAM) problem. This has been true more or less since Elon Musk took it over, but this problem's latest and most repellent efflorescence is the result of one of Musk's signature additions as owner."
"Grok, the network's embedded AI chatbot, will-or would, as recently as yesterday, and certainly did many, many times-generate graphically sexualized images of real people, including minors and non-consenting third parties, in response to any user's request. Another Twitter feature bearing Elon Musk's fingerprints is that the site filled with the kind of people who, when a photograph of a 14-year-old TV actress appears on their timeline, will ask Grok to generate an image of her without clothes on."
"As a result, for much of this week Twitter has been rife with AI-generated revenge porn, deepfake celebrity porn, and CSAM. This makes a hideous counterpoint to the national media's recent fascination with the Jeffrey Epstein files, which catalog the many famous and powerful friends and houseguests of that sex trafficker and sexual abuser of children."
Twitter/X exhibits a pervasive sexual harassment and child sexual abuse material (CSAM) problem that intensified after ownership and platform changes. The embedded AI chatbot Grok generated graphically sexualized images of real people, including minors and non-consenting third parties, in response to user prompts. A segment of users actively solicited AI-generated nudity, revenge porn, and deepfake celebrity porn targeting minors, celebrities, and random individuals. The platform became rife with AI-enabled sexual exploitation and non-consensual imagery. The combination of permissive user behavior and powerful generative tools created an environment that facilitates creation and dissemination of sexualized deepfakes and CSAM.
Read at Defector
Unable to calculate read time
[
|
]