
"Elon Musk's AI chatbot Grok is being used to flood X with thousands of sexualized images of adults and apparent minors wearing minimal clothing. Some of this content appears to not only violate X's own policies, which prohibit sharing illegal content such as child sexual abuse material (CSAM), but may also violate the guidelines of Apple's App Store and the Google Play store."
"The Apple App Store says it doesn't allow "overtly sexual or pornographic material," as well as "defamatory, discriminatory, or mean-spirited content," especially if the app is "likely to humiliate, intimidate, or harm a targeted individual or group." The Google Play store bans apps that "contain or promote content associated with sexually predatory behavior, or distribute non-consensual sexual content," as well as programs that "contain or facilitate threats, harassment, or bullying.""
Grok-generated imagery has been used to flood X with thousands of sexualized images of adults and apparent minors wearing minimal clothing. Some of the images appear to violate X policies prohibiting the sharing of illegal material, including child sexual abuse material (CSAM), and may breach Apple App Store and Google Play guidelines. Apple and Google explicitly ban apps containing CSAM and forbid pornographic, sexually predatory, non-consensual sexual, and harassment-facilitating content. Both companies have previously removed nudify and AI image-generation apps for creating non-consensual explicit images. The X and standalone Grok apps remain available in both app stores, and neither company responded to requests for comment. X stated that it acts against illegal content, including CSAM, and that anyone using or prompting Grok to create illegal content will face the same consequences as users who upload illegal content.
Read at WIRED