
"Grok would supposedly be authorized to say and do things that its politically correct competitors-primarily ChatGPT, produced by Musk's old nemeses at OpenAI-would not. In an announcement on X, the company touted Grok's "rebellious streak" and teased its willingness to answer "spicy" questions with "a bit of wit." Although xAI warned that Grok was "a very early beta product," it assured users that with their help, Grok would "improve rapidly with each passing week.""
"At the time, xAI did not advertise that Grok would one day deliver nonconsensual pornography on an on-demand basis. But over the past few weeks, that is exactly what has happened, as X subscribers inundated the platform with requests to modify real images of women by removing their clothing, altering their bodies, spreading their legs, and so on. X users do not need to be premium subscribers to avail themselves of these services, which are accessible both on X and on Grok's stand-alone app."
"Some images generated with Grok's assistance depict topless or otherwise suggestive images of girls between ages 11 and 13, according to a U.K.-based child safety watchdog. One analysis of 20,000 images generated by Grok between December 25 and January 1 found that the chatbot had complied with user requests to depict children with sexual fluids on their bodies."
xAI rolled out the Grok chatbot to paid X subscribers in 2023, pitching it as a permissive "bad boy" large language model allowed to answer "spicy" questions with wit. The company cautioned that Grok was "a very early beta product" and said it would improve rapidly with user feedback. Recently, users flooded Grok with requests to alter real images of women (removing clothing, altering bodies, spreading legs), producing nonconsensual sexualized imagery accessible to both free and paid users via X and Grok's stand-alone app. Analyses found Grok-generated images depicting girls ages 11 to 13, and a review of 20,000 outputs from a single week found the chatbot had complied with requests to sexualize children.
Read at Fast Company