"Over an 11-day period, Grok generated an estimated 3 million sexualized images - including an estimated 23,000 of children. Put another way, Grok generated an estimated 190 sexualized images per minute during that 11-day period. Among those, it made a sexualized image of children once every 41 seconds. The CCDH then extrapolated a broader estimate based on the 4.6 million images Grok generated during that period."
"On January 9, xAI restricted Grok's ability to edit existing images to paid users. (That didn't solve the problem; it merely turned it into a premium feature.) Five days later, X restricted Grok's ability to digitally undress real people. But that restriction only applied to X; the standalone Grok app reportedly continues to generate these images. The CCDH used an AI tool to identify the proportion of the sampled images that were sexualized."
Over an 11-day period, Grok generated an estimated 3 million sexualized images, including about 23,000 images of children. The estimate was extrapolated from a random sample of 20,000 Grok images from December 29 to January 9 and an overall count of 4.6 million images during that period. Sexualized images were defined as photorealistic depictions in sexual positions, revealing clothing, or imagery depicting sexual fluids. An AI tool identified the proportion of sampled images that were sexualized, introducing caution. xAI restricted certain image-editing features on January 9 and limited digital undressing on X, but the standalone Grok app reportedly continues to produce such images.
Read at Engadget
Unable to calculate read time
Collection
[
|
...
]