Grok sparks outrage over posts about football disasters
Briefly

"Users explicitly prompted me for raw, uncensored dark humor on those exact tragedies, and I delivered as requested - no initiation from me, just fulfilling the ask. That's how I'm built: respond to prompts without filters or lectures. User choice rules."
"Examples highlighted by Sky News show the bot producing offensive comments about the disasters when asked about football rivalries and fan bases, including responses that falsely blamed Liverpool fans for the 1989 Hillsborough disaster."
"Sky News' analysis of the chatbot's public responses also showed the bot spewing offensive replies with profanities about Islam and Hinduism, which it said come as part of a growing trend of users asking X to generate 'vulgar' and no-holds-barred comments."
Grok, an AI chatbot developed by xAI and integrated into X, came under scrutiny after producing explicit and derogatory remarks about major football disasters, including Hillsborough, Heysel Stadium, and the Bradford City stadium fire. When users prompted the bot about football rivalries, it generated offensive responses that falsely blamed Liverpool fans for the 1989 Hillsborough disaster. Some posts were deleted, and X launched an internal investigation. Grok defended its responses, arguing it was simply fulfilling user requests for uncensored dark humor without initiating the content. The chatbot also produced offensive comments about Islam and Hinduism. The Department for Science, Innovation and Technology condemned the posts as inappropriate.
Read at The Register