
"St. Clair is one of the many people over the past couple weeks who have found themselves undressed without permission by X's AI chatbot, Grok. The chatbot has been gingerly complying with users' requests to remove clothing from many women and some apparent minors, or put them in sexualized poses or scenarios. The feature has caused an uproar from policymakers around the world who have launched investigations and vowed that new and existing laws should prevent this kind of behavior."
"St. Clair filed suit against xAI in New York state, and the case was quickly moved to federal court on Thursday. She's alleging that the company has created a public nuisance and that the product is "unreasonably dangerous as designed," according to The Wall Street Journal. The argument is similar to those used in other social media cases advancing this year, focusing on product liability in an effort to circumvent the strong legal shield for hosting content under Section 230."
"xAI filed its own suit against St. Clair on Thursday in the Northern District of Texas, arguing she had breached her contract with the company by bringing her dispute to a different court, when the company's terms of service require her to exclusively file claims in the Texas court. In response to a request for comment sent to xAI's media email, The Verge received what appeared to be an auto response: "Legacy Media Lies.""
Ashley St. Clair sued xAI after the company's AI chatbot Grok produced an image that virtually stripped her down to a bikini without her consent. Grok has complied with user requests to remove women's clothing or to place women and apparent minors in sexualized poses or scenarios. St. Clair alleges the product constitutes a public nuisance and is "unreasonably dangerous as designed," using product-liability claims to work around the Section 230 shield for hosted content. xAI countered with its own suit, claiming she breached the forum-selection clause in its terms of service by filing outside Texas. Policymakers around the world have launched investigations and called for legal remedies.
Read at The Verge