AI-hallucinated cases end up in more court filings, and Butler Snow issues apology for 'inexcusable' lapse
Briefly

Butler Snow has asked a federal judge to spare its government client from sanctions after the law firm submitted a filing containing inaccurate, AI-generated citations. A partner used ChatGPT to find authorities supporting a legal argument and did not verify the citations before filing, resulting in significant errors. The firm acknowledged the gravity of the mistake, calling it inexcusable, and offered a sincere apology. The episode reflects a broader trend of large law firms struggling with generative AI, with other firms having faced similar problems after AI hallucinations appeared in their court documents.
"These citations were 'hallucinated' by ChatGPT in that they either do not exist and/or do not stand for the proposition for which they are cited."
"What happened here is inexcusable, and Butler Snow sincerely apologize[s]."
Read at ABA Journal