Attorneys face sanctions for filing fake cases dreamed up by AI
Briefly

In a troubling incident, attorneys in a product liability lawsuit against Walmart and Jetson Electric Bikes submitted court filings citing non-existent legal cases generated by ChatGPT. A Wyoming District Judge has ordered the attorneys to explain why they should not be sanctioned for the misstep. The case, which arose from a hoverboard fire that caused significant damage, illustrates the danger of relying on generative AI without critical review: AI hallucinations can produce citations that mix legitimate cases with fabricated ones.
The attorneys have apologized for citing the nonexistent cases, highlighting the risks of uncritical reliance on AI.
Wyoming District Judge Kelly Rankin has ordered the plaintiffs' attorneys to explain why they should not be sanctioned for presenting erroneous legal citations.
The citations appeared in a motion in limine seeking to exclude certain evidence, but the cases cited were fabricated by the AI.
The episode joins a growing list of AI hallucinations surfacing in legal filings, underscoring why generative AI output cannot be trusted blindly.
Read at The Register