Lawyers are increasingly turning to AI tools to help with their work, and the trend is creating serious problems. In recent months, several attorneys have gotten into trouble for citing cases that don't exist, fabricated by the AI itself. What was once a cautionary tale about AI's quirks is now hitting professionals in real life.

High-profile blunders involving AI-generated fake cases are now disrupting real court proceedings. Take Morgan & Morgan, one of the largest personal injury law firms in the US: an attorney there cited eight nonexistent cases in a lawsuit against Walmart. When opposing counsel couldn't find the cases anywhere, it emerged that they came straight from an AI tool.

The firm had to pull the lawyer off the case and pay Walmart’s legal fees. This isn’t an isolated problem. Over the past two years, lawyers in at least seven cases have been caught citing AI-invented legal precedents, according to Reuters.

Penalties have ranged from fines — like a $5,000 sanction for “chatbot gibberish” in a 2024 case — to mandatory AI ethics training. Even Michael Cohen, former lawyer to Donald Trump, narrowly avoided punishment after unknowingly feeding his attorney fake cases generated by AI.

Why does this keep happening? Many lawyers don't verify what the AI produces. Because the output sounds convincing, they assume it's accurate. But generative AI can fabricate citations, quotes, and case details that look real and aren't. Experts say lawyers need to understand how these tools work instead of treating them as a magic fix. Andrew Perlman, dean of Suffolk University Law School, put it bluntly: relying on AI output without checking it is "incompetence, just pure and simple."

The fallout goes beyond embarrassment. Clients lose faith in their lawyers. Courts waste time sorting out the mess. In some situations, these slip-ups could even change how a case turns out. Morgan & Morgan warned its attorneys that using unverified AI cases could get them fired.

Still, AI isn't all bad for law. Used properly, it can speed up research and surface things humans might miss. The trick is treating it as a helper, not the final word. Law firms are waking up to this, adding training and internal policies to prevent these errors. The lesson? AI is powerful, but lawyers (and everyone else) have to stay in charge and check its work every time.

Featured image generated with Microsoft Designer
