
We’ve talked about artificial intelligence and its role in different industries at length on this blog, and we admit that we are both cautious and optimistic about how AI may change many jobs and processes. However, we think it’s clear that AI is not yet ready to be trusted to completely take over some roles, including in the legal world. That sentiment holds especially true after learning that two Minnesota lawyers were recently reprimanded for over-relying on AI to assist with their work.
Two Minnesota lawyers face discipline for submitting legal filings containing phony case citations fabricated by artificial intelligence. In both instances, the legal briefs referenced cases that simply do not exist.
“In my experience and in the experience of colleagues, AI is not something that takes the place of human lawyers,” Hennepin County Judge Laurie Miller said in a hearing on Friday. “At this point, I don’t trust ChatGPT as far as I can throw it.”
In the case involving Judge Miller, a memorandum written by Twin Cities attorney Frederic Knaak cited three cases supporting his argument that were not in fact real cases at all.
“I’ve had at least one other case in the last year where I’ve seen this happen. We’re seeing increasing reports of other courts seeing it happening as well. It’s not an isolated problem. It’s one we need to nip in the bud to see if we can turn it into an isolated problem,” Judge Miller said.
Knaak owned up to the mistake, saying that he failed to double-check the citations produced by AI while trying to streamline the drafting of the memorandum.
“I was relying on the tool for efficiency, not for deception,” Knaak said to the judge. “This has mortified me beyond belief. There is no intent to deceive here.”
Not The Only Case
In a similar instance in September, Minneapolis attorney David Lutz was referred to the Minnesota Lawyers Professional Responsibility Board for potential discipline and fined $5,000 for including a fake AI-generated case in documents he prepared for a matter he was working on. Lutz, too, said that he was leaning on AI for efficiency and failed to cross-reference the cases it provided to verify that they were real.
These are just two of the more high-profile instances of AI causing problems in the courtroom, but they certainly aren’t the only cases. A database tracking similar instances across Hennepin County shows 134 cases where an attorney cited fabricated case law generated by AI. The issue is even more problematic when an average citizen tries to represent themselves in court, leaning on AI instead of an attorney.
We’ve mentioned that we don’t think AI will take a lawyer’s job anytime soon, but by the same token, lawyers shouldn’t be outsourcing their job to AI either, as it’s clearly an imperfect tool. The stakes are simply too high to put someone’s fate in the hands of AI.
For a real person who will get their hands dirty and put in the hours for your defense, reach out to Avery and the team at Appelman Law Firm today at (952) 224-2277.
