In court, hallucinations can overshadow A.I.'s promise in closing 'access to justice' gap

The gavel in front of House Rules Committee chairman Rep. Jim McGovern, D-Mass., during a House Rules Committee hearing on the impeachment of President Donald Trump, Tuesday, Dec. 17, 2019, on Capitol Hill in Washington. (Andrew Harnik / AP Pool file)

The Illinois Supreme Court enacted a new rule on A.I. in January, generally permitting its use throughout Illinois courts but reminding judges and lawyers that they are ultimately accountable for their work.

A Central Illinois attorney has been sanctioned after filing a brief written partially by artificial intelligence that cited several “hallucinated,” or nonexistent, cases. It’s believed to be one of the first instances of an Illinois lawyer being punished for misusing A.I. in a real case.

It happened in the Fourth District Appellate Court, which includes McLean County. The lawyer, William Panichi of Springfield, said he used A.I. to help write a brief for a case in which a woman was at risk of losing her parental rights. The brief included references to multiple nonexistent cases – something only discovered when appellate justices reviewed the case. 

“I was careless and I was reckless when I did it,” Panichi told the justices at a hearing this summer. “I have no excuse, just explanation. At the point in time I filed it, I was extremely busy and made a mistake and was not thorough enough.” 

A.I. is disrupting many industries, but the stakes are unusually high in the courts, where a judge’s rulings can take away someone’s parental rights, livelihood or freedom. 

The Illinois Supreme Court enacted a new rule on A.I. in January, generally permitting its use throughout Illinois courts but reminding judges and lawyers that they are ultimately accountable for their work. The high court warned that AI could “jeopardize due process, equal protection, or access to justice.” 

“Unsubstantiated or deliberately misleading AI-generated content that perpetuates bias, prejudices litigants, or obscures truth-finding and decision-making will not be tolerated,” the policy reads. 

Unsettling judges 

That’s what happened in the case before the Fourth District Appellate Court. Panichi was appointed to the case last fall by the Sangamon County circuit court, as the woman appealed the termination of her parental rights. That’s when Panichi filed the brief, apparently using A.I. to write a draft and not reviewing what it had generated.

Fourth District Appellate Court Justice James Knecht of Normal, seen here during oral arguments held at Illinois State University in an unrelated case. (Emily Bollinger / WGLT)

Justice Kathryn Zenoff told Panichi that it took a lot of extra work by the appellate court’s research department and law clerks to unravel the mystery of the made-up cases. 

“The extent of the work done quite frankly was significant and time-consuming,” Zenoff said. 

Ultimately, Panichi was ordered to return the $6,925 he was paid to work on the case and pay a $1,000 fine, and the justices referred the matter to the Illinois Attorney Registration and Disciplinary Commission. (Separately, the justices upheld the trial court’s ruling to terminate the woman’s parental rights.) Panichi, who has been practicing law for 56 years, told WGLT last week that he “fell on my sword” and plans to surrender his law license.

The case seemed to unsettle the three presiding justices, who wrote in July that it’s the first time they’ve addressed the use of A.I. in the preparation of legal filings. Justice James Knecht of Normal compared it to how someone feels after suffering a burglary at their home. 

“I’ve been a judge for 50 years, and now the sanctity of my service has been invaded by A.I. and by lawyers who’ve decided that perhaps A.I. is a shortcut or a useful tool, whichever way you look at it,” he said. 

The case has made waves in Illinois legal circles. 

“We have a duty of candor to the court, and citing to cases that do not exist is inconsistent with our obligations as legal professionals and as attorneys,” said Amelia Buragas, director of Illinois State University’s Legal Studies program. “But that is something that has always been true, that we have those obligations. We need to double-check the work, regardless of where it came from.”

Buragas noted that the Illinois Supreme Court’s A.I. rule is quite permissive and doesn’t prohibit attorneys from using it. She said rules of professional conduct require attorneys to keep up with new technology. 

“It doesn't mean we have to be experts. It doesn't mean we have to use the technology, but if we do, we have to do so responsibly,” Buragas said. 

Closing the ‘access to justice’ gap

In announcing its new policy, the Illinois Supreme Court also acknowledged the promise of “potential efficiencies and improved access to justice.”

The “access to justice” gap often refers to civil cases. Criminal defendants have the right to a court-appointed attorney. Those in civil cases, such as divorce or family court, do not. Low-income Americans received no or inadequate legal help for a staggering 92% of the civil legal problems that impacted them substantially, according to a Legal Services Corporation report. That leaves a lot of self-represented litigants.

"The other side of the ledger is understanding the positive benefits that these tools are bringing across this whole ecosystem."
Daniel Linna Jr., Northwestern University

Imagine an A.I.-powered app made available to renters, capable of answering questions about how to get their security deposit back from their landlord. What’s the process? What are the timelines? What do I need to do before I move out? How do I evaluate whether I’m really liable for damages?

“And now, with generative A.I. tools, with these conversational A.I. tools, we can create even better tools that … can really connect with people and meet them where they are, to help them understand how the law applies to their situation,” said Daniel Linna Jr., senior lecturer and director of law and technology initiatives at Northwestern University. 

Linna served on the committee that helped craft the Illinois Supreme Court’s policy on A.I. He sees great potential for A.I. to improve the quality of legal work, not just its efficiency. He said it could help judges process an onslaught of incoming evidence and briefs, or help attorneys more easily summarize hundreds of pages of deposition transcripts.

“It can help some of the areas where there’s a lot of so-called grunt work that’s being done, help make that a little easier to do and help get to better, higher-quality results doing that,” Linna said. 

There’s an important distinction between consumer-grade A.I. tools like ChatGPT or Google Gemini and proprietary A.I.-enabled tools built specifically for those working in the legal field, which will have “access to all the cases and treatises that are out there,” Linna said. That, he suggested, would curb hallucinations.

When used properly, there’s huge upside to A.I., Linna said. 

“Of course, we need to be measuring these hallucinations and the negatives. But the other side of the ledger is understanding the positive benefits that these tools are bringing across this whole ecosystem for lawyers, judges, and especially self-represented litigants,” he said.

Ryan Denham is the digital content director for WGLT.