Artificial Intelligence does not stand up to legal scrutiny, magistrate says

The legal world is not ready to accept information sourced through artificial intelligence. Picture: File

Published Jul 17, 2023

Pretoria - Artificial Intelligence (AI) might be the way forward for the world, but the legal world is not ready to accept information sourced through this medium as legal truth, as recently emerged when lawyers were caught sourcing case law through it.

In a case before the Regional Court in Johannesburg, an advocate said his attorneys had sourced case law through the medium of ChatGPT.

This was in a case where a woman wanted to sue the body corporate of the complex where she lived for defamation. The legal issue of whether a body corporate can be sued arose when counsel acting for the body corporate raised serious doubts about this.

But counsel for the plaintiff argued there was case law in this regard. The magistrate granted them a postponement to source these cases.

The plaintiff’s attorneys then used the artificial intelligence tool ChatGPT to conduct legal research, which they felt would seal their argument. As it turned out, the several cases listed and cited to the court did not exist. The names and citations were fictitious, the facts were fictitious, and the decisions were fictitious, magistrate Arvin Chitram noted.

He noted that the court had understood from the plaintiff’s counsel that there was authority, in the form of case law, to suggest that a person can sue a body corporate. The defendant’s legal team said they knew nothing about such authority.

The magistrate said as the question appeared to be novel, he gave them time to find these cases.

When they were back in court, the plaintiff’s team told the magistrate that the authorities could not be sourced. Chitram said ordinarily this could have been forgiven, but it had emerged that the plaintiff’s team – armed with their AI case law – had meanwhile presented these cases to the defendant’s team.

This, the magistrate said, was evident from email exchanges handed in to court in support of the defendant’s request that the court issue a punitive costs order against the attorneys.

“The defendants were anxious to obtain the case law the plaintiff had relied upon,” the magistrate said.

At some point the attorneys handed over a list of case law (at least eight cases), plus a synopsis of each case (as sourced through AI), to the defendants. The defendants said they tried their best to source these cases in law reports, but to no avail.

In court, the plaintiff’s legal team had to admit to the magistrate that they had obtained the material through ChatGPT.

The magistrate questioned how they could have done legal research through AI without satisfying themselves of its accuracy. As it turned out, the cases did not exist. The court was asked to punish them with a punitive costs order.

The magistrate noted that the plaintiff’s team did not submit these false cases to the court, but rather to their opponents. The plaintiffs said these were the cases they would rely on in their arguments in court, before they realised the cases did not exist.

“It seems to the court that they had placed undue faith in the veracity of the legal research generated by artificial intelligence and omitted to verify it. Ordinarily, if the court was satisfied that the attorneys had attempted to mislead the court, the consequences would have been far more grave.”

The magistrate, however, found that the attorneys did not try to mislead the court, but were “simply overzealous and careless”. The magistrate added: “Courts expect lawyers to bring a legally independent and questioning mind … not to merely repeat in parrot-fashion the unverified research of a chatbot”, and slapped the plaintiff’s attorneys with a punitive costs order.

Pretoria News