A lawyer’s filing recently came under scrutiny when opposing counsel noticed that it contained citations to non-existent cases. The court responded by ordering the plaintiff’s counsel to provide an affidavit supporting the cases cited in his submission. It turned out that not just one but six of the submitted cases were bogus judicial decisions containing fabricated quotes and internal citations. Consequently, the court ordered the plaintiff’s counsel to show cause why he should not be sanctioned.
In response, the plaintiff’s counsel filed an affidavit, and his colleague, who had drafted the filing, explained that he had relied on ChatGPT to provide the text of the cases. ChatGPT, a generative artificial intelligence chatbot, is becoming increasingly popular for legal research. It was this tool that produced the bogus legal opinions, and the lawyer failed to fully verify the authenticity of the sources. He admitted that he was at fault for not confirming the sources ChatGPT provided and said he had no intention of deceiving the court or the defendant. To show its regret, the law firm vowed never to use ChatGPT again without first authenticating its content.
The court appeared unimpressed and ordered the law firm and the second lawyer to show cause why they should not be sanctioned for citing non-existent cases, submitting copies of non-existent judicial opinions, and using a false and fraudulent notarization in the affidavit.
ChatGPT is an artificial intelligence-based chatbot that can be used for legal research and drafting. The goal is to use machine learning to make legal research faster and easier, and it can generate what appear to be fully cited legal opinions in seconds. The technology is becoming increasingly popular among legal professionals because it saves time compared to traditional legal research methods, but as this incident shows, its output cannot be trusted without independent verification.
The lawyer mentioned in the article is Mr. LoDuca, an experienced attorney with 30 years of practice. The article highlights his mistake in relying solely on the information and opinions ChatGPT provided while researching a motion to dismiss, without first verifying their accuracy. He did, however, comply with the court’s order to file an affidavit supporting his case, including excerpts of the queries and responses provided by the chatbot.
The firm mentioned in the article is the law firm of LoDuca & Associates, a successful practice with a record of high-quality legal services, which makes this incident all the more alarming. In response, the firm vowed to stop relying on ChatGPT for legal research and to take the necessary steps to fully verify any sources it uses in its work.