New York attorney Jae Lee is reportedly the latest lawyer caught using ChatGPT after citing a nonexistent case in a legal filing. Lee was caught after she failed to provide a copy of the ChatGPT-fabricated decision when the court requested it.
Lee was reportedly appealing a district court's dismissal of her client's medical malpractice lawsuit. The judges of the 2nd Circuit noted that she had cited two decisions in the appeal, one of which, Bourguignon v. Coordinated Behavioral Health Services, did not exist.
(Photo: MARCO BERTORELLO/AFP via Getty Images) A computer screen displaying the home page of OpenAI's ChatGPT, photographed on March 31, 2023 in Manta, near Turin.
When the court could not locate the case in November, it directed the attorney to provide a copy of the decision in question. In response, she said she was unable to furnish the ruling. Lee told the court she had struggled to find a pertinent case before turning to ChatGPT, which recommended the fictitious Bourguignon decision. The judges concluded that Lee had neither read the ruling she cited nor taken any other step to verify its accuracy.
Possible Sanctions Over a ChatGPT-Fabricated Case
According to reports, she cited the nonexistent state court ruling in an appeal seeking to revive her client's lawsuit, which alleged that a Queens doctor had performed an improper abortion. Although Lee acknowledged including a ChatGPT-recommended case, she insisted there had been no malice or bias against the other side or the legal system.
The US Court of Appeals for the 2nd Circuit found that the conduct of attorney Jae Lee of JSL Law Offices in Uniondale, New York, fell well below what is expected of counsel. The court referred Lee to its grievance panel, which will decide on potential sanctions such as fines or suspension.
The attorney for the defendant doctor did not immediately respond to a request for comment on the 2nd Circuit's order, which also upheld the dismissal of the original case.
ChatGPT Lawyers
The order is the latest instance of a lawyer mistakenly submitting a bogus case citation generated by an AI tool. Generative AI systems are known to "hallucinate," meaning they can produce writing that is convincing but false.
In June of last year, US District Judge P. Kevin Castel fined two Manhattan attorneys, Steven Schwartz and Peter LoDuca, $5,000 for submitting a ChatGPT-generated court brief containing fabricated material from cases that never existed.
According to court documents made public last month, Michael Cohen, the former attorney for Donald Trump, inadvertently included fictitious cases produced by Google's AI technology in a brief supporting an early end to his own post-prison supervision.
Last June, Forbes also reported that the lawyer for a man suing Colombia-based Avianca Airlines in a routine personal injury suit had used ChatGPT to prepare a filing. The chatbot similarly delivered fabricated cases, which the attorney presented to the court, prompting the judge to consider sanctions.