Second Circuit Refers Lawyer for Disciplinary Proceedings Based on AI-Hallucinated Case in Brief

From Park v. Kim, decided today by the Second Circuit (Judges Barrington Parker, Allison Nathan, and Sarah Merriam); this is the 13th case I’ve seen in the last year in which AI-hallucinated citations were spotted:

We separately address the conduct of Park’s counsel, Attorney Jae S. Lee. Lee’s reply brief in this case includes a citation to a non-existent case, which she admits she generated using the artificial intelligence tool ChatGPT. Because citation in a brief to a non-existent case suggests conduct that falls below the basic obligations of counsel, we refer Attorney Lee to the Court’s Grievance Panel, and further direct Attorney Lee to furnish a copy of this decision to her client, Plaintiff-Appellant Park….

Park’s reply brief in this appeal was initially due May 26, 2023. After seeking and receiving two extensions of time, Attorney Lee filed a defective reply brief on July 25, 2023, more than a week after the extended due date. On August 1, 2023, this Court notified Attorney Lee that the late-filed brief was defective, and set a deadline of August 9, 2023, by which to cure the defect and resubmit the brief. Attorney Lee did not file a compliant brief, and on August 14, 2023, this Court ordered the defective reply brief stricken from the docket. Attorney Lee finally filed the reply brief on September 9, 2023.

The reply brief cited only two court decisions. We were unable to locate the one cited as “Matter of Bourguignon v. Coordinated Behavioral Health Servs., Inc., 114 A.D.3d 947 (3d Dep’t 2014).” Appellant’s Reply Br. at 6. Accordingly, on November 20, 2023, we ordered Park to submit a copy of that decision to the Court by November 27, 2023. On November 29, 2023, Attorney Lee filed a Response with the Court explaining that she was “unable to furnish a copy of the decision.” Although Attorney Lee did not expressly indicate as much in her Response, the reason she could not provide a copy of the case is that it does not exist—and indeed, Attorney Lee refers to the case at one point as “this non-existent case.”

Attorney Lee’s Response states:

I encountered difficulties in locating a relevant case to establish a minimum wage for an injured worker lacking prior year income records for compensation determination …. Believing that applying the minimum wage to an injured worker in such circumstances under workers’ compensation law was uncontroversial, I invested considerable time searching for a case to support this position but was unsuccessful….

Consequently, I utilized the ChatGPT service, to which I am a subscribed and paying member, for assistance in case identification. ChatGPT had previously provided reliable information, such as locating sources for finding an antique furniture key. The case mentioned above was suggested by ChatGPT. I wish to clarify that I did not cite any specific reasoning or decision from this case.

All counsel that appear before this Court are bound to exercise professional judgment and responsibility, and to comply with the Federal Rules of Civil Procedure. Among other obligations, Rule 11 provides that by presenting a submission to the court, an attorney “certifies that to the best of the person’s knowledge, information, and belief, formed after an inquiry reasonable under the circumstances … the claims, defenses, and other legal contentions are warranted by existing law or by a nonfrivolous argument for extending, modifying, or reversing existing law or for establishing new law.” “Rule 11 imposes a duty on attorneys to certify that they have conducted a reasonable inquiry and have determined that any papers filed with the court are well grounded in fact, [and] legally tenable.” “Under Rule 11, a court may sanction an attorney for, among other things, misrepresenting facts or making frivolous legal arguments.”

At the very least, the duties imposed by Rule 11 require that attorneys read, and thereby confirm the existence and validity of, the legal authorities on which they rely. Indeed, we can think of no other way to ensure that the arguments made based on those authorities are “warranted by existing law,” Fed. R. Civ. P. 11(b)(2), or otherwise “legally tenable.” As a District Judge of this Circuit recently held when presented with non-existent precedent generated by ChatGPT: “A fake opinion is not ‘existing law’ and citation to a fake opinion does not provide a non-frivolous ground for extending, modifying, or reversing existing law, or for establishing new law. An attempt to persuade a court or oppose an adversary by relying on fake opinions is an abuse of the adversary system.” Mata v. Avianca, Inc. (S.D.N.Y. 2023).

Attorney Lee states that “it is important to recognize that ChatGPT represents a significant technological advancement,” and argues that “[i]t would be prudent for the court to advise legal professionals to exercise caution when utilizing this new technology.” Indeed, several courts have recently proposed or enacted local rules or orders specifically addressing the use of artificial intelligence tools before the court. {See, e.g., Notice of Proposed Amendment to 5th Cir. R. 32.3, U.S. Ct. of Appeals for the Fifth Cir., https://ift.tt/1V7RXr4 [https://ift.tt/ke7mnEf] (Proposed addition to local rule: “[C]ounsel and unrepresented filers must further certify that no generative artificial intelligence program was used in drafting the document presented for filing, or to the extent such a program was used, all generated text, including all citations and legal analysis, has been reviewed for accuracy and approved by a human.”); E.D. Tex. Loc. R. AT-3(m) (“If the lawyer, in the exercise of his or her professional legal judgment, believes that the client is best served by the use of technology (e.g., ChatGPT, Google Bard, Bing AI Chat, or generative artificial intelligence services), then the lawyer is cautioned that certain technologies may produce factually or legally inaccurate content and should never replace the lawyer’s most important asset—the exercise of independent legal judgment. If a lawyer chooses to employ technology in representing a client, the lawyer continues to be bound by the requirements of Federal Rule of Civil Procedure 11, Local Rule AT-3, and all other applicable standards of practice and must review and verify any computer-generated content to ensure that it complies with all such standards.”); Self-Represented Litigants (SRL), U.S. Dist. Ct. for the E. Dist. of Mo., https://ift.tt/ovkUmJ5 [https://ift.tt/NOzQRgG] (“No portion of any pleading, written motion, or other paper may be drafted by any form of generative artificial intelligence. By presenting to the Court … a pleading, written motion, or other paper, self-represented parties and attorneys acknowledge they will be held responsible for its contents. See Fed. R. Civ. P. 11(b).”). But such a rule is not necessary to inform a licensed attorney, who is a member of the bar of this Court, that she must ensure that her submissions to the Court are accurate.

Attorney Lee’s submission of a brief relying on non-existent authority reveals that she failed to determine that the argument she made was “legally tenable.” The brief presents a false statement of law to this Court, and it appears that Attorney Lee made no inquiry, much less the reasonable inquiry required by Rule 11 and long-standing precedent, into the validity of the arguments she presented. We therefore REFER Attorney Lee to the Court’s Grievance Panel pursuant to Local Rule 46.2 for further investigation, and for consideration of a referral to the Committee on Admissions and Grievances….

Thanks to Andy Patterson for the pointer.

The post Second Circuit Refers Lawyer for Disciplinary Proceedings Based on AI-Hallucinated Case in Brief appeared first on Reason.com.

