A Texas federal judge is requiring attorneys appearing in his court to certify either that they did not use generative AI to draft any portion of their filings or that a human being checked any AI-drafted language for accuracy.
In his standing order, Judge Brantley Starr of the Northern District of Texas cited ChatGPT, Harvey.ai and Google’s Bard as examples of generative AI that his directive covers.
The order states that while generative AI platforms are very powerful and have many uses in the law, “legal briefing is not one of them.”
Starr’s directive comes amid the national spotlight that a New York lawyer has brought to the ways legal professionals can misuse the latest artificial intelligence in their work.
Attorney Steven A. Schwartz used ChatGPT to conduct legal research in a federal case without independently verifying its output, and a brief filed on his client’s behalf ultimately cited six fake cases.
Schwartz now faces potential sanctions in an ongoing personal injury lawsuit filed by his client against the Colombian airline Avianca pending in the Southern District of New York.
Hallucinations
Judge Starr’s “Mandatory Certification Regarding Generative Artificial Intelligence” notes that generative AI platforms can “make stuff up—even quotes and citations.”
The term “hallucinations,” which Starr’s order uses, refers to when AI-powered platforms fabricate information.
Earlier this week, ChatGPT creator OpenAI announced a new strategy for fighting hallucinations.
Meanwhile, even before Schwartz’s mishaps in the New York case, it was a best practice for attorneys to independently review the information produced by AI tools.
Starr’s order about legal filings drafted by generative AI covers quotations, citations, paraphrased assertions and legal analysis, according to a template certificate posted online.
The language is to be checked for accuracy “using print reporters or traditional legal databases” by a human being before a filing is submitted to the court.
“Any party believing a platform has the requisite accuracy and reliability for legal briefing may move for leave and explain why,” the order states.
Bias
Starr’s order also mentions bias as an issue with generative AI technology. Concerns about AI perpetuating bias in various contexts have been raised by a range of groups over the years.
“While attorneys swear an oath to set aside their personal prejudices, biases, and beliefs to faithfully uphold the law and represent their clients, generative artificial intelligence is the product of programming devised by humans who did not have to swear such an oath,” Starr’s order states. “As such, these systems hold no allegiance to any client, the rule of law, or the laws and Constitution of the United States (or, as addressed above, the truth).”
Possible consequences
Starr will strike any filing from an attorney who fails to file a certificate on the docket attesting that they have read the court’s judge-specific requirements.
Attorneys will also be subject to potential Rule 11 sanctions for the contents of filings submitted to the court “regardless of whether generative artificial intelligence drafted any portion of that filing.”
Similar orders
Starr’s standing order on generative AI has already prompted at least one other federal jurist to issue a similar directive.
Magistrate Judge Gabriel Fuentes of the Northern District of Illinois recently updated his standing order for civil cases to include a section about generative AI.
“Any party using any generative AI tool to conduct legal research or to draft documents for filing with the Court must disclose in the filing that AI was used, with the disclosure including the specific AI tool and the manner in which it was used,” the new requirement states.
“Just as the Court did before the advent of AI as a tool for legal research and drafting, the Court will continue to presume that the Rule 11 certification is a representation by filers, as living, breathing, thinking human beings, that they themselves have read and analyzed all cited authorities to ensure that such authorities actually exist and that the filings comply with Rule 11(b)(2),” the order also says.
The AI section of his standing order references both Starr’s new requirement and the case involving Schwartz in New York.
“One way to jeopardize the mission of federal courts is to use an AI tool to generate legal research that includes ‘bogus judicial decisions’ cited for substantive propositions of law,” the order states.