An obligation to use AI? Federal judge addresses lawyers on responsible technology use

U.S. Magistrate Judge Maritza Dominguez Braswell addressed a group of Colorado attorneys on Friday, urging them to practice with and understand artificial intelligence because it is encroaching on more and more aspects of society.

“There was a time when you had to be rich to have a cell phone,” she said. “Everybody has a cell phone now. That’s just kind of the life cycle of technology: It’s really expensive for a while, then it’s accessible.”

Braswell spoke at the Colorado Bar Association’s headquarters in Denver. She is one of a trio of Colorado judges who have developed an expertise in AI and regularly speak about the subject to attorneys, alongside Colorado Supreme Court Justice Maria E. Berkenkotter and Court of Appeals Judge Lino S. Lipinsky de Orlov.

Braswell distributes an AI-focused newsletter to judges across the country and said she is also drafting guidance for lawyers to dispel some of the concerns about using generative AI. Among other things, the guide may give attorneys some insight into how she would react if a lawyer submits a document that contains “hallucinated,” or fake, case citations invented by an AI tool.

“One of the things I would consider is, do you have an AI use policy in place? Because that tells me you are thoughtful about this and maybe it’s a one-off mistake,” she said. “How did you address it when opposing counsel called you? Did you take responsibility right away? Did you offer to cover costs for something counsel had to incur as a result of the check?”


Braswell noted the ways in which AI is creeping into new territory, by creating first-draft police reports based on body-worn camera footage or by reviewing procurement bids as an “incorruptible” Albanian cabinet minister.

“It’s also having the effect people thought it would have, which is increased job displacement,” she said.

Braswell encouraged lawyers to delineate high- and low-risk uses of AI so they can increase their competency, with court filings being an inherent high-risk activity. She suggested that the existing rules for professional conduct already provide guidance for proper AI use, but generative AI feels “a little more daunting because the tool is very, very powerful.”

Attorney Katayoun Donnelly, who also spoke at the event, said she occasionally represents low-income clients who are not native English speakers and who send her case-related questions drafted with AI.

“Some of it is actually good questions they’re asking. I’m enjoying the fact that for everyday people using AI, they’re advocating for themselves,” Donnelly said. She cautioned, however, that AI tools can create inefficiencies in that scenario by posing questions not truly relevant to a case.

Braswell also spoke about the subject in depth with Lipinsky, the Court of Appeals judge, on a podcast episode released last month by the Thomson Reuters Institute. She asked Lipinsky if lawyers may have some obligation at a certain point to be doing their work with generative AI tools.

In instances where judges are analyzing the amount of attorney fees to award a successful litigant, Lipinsky said, “trial judges may start thinking, ‘This task took too much time compared to how much time it would have taken if the legal professional had used an AI resource.'”

Earlier this year, Lipinsky authored the first appellate decision in Colorado putting lawyers and litigants on notice that they may face sanctions if they submit filings with fake citations. At the same time, Lipinsky acknowledged that new lawyers are coming out of law schools with AI proficiency.

Colorado Court of Appeals Judge Lino S. Lipinsky de Orlov, right, takes the microphone from a student in the Green Mountain High School auditorium after hearing oral arguments in two cases as part of a “Courts in the Community” event on Thursday, Feb. 27, 2025. The Colorado Court of Appeals and Supreme Court hold Courts in the Community events multiple times per year in which they conduct oral arguments in real cases before an audience of students. (Stephen Swofford, Denver Gazette)

Braswell wondered if the fear of misusing AI would have a negative effect on self-represented litigants’ ability to adequately present their case in the justice system. Lipinsky responded that he had spoken to law libraries about the possibility of opening up advanced legal AI tools to self-represented parties for free.

“The response I got was, ‘We can’t do that because you need to be trained to use them,’” he said. “It is a frustrating dilemma because we don’t want untrained people to be using the legal generative AI tools, so what will they do? They’ll go back to the ChatGPTs. I don’t fault ChatGPT, but it wasn’t designed for legal research.”

Braswell raised the possibility that if judges do not make use of the same advanced legal tools law firms have, they may be “outpaced” with increasing numbers of AI-assisted filings.

“I do have a concern that we’re not gonna be fast enough, we’re not gonna keep up enough with the legal profession that’s evolving,” she said.

Lipinsky pointed out that Colorado’s state judges already have a tool to quickly check a legal filing to ensure its citations are valid. However, he provided two specific examples of AI being used in innovative, but potentially harmful, ways.

First, he suggested that lawyers could ask AI tools to quickly comb the Internet for information on potential jurors to better inform whom they should dismiss from a jury trial. Second, Lipinsky relayed a story from a trial judge who suspected a photograph of a battered woman’s face in a domestic violence case was AI-generated. When the judge asked the lawyer directly, they withdrew the photo as evidence.

“You certainly don’t want a system,” Lipinsky said, wherein “any video file or still image file or audio file — each side has to put on experts to talk about whether it was or wasn’t AI generated. And if it was AI generated, does it accurately depict what it purports to depict? Can you imagine what would happen to our criminal justice system if at every criminal trial involving a body cam video, the two sides — the prosecutor, the defense lawyer — have to call experts on the veracity of this file?”

Braswell wondered “where does the buck stop” when there is misuse of AI tools in the legal system, and whether the creators of AI tools could conceivably be held liable.

Lipinsky answered that lawyers can face consequences for violating their own professional responsibilities.

“But in terms of liability?” he said. “To be determined.”

