
Professional Growth

Lawyers Face New Challenges in Navigating Legal Jungle of Emerging Technologies

July 28, 2023

By John Murph

The legal profession is notorious for being slow to adopt emerging technologies, but with the rise of artificial intelligence (AI) tools in the past few months, law firms should consider hiring more technology experts with legal expertise, said Jack Marshall, legal ethicist and president of ProEthics, Ltd.

Marshall made the argument during the D.C. Bar CLE course “Legal Technology Jungle: The Law’s Greatest Ethics Challenge” in early July. There is precedent for this, Marshall said, citing Rules 5.4(a)(4) and 5.4(b) of the D.C. Rules of Professional Conduct, which allow law firms to have nonlawyers as partners.

“The D.C. Bar plowed new ground with [those rules],” Marshall said. “The rules do not mention specialists in technology, but I would argue [that] technology experts who also have legal expertise are [among] the most valuable members a firm could have. You cannot practice law as effectively as is necessary without being on top of technological developments.”


AI, specifically ChatGPT, has recently gripped the legal profession's attention, highlighting the ethical risks of using such tools in the practice of law.

“The American Bar Association saw the mist coming in artificial intelligence,” Marshall said. “It put out a resolution [in 2019] that encourages courts and lawyers to address the emerging legal and ethical issues related to the usage of artificial intelligence.”

The ABA Science & Technology Law Section cosponsored a resolution adopted this year addressing how attorneys, regulators, and other stakeholders should assess issues of accountability, transparency, and traceability in artificial intelligence. “It was a very optimistic report,” Marshall said of the ABA resolution.

In addition to e-document discovery, predictive analysis, and document drafting, Marshall mentioned some of the more shadowy uses of AI, such as searching company records to detect bad behavior preemptively, flagging potential wrongdoing based on code words, and reviewing employees’ emails to gauge morale.

“This is Minority Report if you ask me,” Marshall quipped. “A robot has determined that Joe is ready to snap like a dry twig. Let’s get him out of the building.”

Certainly, the emergence of ChatGPT has prompted legal ethicists like Marshall to sound the alarm. “[The movie] Terminator had it right,” Marshall said. “There is reason to worry about this stuff. The problem is — and this was not in the ABA resolution — that ChatGPT is not trustworthy. Maybe it will be eventually. But I would argue that anytime a lawyer places their own trust in the hands of AI, I don’t care how good it’s supposed to be, that is a breach because [the lawyer] has no way of checking it. What does a chatbot have to worry about? It’s not going to be disbarred.”

Law schools’ use of ChatGPT to teach legal writing is also a serious problem, Marshall said. “You can’t teach legal writing by handing it off to artificial intelligence … We all want to save time to help us do a better job. But to the extent that it’s being rushed upon us is a problem. Nonetheless, we have to pay attention to it,” he added.

Marshall also touched on unsettled questions around voice-activated virtual assistants such as Alexa. “I won’t have an Alexa in my house,” Marshall said. “And I would advise any lawyer to not have one in their office meeting room, certainly until we know how that [collected information] is going to be held. This is an ongoing threat.”
