Bryan Garner on Words

Will 'chatbot lawyer' make it into Black's Law Dictionary?


[Photo illustration by Sara Wadford/Shutterstock]

A lawyer friend recently called my attention to the phrase chatbot lawyer, a neologism coined by UCLA School of Law professor Eugene Volokh, denoting a lawyer who relies on a chatbot such as ChatGPT to generate briefs and other legal documents. Volokh is proposing the term with the hope that it will catch on.

In a March 22 post on The Volokh Conspiracy blog—hosted by Reason.com, a libertarian digital publication of the nonprofit education and research organization Reason Foundation—Volokh suggested the utility of chatbot lawyer and cited my entry in the 11th edition of Black’s Law Dictionary for the closely analogous headnote lawyer: “A lawyer who relies on the headnotes of judicial opinions rather than taking the time to read the opinions themselves.” It’s a rather recent term of disparagement that appears more often in legal commentary than in judicial opinions.

But chatbot lawyer? Will it go in the next Black’s Law Dictionary? That depends on how popular the term becomes. Before a word or phrase can enter a reputable dictionary, it must be backed up with evidence of actual use by more than just a few people, preferably with citable instances in print.

As I write this column, less than 24 hours after the invention of the phrase chatbot lawyer, I’m fully aware that it will be my call, in conjunction with my staff, on whether the term merits an entry in the next edition of Black’s Law Dictionary. And here I am, in a national magazine for lawyers, actually using the term and thereby potentially helping it along. Perhaps so. But I can assure readers that I will assess the matter as objectively as I can. I’m not cheering for the term. It will need to prove its utility.

Privately, I hope that the term doesn’t catch on. Chatbot lawyer would doubtless be a disparaging term precisely because it denotes an undesirable thing. How so? Because when it comes to facts, chatbots are (for now—and in the view of many knowledgeable observers, for the indefinite future) notoriously unreliable.

Consider an example: I’m writing a biography that deals in part with Germany’s annexation of Austria (the Anschluss) in the buildup to World War II. In the mid-1930s, the American minister to Austria was a diplomat named George Messersmith. As an experiment, I asked ChatGPT, “When did George Messersmith leave Austria?” My very specific question got a very specific answer:

“George Messersmith, an American diplomat who served as the U.S. Consul General in Vienna during the Anschluss, left Austria on March 14, 1938, just days after the German annexation of Austria. Messersmith had been a vocal opponent of Nazi policies. On March 11, 1938, Messersmith received a telegram from U.S. Secretary of State Cordell Hull, instructing him to leave Austria immediately due to concerns for his safety. Messersmith departed on March 14, traveling to Switzerland before returning to the United States.”

Sounds dramatic. Although it may read acceptably, it’s pure fiction. It would earn an F if it were submitted as part of a high school research paper. As I already knew from reading Messersmith’s biography and an affidavit he prepared in the mid-1940s, Messersmith left Vienna on July 11, 1937, precisely eight months before he supposedly “received a telegram” ordering him to come back. During the Anschluss, he was serving in Washington, D.C., as assistant secretary of state. The account given by ChatGPT is complete balderdash.

That example is for the biography, not a brief, but the chastening lesson holds true for any kind of factual writing. Anyone who relies on a chatbot to write this sort of thing has voluntarily dived into perilous waters.

Is that to say that chatbots are worthless? No. They’re actually quite useful for tasks of a particular type. For example, you might ask a chatbot to compose a eulogy. Try that: Invent an aunt or uncle, say where they lived, how old they were, what personal characteristics they had, and request a eulogy. The result might provide a good start, as long as you take control and revise whatever you receive.

Or you might commission a thank-you letter, a congratulatory letter or a demand letter. You supply the basic facts. The chatbot will probably produce a first draft with a few usable sentences or even paragraphs. That’ll give you a head start. But again, you must remain in control of what goes into the final product.

A chatbot lawyer, if such a type ever comes into existence, wouldn’t retain control of the final product. Such a lawyer would delegate substance to artificial intelligence—a very dangerous practice indeed.

For the time being, Black’s Law Dictionary records 18 phrases that denote types of lawyers. Among the more colorful ones are Blackstone lawyer (“a lawyer with a broad knowledge of blackletter principles”); country lawyer (“a lawyer who practices in a rural area and has little, if any, intellectual pretension or academic bent but usu. a substantial dose of common sense”); and white-shoe lawyer (“a lawyer employed by a prestigious law firm, esp. one that represents wealthy, powerful clients”).

So what are the lessons here?

(1) Terms legitimately get into dictionaries only if they’re widely used.

(2) Black’s Law Dictionary contains many more colorful terms than you might have guessed.

(3) Although chatbots can be useful for broad-stroke ideas when you supply basic facts, you should never depend on them to supply factual information without review and edits. Don’t be a chatbot lawyer.


Did you notice, a few paragraphs above, that I used the word they in a singular sense? If not, that’s a sign that the singular use is gaining acceptance—that it’s so common now as to be largely unnoticeable. If you were consciously bothered, your resistance may be emblematic of widespread opposition. Whatever your view, let me hear from you: [email protected].


In the last issue, I polled the readership on whether John Rastell should be credited for writing the first English-language dictionary in the 1520s. Astonishingly, I received 265,632 votes on this hot-button issue. The vote was 61% in favor and 39% against. This is very much in line with my own view. Thank you for participating. [Bryan Garner has just engaged in leg-pulling by reporting the votes just above. In fact, he has yet to receive a single vote on the matter, but he believes Rastell should get the credit.]

This story was originally published in the June-July 2023 issue of the ABA Journal under the headline: “‘Chatbot Lawyer’? Will the popularity of chatbot technology make this term sufficiently relevant for the dictionary?”


Bryan A. Garner has been the chief editor of the past five unabridged editions of Black's Law Dictionary as well as all the various abridged editions since the mid-1990s.

This column reflects the opinions of the author and not necessarily the views of the ABA Journal—or the American Bar Association.
