
Why You Shouldn't Use AI to Draft Legal Contracts

  • Michael Hayes
  • Nov 10
  • 6 min read


We might be a bit biased about the topic of using AI to draft legal contracts. Spiller Law is a law firm after all. Drafting and reviewing contracts is a core part of our business.


But it turns out there are lots of reasons not to let AI do things for you without truly understanding its limitations; drafting legal contracts happens to be only one of them. In our opinion, it is one of the most important.


In this article, we will explain why using AI to draft legal contracts is both unacceptably risky and potentially not cost-effective for your business or personal matters. As always, reach out and let us know if you have any questions.



AI is Not Accountable


One of the most overlooked aspects of AI is its lack of accountability. If an AI steers you wrong, or outright lies to you, neither the bot nor the company making billions from its use bears any responsibility when acting on that misinformation leads to consequences, social, legal or otherwise. The responsibility lies solely with you, the user.


In early 2023, the tech news site CNET, trusted by millions of users for product reviews, advice, how-tos and the latest tech news, was found to be using AI to generate some of its content, and over HALF of those articles contained errors.


The AI CNET used cannot be held accountable for the harm to CNET’s reputation or for the liability those errors exposed CNET to. Ironically, CNET’s own disclaimer on its site means CNET bears no responsibility either. The consequences of bad decisions users made based on those articles are solely for the users to bear.


ChatGPT also has a disclaimer, or more accurately, Terms of Use. Buried in them is its accountability disclaimer:


“Output may not always be accurate... You must evaluate Output for accuracy and appropriateness for your use case...”


Imagine if you went to an operating room for surgery and on the wall there was a sign which read: “Surgeon may not always be accurate, evaluate for accuracy for your use case.”


Experts in the medical and legal fields are not only presumed competent but are also held accountable. If a doctor misdiagnoses a patient and prescribes the wrong treatment, they can be sued. If they do it chronically, they can lose their license, go to prison, or both.


AI is not accountable. This might be marginally acceptable if it weren't for the fact that AI is a chronic liar. Which brings us to our next reason not to use AI to draft legal agreements:



AI is Not Accurate


The fact that AI is not 100% accurate is probably known to most people who use it. But what you might not know is that it is impossible for AI to be 100% accurate. That includes everything from chatbots to LLMs to self-driving cars.


The main reason for this is simply that AI does not “understand” the world the way humans do. AI uses algorithms and, in the case of LLMs, pattern recognition from existing data. It has no consciousness, intuition or common sense.


Even IF the AI we use now could achieve 100% accuracy, the world is a dynamic system. Patterns and meanings are changing all the time (see the new words added to the Cambridge Dictionary in 2025 alone; i.e., what the skibidi are we relying on AI for?!).


The world and its data are so dynamic that whatever brief 100% accuracy an AI might achieve would quickly erode until the AI was retrained on more up-to-date data. By the time that training was complete, however, the data would have changed again. Training will always lag behind the dynamism of the data. 100% accuracy is impossible.


The lack of 100% accuracy perhaps doesn’t matter so much when asking a bot to suggest a delicious but affordable meal you can make at home with basic cooking skills (the answer is: garlic butter pasta with vegetables and a fried egg). But when asking for medical advice, particularly treatments for symptoms, you can quickly put yourself in danger.


Using AI to draft legal contracts introduces similar dangers. The nature of AI’s inaccuracy means not just that its answers may contain errors, but that, over a document of any length, they will. Even an AI with 99% accuracy means that, on average, 1 out of every 100 words will be wrong. Or 1 out of every 100 characters.
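A quick back-of-the-envelope calculation shows why errors become a near-certainty at contract length (the 99% per-word figure and the assumption that errors strike each word independently are ours, purely for illustration):

\[
P(\text{error-free draft}) = 0.99^{N}, \qquad N = 2000 \;\Longrightarrow\; 0.99^{2000} \approx 1.9 \times 10^{-9}.
\]

In other words, even a generously accurate AI has essentially no realistic chance of producing a flawless contract of a few thousand words.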


But what would a single character matter? you ask. It couldn't matter that much, could it?


In the 2017 Maine case of O’Connor v. Oakhurst Dairy, a single missing comma became the source of a lawsuit brought by delivery drivers against a dairy company. The language appeared in a statutory exemption in Maine’s wage-and-hour law, which exempted employers from paying overtime to certain workers involved in the production and delivery of certain food products.


The exemption, which defined the activities exempt from overtime, read as follows:


“...[t]he canning, processing, preserving, freezing, drying, marketing, storing, packing for shipment or distribution of: (1) Agricultural produce; (2) Meat and fish products; and (3) Perishable foods.” [emphasis added]


Can you tell where the missing comma should have gone? The key phrase was “packing for shipment or distribution of.” Without a comma after “shipment,” one could argue (as the delivery drivers did) that the phrase covers only packing, whether for shipment or for distribution, and does not cover the act of “distribution” itself, making distribution work subject to overtime.


The dairy company argued that the comma was implied, and that the phrase described two separate and distinct activities: “packing for shipment” and “distribution.”


The courts ultimately ruled in favor of the drivers, and the dairy company paid $5 million in overtime wages.


It is astronomically unlikely that a contract drafted using AI would have a missing comma that costs you $5 million. But whether it’s a comma, a word or a misstated phrase, it is 100% likely that a contract drafted by AI will have an error of one form or another. And errors in contracts have consequences.


Lawyers are human and can make mistakes like any human. But their training and expertise differ in a fundamental way from AI. Lawyers know what words mean. AI doesn’t.


Depending on the contract, some sections are more important than others, and some errors will matter more than others. Because of the dynamism of language and the limitations of a bot trained on patterns, there are nuances the AI might not understand.


That’s not to say one is certain to be doomed by using AI to draft a contract. But it is playing Russian roulette with something that could impact your life in a significant way. That’s the thing about a legal contract: it is an instrument of protection, but also one of vulnerability, for all parties.


So why not use AI to draft a legal contract and then have a human lawyer review it? Good question. Here’s why you shouldn’t do that either:



AI Cannot Jump Out of a Plane


Hiring a lawyer to draft a contract is a little like learning to skydive. It’s never sink or swim (or, I guess, plummet). The best lawyers take great care to explain what a contract means, why certain language is needed, what that language does and doesn’t do for you, just as a skydiving instructor will walk you through everything from the gear you'll be wearing to the posture you need to assume while in free fall to how to land.


The lawyer preps and packs the parachute, so that when you jump out of the plane together, in tandem, the parachute will deploy reliably and safely and get you both to the ground.


Bringing a lawyer a contract written by AI is like bringing a parachute prepped and packed by someone else to your skydiving instructor and saying: "Put this on and let's go jump out of a plane together."


The lawyer is now not working with materials they know, sourced from providers they trust, prepped and packed the way they believe will get both of you safely to the ground. A conscientious lawyer will NOT rely on a single word of a contract prepared by someone else, AI or not. That parachute must be unpacked, checked, gone over piece by piece, buckle by buckle, line by line, word by word. Both of your lives depend on it.


It is a tremendous amount of additional work, and it will cost you much more than simply letting the lawyer use their own materials, which they’ve used a hundred times without fail.



You Shouldn't Use AI to Draft Legal Contracts


Legal contracts are too important to trust anyone but a qualified attorney and member of the Bar to draft them. The one exception might be if you don’t intend to use the contract at all but simply want to educate yourself about a particular agreement: the meaning and intent of its various sections, and why certain language would be used and not any other.


AI can be a great educational tool in the abstract. But for any practical purpose, legal or otherwise, caveat consumptor: user beware! Hire a lawyer and let them do their job. It will cost less and will be much more certain to protect you and your interests.





Spiller Law is a San Francisco business, entertainment and estate planning law firm. We serve clients in the San Francisco Bay Area, Silicon Valley, Los Angeles, and California. Feel free to arrange a free consultation using the Schedule Appointment link on our website. For other questions, call our offices at 415-991-7298.

The information provided in this article is for general informational purposes only and should not be construed as legal advice or opinion. Readers are advised to consult with their legal counsel for specific advice.
