Using artificial intelligence to help streamline contract writing is now an option for facilities professionals.
By Jason Henninger
Contracts are essential to any business, providing legal parameters to agreements between the parties involved. Writing them, however, requires skill, knowledge and time that facilities managers or their partners might not have.
Writing contracts can be time-consuming and complex — or expensive if the writing task is given to a third party. The same is also true for analysis or revision. This is where the ever-expanding world of artificial intelligence (AI) comes into play.
Connexus spoke to John Delligatti, Director, Digital Supply Chain Transformation at SDI, and Glen Schrank, CEO at Phoenix Energy Technologies, to get their expert takes on the uses and concerns surrounding AI in contract writing for facilities managers and their customers.
AI-powered contract writing can help businesses draft contracts faster and potentially more accurately than someone without adequate training could manage. AI can compare various versions of legal documents and flag any discrepancies between them. This helps to ensure that contracts are accurate and legally up to date.
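The version-comparison idea above can be sketched in a few lines. This is a toy illustration, not how any particular AI product works: real review tools rely on trained language models, while this sketch uses a simple line diff from Python's standard library. The function name `flag_discrepancies` is hypothetical.

```python
import difflib

def flag_discrepancies(old_text: str, new_text: str) -> list[str]:
    """Return the lines that differ between two contract versions.

    A minimal sketch: production AI review tools use language models,
    not a plain line diff, but the goal is the same -- surface what
    changed so a human can inspect it.
    """
    diff = difflib.unified_diff(
        old_text.splitlines(), new_text.splitlines(),
        lineterm="", n=0,
    )
    # Keep only added (+) and removed (-) lines, skipping diff headers
    return [line for line in diff
            if line[:1] in "+-" and line[:3] not in ("+++", "---")]

old = "Term: 12 months\nNotice period: 30 days"
new = "Term: 12 months\nNotice period: 60 days"
for change in flag_discrepancies(old, new):
    print(change)
```

Run on the two sample clauses, the function prints the removed 30-day line and the added 60-day line, pointing a reviewer straight to the changed notice period.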
“Using AI to help negotiate contracts ensures they go to the right service providers,” Delligatti said. “This allows users to negotiate the proper terms, saving time. When generating drafts, AI can inform you about what your industry peers have negotiated, providing a useful reference point for your own negotiations.”
AI leverages the power of natural language processing (NLP) to analyze existing contracts and extrapolate the structure for new ones. NLP utilizes computer-based language modeling combined with other forms of machine learning and data analysis to decipher and create something approximating meaningful communication.
All of which is to say that NLP, machine learning and AI itself are means to recognize, sort and rearrange data. Consequently, no AI will ever be more “intelligent” than the data pool it accesses, but within that context, it can create a wide variety of useful material, like contracts. As Delligatti said, AI tools are “only as good as the information you feed them.”
“AI needs intelligence to learn, which comes from people,” Schrank added. “The more domain expertise and people you have involved in the process, the better. AI needs data. The more data you have, the more AI can learn from the data.”
While AI-derived contracts will never replace lawyers, they are every bit as legally binding as a human-drafted contract. AI should therefore be considered a tool rather than a magical solution for creating content. Tools require someone to operate them. Consider a 3D printer: It can be used to make all kinds of objects but still requires human design, adjustment and supervision to work to its full potential.
Using AI in Contract Analysis
When analyzing a contract, AI can identify key terms that may pose a risk to the business, such as expiration dates, compliance obligations or other terms that might otherwise be missed. This helps organizations respond quickly and accurately when questionable elements are identified in contracts, potentially flagging suspicious activity or uncovering fraud.
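A simplified sketch of that key-term flagging might look like the following. The patterns and the function name `flag_key_terms` are illustrative assumptions; a production tool would use a trained NLP model rather than hand-written regular expressions.

```python
import re

# Hypothetical patterns for terms a reviewer might want surfaced --
# expiration dates, auto-renewal clauses, compliance obligations.
RISK_PATTERNS = {
    "expiration": re.compile(r"\b(expires?|expiration|termination date)\b", re.I),
    "auto-renewal": re.compile(r"\bauto[- ]?renew(al|s)?\b", re.I),
    "compliance": re.compile(r"\b(comply|compliance|regulatory)\b", re.I),
}

def flag_key_terms(contract_text: str) -> dict[str, list[str]]:
    """Map each risk category to the sentences that mention it."""
    sentences = re.split(r"(?<=[.;])\s+", contract_text)
    hits: dict[str, list[str]] = {}
    for label, pattern in RISK_PATTERNS.items():
        matched = [s for s in sentences if pattern.search(s)]
        if matched:
            hits[label] = matched
    return hits

sample = ("This agreement expires on June 30. "
          "It shall auto-renew for successive one-year terms. "
          "Vendor shall comply with all applicable regulations.")
print(flag_key_terms(sample))
```

Given the sample text, each of the three categories is flagged along with the sentence that triggered it, so a reviewer can jump directly to the clauses that matter.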
The larger and more diverse the portfolio of facilities involved, the greater the potential time savings of AI contract analysis. “Instead of having to read dozens or hundreds of pages, AI says, ‘Hey, on page 42, this is different from your standard terms,’ or, ‘This is different than the industry standard,’” Delligatti said. “AI is saying, ‘I’m interpreting the contract based on the data that I’ve been fed on your industry and your business, and this is potentially abnormal. Or maybe this is something you want to use as a negotiation point.’ This allows you to focus on higher-value tasks, rather than doing a lot of reading.”
There is no current legal requirement to disclose to all parties whether a contract involves AI. On this matter, Delligatti and Schrank offered different opinions, with Delligatti considering such disclosure to be optional and Schrank feeling all parties should be informed. Both agree, however, that human involvement is crucial to minimize error.
“Any use of a new technology that's not done in a pragmatic, cautious way actually has negative outcomes on the back end,” Schrank said. “Perhaps the biggest risk in this case is overreliance. Since machines lack human judgment, logical errors and other mistakes might pop up and should not be ignored.”
In addition, AI-generated contracts may lack the legal sophistication needed for complex transactions. Safeguarding against this requires people to carefully monitor the use of the AI tool itself and to review the document with a critical eye rather than assuming the computer-generated content will be perfect.