Publish date

4 December 2023

AI: Avoiding and managing disputes – preventing and mitigating the risk of contractual disputes

Organisations of all sizes, across sectors, are making increasing use of artificial intelligence (AI). Although the use of AI has the potential to drive efficiencies and to improve product or service delivery, it can also expose those who make use of it in their operations, as well as the providers of AI systems, to a range of potential legal risks and liabilities.

Some of the risks arise from:

  • The rapid pace at which AI is developing, which outstrips our ability to understand its application, decision-making processes and limitations
  • The introduction of new regulations, and uncertainties around how the courts might apply existing legal principles relating to contract law and negligent acts or omissions
  • Questions over ownership and usage rights not only of the intellectual property (IP) in the AI itself, but also of the data it uses and the material it generates.

Contract disputes with suppliers of AI systems

What if you contract with a supplier of AI and the system does not perform as expected? The law implies various terms in commercial supply contracts under the Sale of Goods Act 1979, including that any “goods” supplied by a trader meet their description, are of satisfactory quality and fit for purpose.

However, AI might not necessarily meet the definition of “goods” to bring these terms into play. If AI is supplied on a physical storage medium, then it might qualify, but if you are simply provided with software then these terms will not be implied.

Some of the uncertainty can be removed by putting a written contract in place that details the specification and outcomes required. Even with a detailed and carefully thought out contract, however, there may still be scope for disagreement.

Does the contract contain a clear specification that is measurable? How do the parties then go about actually measuring whether the AI meets that specification?

There may be questions over whether any issues are caused by defects in the AI system itself, the data used to train it and to generate the output, or the applications to which it is connected – this has the potential to create a blame game.

Contract disputes with service providers who use AI

Liability disputes might arise for businesses that use AI to deliver their services, when it fails to function as expected.

Unless properly excluded, the Supply of Goods and Services Act 1982 imposes a contractual obligation on a service provider to carry out the service with reasonable care and skill. The question of whether it was reasonable for the service provider to use AI when providing the service, and what measures were in place to oversee the service delivery, might be contentious.

However, the assessment will differ depending on the nature of the service in issue, i.e. does it have the potential to impact on life and limb, or are the implications of a mistake purely financial and, if so, on what scale?

Again, a bespoke agreement will provide the best chance of limiting the scope for disagreement about whether the service provided meets what is required, but it cannot remove the risk entirely.

Mitigating the risk of contractual disputes

Some of the considerations that might mitigate the risk of contractual disputes arising in supply contracts, and for service providers using AI, include:

  • Ensuring the specification or scope of work is clear, sufficiently detailed and tailored. How will the system work, what output is required, who will be responsible for each stage of the process and how will all this be measured?
  • Allocating risks and anticipating the types of liability that may arise. For example, should indemnities be sought or provided in relation to the possible infringement of third-party IP rights, confidential information or data breaches? The same applies for warranties on performance and outputs, and the imposition (or acceptance) of exclusions or limitations of liability that might be appropriate
  • Assessing the need to comply with any specific legal or regulatory frameworks, such as data protection / privacy, and ensuring there are adequate safeguards in place, and that there is a system of monitoring and regular review for compliance
  • Will training be required to ensure that those training, using or operating the system fully understand how it works and what its limitations are? Aligned to that, ensuring processes and procedures are in place for checking and monitoring the system will be important
  • How the relationship will come to an end, and what needs to happen once it does, should also be addressed. Is a dispute resolution process required that obliges the parties to try to resolve disagreements through prescribed channels before litigation? Will there be post-termination or post-completion obligations, and does there need to be provision for handover or ongoing assistance?

Obviously there are many other factors to consider, but the important point to emphasise is that investing time and effort in producing, or understanding, a contractual proposal can help both to reduce the chances of a contract dispute arising and to manage one effectively if it does.

In part 2 of this series, we will be looking at the potential for non-contractual liabilities to arise, principally under the law of negligence.

If you have any questions about the topics raised in this article, please get in touch.
