Sunday, April 28, 2024

BusinessDay

AI policies and governance: A guide for in-house lawyers – Part II


This article was sourced from Inner Temple Library.

What else do in-house legal teams need to consider when contracting for AI? What protections should be included?
Contracting for a new AI system is much like contracting for any new IT solution. All the usual considerations apply around the scope of your licence to use the system, appropriate indemnities and the allocation of liabilities. But there are some additional points you’ll need to think about:

Data protection – if the system will process any personal data, then it’s a priority to map out how the business will comply with its data protection obligations. It’s likely that a data protection impact assessment (DPIA) will be needed. You’ll need to think about privacy notice updates if the system will result in personal data being collected for new purposes and processed in different ways.

Consider whether the AI third-party provider is a data controller or processor for data protection law purposes. Usually, in relation to IT solutions, it’s a standard controller-to-processor relationship – but the complexities of how AI systems are trained and kept up to date mean this may need a more thorough analysis.

IP and confidentiality – you’ll want to seek comfort that the input data on which the AI system has been trained doesn’t misuse another person’s confidential information or trade secrets, or infringe their IP rights, and that the processes used to develop the system are legally and ethically sound. However, given the uncertainty around IP infringement in relation to training data, many AI system suppliers will be reluctant to give this protection at this stage.

Stopping internal data being used for model training – for obvious reasons, it’s crucial to ensure that the AI system isn’t trained on any data provided by your business or its suppliers. Keeping your data in your hands and out of the AI’s learning process protects confidentiality for both your business and those you work with.

Who owns the data – we’ve already discussed the current legal challenges surrounding ownership of output. It’s essential that, between you and the supplier, your contract is clear about who has the rights over the output and what each party can do with it.

Model maintenance – AI systems need to be kept up to date and trained on current data. The contract needs to be clear on how this will be achieved and who’ll take the lead. This is likely to be an additional support service in some contracts.

What are the key points an AI staff policy should cover?

Different policies and instructions will be needed for a publicly available AI system like ChatGPT than for an enterprise version that’s been commissioned by the business.

Not all businesses allow the use of publicly available AI systems, but those that do need to be clear about what can and can’t be done with them. As already discussed, confidential information and IP shouldn’t be input into a publicly available AI system, and staff need to understand that these systems aren’t always accurate. Care needs to be taken both with the prompts used to generate an output and with the reliance placed on that output.

A business implementing an enterprise AI system should develop specific instructions based on its capabilities and how the business intends it to be used. Typically, when a business implements a new software system, its use is confined by its functionality. AI systems can be different: their uses can be wide and varied, some low-risk and others high-risk. Depending on the kind of system you implement, you’ll need to be clear about which tasks it can be used for.

Other areas to cover include an outline of the benefits and risks, the consequences of non-compliance with the policy, details of available training and other support, the use of AI systems on personal devices, and monitoring.