Recently I participated in a panel discussion and Q&A session at a #RISK London event on the topic ‘A New Era for AI Governance’. One question that was put to the panel was “. . . in the absence of legislation (at the time of writing this article), or any other regulations, what can be done now to govern ourselves accordingly?”
I replied with the following tips:
Begin with an internal audit to discover which AI tools your company is using. Look at this holistically: investigate how these AI tools are procured and how they are being used. Take the development team, for example; they may be using open-access models or libraries sourced from GitHub, so look at the terms of each licence. Is it open source? Is there a copyleft risk?
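To make the development-side audit concrete, here is a minimal sketch in Python of one small piece of it: listing the packages installed in an environment together with their declared licences and flagging any that look copyleft. The keyword list and output format are illustrative assumptions of mine, and a flag here is only a prompt for proper legal review, not a determination.

```python
# Minimal sketch: list installed Python packages with their declared licences
# and flag ones that look copyleft. The keyword list is a rough, illustrative
# heuristic and is no substitute for legal review of each licence.
from importlib.metadata import distributions

COPYLEFT_KEYWORDS = ("GPL", "AGPL", "LGPL", "MPL", "EUPL")  # assumption: not exhaustive

def licence_of(dist) -> str:
    """Best-effort licence string taken from the package metadata."""
    meta = dist.metadata
    licence = meta.get("License") or ""
    classifiers = meta.get_all("Classifier") or []
    licence_classifiers = [c for c in classifiers if c.startswith("License ::")]
    return licence or "; ".join(licence_classifiers) or "UNKNOWN"

def audit() -> None:
    for dist in distributions():
        name = dist.metadata.get("Name", "unknown")
        licence = licence_of(dist)
        flag = "COPYLEFT?" if any(k in licence.upper() for k in COPYLEFT_KEYWORDS) else ""
        print(f"{name:30} {licence:60} {flag}")

if __name__ == "__main__":
    audit()
```

Running this in each project environment gives a first inventory to hand to legal or compliance; the same idea extends to other ecosystems via their own package manifests.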
The wider company might be using public AI models, typically ChatGPT, Bard and the like. Explain the risks inherent in such use: prevent employees from entering company proprietary information as query input, and scrutinise any outputs returned rather than treating them as the ultimate source of truth without independent verification. We’ve all heard about AI hallucination problems.
If the company is entering into an enterprise agreement with one of the major AI players, this is your chance to review the contractual terms and investigate whether there is scope to negotiate appropriate guardrails.
Do conduct an AI Impact Assessment, balancing the risks inherent in AI, such as discrimination and safety, against the benefits gained by adopting and using it. Thereafter, draft an AI Usage Policy, one that preserves your company’s confidential information and its intellectual property. Then train employees and explain why it is vital to guard the inputs that go into AI tools, and if you are about to put personally identifiable information into any of those tools, conduct a DPIA (Data Protection Impact Assessment) to ensure that the rights and freedoms of individuals are preserved.
Remember, a company that has an AI strategy will need proper corporate AI governance.
Good luck.