Governance Insights 2026: A Preview of 2026

AI in the Boardroom

As a general rule, company personnel should not upload confidential material to publicly available AI software (e.g., an open chatbot to which you can upload documents and run queries). Without appropriate protocols in place, such information could be accessed by the software developer’s personnel and, where the software trains itself on the information, could be incorporated into the pool of data used to produce results for other users. If a user does not have a reasonable expectation of privacy from the software, it is unlikely that the confidentiality of the information would remain protected. Without confidentiality, any privilege attaching to the information could also be waived.

It is critical, then, that directors and counsel use only AI software that has been approved by the company’s legal and IT teams. Such enterprise software should offer best-in-class cybersecurity and come with contractual terms ensuring that the developer’s personnel will not have access to the information and that the information will not be used to train the AI. As our understanding of how AI models operate evolves, so too will the contractual terms used to protect the confidentiality of information uploaded to them.

Even where enterprise-approved software is used, automatic transcription and recording features for meetings and calls should be disabled in the ordinary course. For one, such software may stifle conversation and impede thoughtful deliberation. More importantly, for board meetings, the official, board-approved minutes should generally be the only record of the meeting. Minutes can prove critical to demonstrating the board’s deliberations, establishing a business judgment defence and evidencing that directors met their standard of care. Discretion and judgment are essential to preparing a faithful record that serves that purpose. Even if a transcript is relied upon only as a first draft or an aide-memoire, it may be discoverable in litigation and could undermine the settled minutes, and copies may be shared outside the zone of confidentiality and privilege. For these reasons, the default should be to have these recording features turned off. A chatbot log presents similar challenges and should be used with caution.

Of course, outside the ordinary course (such as where litigation holds, discovery or similar legal requirements for data retention apply), different considerations may be at play, and counsel should be closely involved in ensuring compliance.

In-house counsel and directors should consider establishing an AI policy, or supplementing an existing one, to address these considerations. The policy should require that counsel and directors use only enterprise-approved technology that has been vetted for strict confidentiality and privacy requirements (including data-storage restrictions). It should also address the risks of using recording software and ensure that, when it is used, appropriate safeguards are in place, including that any privacy consent needed in respect of retained personal information has been obtained. Software licences should contain contractual safeguards that address confidentiality, data residency and indemnification for data breaches and unauthorized use or access.

Regular director training and reminders at the outset of board meetings may also prove welcome and necessary.
