The Clio Legal Insights Report for 2026 has exposed a stark governance breakdown in UK law firms: 89% of legal professionals deploy AI tools, yet only 7% of clients remember being informed. The disconnect between firm-reported disclosure rates (81%) and actual client awareness reveals a profession-wide governance failure.

The scale of AI adoption in UK law is now undeniable. The Clio Report shows that 89% of legal professionals use AI tools, with 70% having adopted them within the past year, a rate of uptake that rivals the broader financial services sector. Yet this rapid deployment has outpaced institutional governance frameworks. Most troublingly, while 81% of firms claim they disclose AI use to clients, only 7% of clients recall receiving such disclosure. This yawning gap between firm perception and client experience is not a communications problem; it is a regulatory risk. Under the SRA Principles and the Code of Conduct for Solicitors, RELs and RFLs, solicitors must act in a way that upholds the constitutional principle of the rule of law and the proper administration of justice (Principle 1), must not mislead clients (paragraph 1.4), and must give clients information in a way they can understand so that they can make informed decisions about how their matter will be handled (paragraph 8.6). AI-assisted legal work that is not transparently disclosed risks breaching these obligations. Trovix Watch alerts law firms to SRA guidance updates, but many firms appear unaware that existing disclosure obligations already extend to material use of AI in their work.

The governance picture darkens further when formal policies are examined. The Clio data reveal that 17% of UK law firms have no formal AI policy despite actively encouraging or permitting AI use among their fee-earners. This creates a compliance void: without documented AI policies, firms cannot evidence compliance with SRA Code requirements on competence (paragraph 3.2), supervision and accountability for work carried out by others (paragraph 3.5), client confidentiality (paragraph 6.3), or data protection (the UK GDPR and the Data Protection Act 2018). Partners and principals are exposed to regulatory investigation if AI-assisted work causes client harm and the firm has no governance framework to demonstrate due diligence. The absence of formal policy is particularly alarming because it suggests AI adoption is happening at operational level, with fee-earners experimenting with ChatGPT, Claude and other generalist tools, without supervision or audit. Trovix Audit can help map AI usage across matter workflows and create governance dashboards, but the fundamental issue is that many firms have not yet built the institutional structures within which such tools can operate safely.

The SRA's regulatory expectations on this issue are already clear, even though they have not yet been fully articulated as formal rules. The SRA's 2023 guidance on technology and innovation emphasised that firms deploying new technologies must conduct impact assessments, ensure the competence of users, and maintain client confidentiality and data security. The regulator's position has only hardened: in parallel with the Treasury Committee's warnings on AI governance in financial services (January 2026), the SRA has signalled that it expects law firms to adopt formal AI policies and impact assessment processes. The Clio data suggest that at least 17% of firms are currently exposed to regulatory action if the SRA chooses to enforce these expectations through its thematic review programme. Many firms may also be engaging UK GDPR Article 22 rights (decisions based solely on automated processing) and breaching their anti-money laundering obligations under the Money Laundering Regulations 2017 if AI is used in client onboarding or AML/KYC processes without appropriate safeguards and client transparency. The transparency deficit revealed by Clio is not merely a trust issue; it is a conduct breach waiting to be enforced.

For individual solicitors and partners, the governance gap creates personal exposure under the SRA's equivalent of the SM&CR accountability regime. The SRA's governance standards for sole practitioners and partnerships require that principals and partners take reasonable steps to ensure compliance across their practice. A principal who permits AI use without a formal policy, an impact assessment or documented client consent is exposed to SRA enforcement action and potential disciplinary sanctions. The Clio Report should prompt an immediate audit within every UK law firm: What AI tools are currently in use? Which client matters involve AI-assisted work? Have clients been told? Is there a documented AI policy? What data security and GDPR controls are in place? Trovix Watch provides real-time alerts to SRA guidance and enforcement cases, helping firms stay abreast of evolving regulatory expectations. But the deeper work is institutional: law firms need to move from ad-hoc AI experimentation to governed, documented, transparently disclosed AI use. The profession's reputation for trustworthiness, central to the rule-of-law principle embedded in the SRA Principles, depends on closing the disclosure gap now.

Source: Law Gazette
