If you can't tell me right now which AI tools every attorney at your firm is using, what client data they've input, and whether they've verified their output -- you have an audit problem. Only 38% of law firms have written AI policies, and fewer than 15% conduct any form of regular AI usage review. The gap between 'we have AI tools' and 'we govern AI tools' is where malpractice risk, privilege waiver, and ethical violations live.
An annual AI audit isn't bureaucracy -- it's risk management. Here's the framework.
Why You Need an AI Audit (Even If You Think You Don't)
Three scenarios that an audit catches before they become problems:
Scenario 1: Shadow AI. Associates using personal ChatGPT accounts to draft client memos. They didn't mean to violate policy -- they just didn't know there was a policy, or the approved tool was slow that day. A 2025 survey found that 47% of lawyers who use AI have used consumer-grade tools for client work at least once. Your firm is probably no exception.
Scenario 2: Drift from policy. You launched with a training program and verification requirements 8 months ago. Since then, verification has gotten 'lighter.' Some attorneys skip the citation-checking step on routine matters. The tool's terms of service changed and nobody noticed. Policies erode without active enforcement.
Scenario 3: Unapproved tools. A paralegal found a $30/month contract analysis tool and started using it. It's not on the approved list. Nobody vetted its data handling. Client information is flowing through an unreviewed third party. This happens at every firm that doesn't actively monitor AI tool usage.
The audit catches all three. It's not about punishment -- it's about identifying gaps, updating policy, and ensuring your firm's AI practices match your firm's AI policy.
The Annual AI Audit Framework: 5 Domains
Structure your audit around five domains:
Domain 1: Tool Inventory
What AI tools are approved? What tools are actually being used? (These are often different lists.) Audit every device, software subscription, and browser extension. For each tool, record: name, vendor, tier (consumer/enterprise), DPA status, and last security review date.
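The inventory fields above lend themselves to structured records, so lapsed DPAs and overdue security reviews surface automatically instead of depending on memory. A minimal sketch in Python; the tool names, vendors, and the 365-day review cycle are illustrative assumptions, not firm requirements:

```python
from dataclasses import dataclass
from datetime import date, timedelta

@dataclass
class AIToolRecord:
    """One row of the tool inventory: the fields named in Domain 1."""
    name: str
    vendor: str
    tier: str                    # "consumer" or "enterprise"
    dpa_signed: bool
    last_security_review: date

    def review_overdue(self, max_age_days: int = 365) -> bool:
        # Flag tools whose last security review predates the audit cycle.
        return date.today() - self.last_security_review > timedelta(days=max_age_days)

# Hypothetical entries for illustration only.
inventory = [
    AIToolRecord("ExampleDraft", "Vendor A", "enterprise", True,
                 date.today() - timedelta(days=400)),   # review is 400 days old
    AIToolRecord("ExampleSearch", "Vendor B", "enterprise", False,
                 date.today() - timedelta(days=60)),    # recent review, but no DPA
]

# Anything with a missing DPA or a stale review goes on the audit's action list.
flagged = sorted(t.name for t in inventory if not t.dpa_signed or t.review_overdue())
```

A spreadsheet works just as well; the point is that DPA status and review dates are recorded per tool, not reconstructed at audit time.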
Domain 2: Data Handling Compliance
Are DPAs current and signed for every approved tool? Have vendor terms of service changed since the last review? Is data retention configured correctly? Are zero-retention options enabled where available? Is client data being input into unapproved tools?
Domain 3: Verification Practices
Are attorneys following the verification workflow? Pull a sample of 10-20 AI-assisted work products and check whether citations were verified, analysis was reviewed, and documentation exists. Interview 5-10 attorneys about their actual (not theoretical) verification process.
Domain 4: Ethical Compliance
Are court disclosure requirements being met? Are clients being informed of AI use per engagement letter language? Are billing practices compliant (actual time, no AI-hour inflation)? Has any attorney submitted unverified AI output?
Domain 5: Training and Awareness
Have all attorneys completed required AI training? Have new hires been trained? Has the training been updated to reflect tool and policy changes? Can attorneys articulate the firm's AI policy when asked?
The Audit Process: Step by Step
Week 1: Preparation
- Assign the audit lead (ideally the managing partner, risk committee chair, or general counsel -- not IT alone)
- Distribute an anonymous AI usage survey to all attorneys and staff
- Request current DPAs, SOC 2 reports, and vendor contracts from admin/IT
- Pull a random sample of 20 matters from the last 6 months for work product review
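The 20-matter sample should be drawn randomly and reproducibly, so next year's auditor can verify the draw wasn't cherry-picked. A sketch using only Python's standard library; the matter-ID format and the seed value are illustrative assumptions:

```python
import random

# Hypothetical matter IDs opened in the last 6 months, however your DMS exports them.
matter_ids = [f"2025-{n:04d}" for n in range(1, 181)]   # 180 matters

SEED = 20250401   # record the seed in the audit file so the draw can be reproduced
sample = random.Random(SEED).sample(matter_ids, k=20)   # 20 distinct matters, no replacement
```

Recording the seed alongside the findings means the same 20 matters can be re-derived if the sample is ever questioned.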
Week 2: Data Collection
- Review survey results -- focus on tool usage patterns, shadow AI indicators, and verification practices
- Audit IT records for AI tool subscriptions, licenses, and browser extensions
- Review the 20 sampled matters for AI disclosure, verification documentation, and billing compliance
- Check vendor DPAs against current terms of service (vendors change terms; DPAs may be outdated)
Week 3: Interviews and Analysis
- Interview 5-10 attorneys across practice groups about their actual AI workflows
- Interview IT/legal ops about tool management and access controls
- Identify gaps between policy and practice
- Compile findings into a risk-rated report (critical, moderate, low)
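The risk-rated report can be as simple as a list of findings sorted so critical items lead the document. A sketch; the findings shown are hypothetical examples, not audit results:

```python
# Lower rank = higher priority; the three tiers named in Week 3.
SEVERITY_RANK = {"critical": 0, "moderate": 1, "low": 2}

# Hypothetical findings compiled during Weeks 1-3.
findings = [
    {"severity": "low", "issue": "Training records incomplete for two new hires"},
    {"severity": "critical", "issue": "Client data found in an unapproved consumer tool"},
    {"severity": "moderate", "issue": "Vendor changed its terms of service; DPA not re-reviewed"},
]

# Sort so the report opens with what must be remediated first.
report = sorted(findings, key=lambda f: SEVERITY_RANK[f["severity"]])
```

Whatever the format, the ordering matters: critical gaps get addressed in Week 4 before anything else.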
Week 4: Remediation and Reporting
- Present findings to the managing partner/management committee
- Update the AI policy based on findings
- Address critical gaps immediately (unapproved tools, lapsed DPAs, missing verification)
- Schedule follow-up training for identified deficiencies
- Set the next audit date (annual minimum; semi-annual recommended for large firms)
Total time investment: 40-60 hours of partner/senior attorney time. For firms over 50 attorneys, consider engaging an outside consultant for the first audit.
The Audit Checklist
Print this. Use it. Check every box.
Tool Inventory:
- [ ] Complete list of all AI tools used at the firm (approved and unapproved)
- [ ] Enterprise vs. consumer tier verified for each tool
- [ ] DPA signed and current for every approved tool
- [ ] Unapproved tools identified and either approved (with DPA) or blocked
- [ ] Browser extensions and mobile apps included in the inventory

Data Security:
- [ ] SOC 2 Type II reports current (within 12 months) for all approved vendors
- [ ] Data retention settings verified (zero-retention enabled where available)
- [ ] Vendor terms of service reviewed for changes since the last audit
- [ ] SSO/SAML configured for all tools that support it
- [ ] Access controls verified (RBAC, user provisioning/deprovisioning)

Verification Practices:
- [ ] Random sample of AI-assisted work products reviewed (minimum 20)
- [ ] Citation verification documented in sampled matters
- [ ] No instances of unverified AI output submitted to courts or clients
- [ ] Verification workflow matches current policy requirements

Ethical Compliance:
- [ ] Court AI disclosure requirements met in all filings
- [ ] Client AI disclosure included in engagement letters
- [ ] Billing records show actual time (no AI-hour inflation)
- [ ] No privileged information input into consumer AI tools

Training:
- [ ] All attorneys have completed required AI training
- [ ] New hires trained within the first 30 days
- [ ] Training materials updated to reflect current tools and policy
- [ ] Training completion documented and filed
Who's Responsible: The AI Governance Structure
An audit is only useful if someone owns the results. Here's the governance structure that works:
The AI Committee (or designee): At firms with 20+ attorneys, create a 3-5 person AI committee: the managing partner (or a delegate), one litigation partner, one transactional partner, the IT director, and a risk/compliance lead. At smaller firms, designate one partner as the AI governance lead.
Responsibilities:
- Own the AI policy and its annual review
- Approve or reject new AI tools (with vendor due diligence)
- Commission and review the annual audit
- Respond to AI-related incidents
- Report to the management committee quarterly
The AI Champion (per practice group): A practicing attorney who uses AI daily, supports colleagues, provides feedback to the AI committee, and flags issues before they become problems. This is a peer support role, not a governance role.
IT/Legal Operations: Manage tool access, monitor for unapproved tools, maintain DPA files, configure security settings, and support the audit data collection process. They're the operational layer, not the decision-making layer.
Managing Partner: Ultimate accountability for AI governance under Model Rules 5.1 and 5.3. Can't delegate the responsibility -- can only delegate the execution. If a privilege waiver or malpractice claim arises from AI use, the managing partner is accountable.
This structure costs nothing beyond allocated time. It turns AI governance from an abstract concern into an operational reality with named owners and clear accountability.
The Bottom Line: Only 38% of firms have AI policies and fewer than 15% audit AI usage. An annual audit across five domains -- tool inventory, data handling, verification practices, ethical compliance, and training -- takes 40-60 hours and catches shadow AI, policy drift, and unapproved tools before they become malpractice claims. Assign a governance owner, use the checklist, and audit at least annually.
AI-Assisted Research. This piece was researched and written with AI assistance, reviewed and edited by Manu Ayala. For deeper takes and the perspective behind the research, follow me on LinkedIn or email me directly.
