The Texas Bar's June 2025 AI vendor due diligence guidance and the ACC's model evaluation template finally give law firms a concrete framework for evaluating AI tools. Before these, firms were guessing. Now there's a checklist — and if you're not using it, you're accepting risk you haven't measured.

Most law firms adopt AI tools the way consumers buy software: they read the marketing page and click "agree." That approach violates ABA Opinion 512's competence and confidentiality requirements. The ethical obligation is clear — you must evaluate an AI vendor's data practices, security posture, and contractual terms before any client data touches their system. Here's exactly what to check.

No-Training Clauses: The Non-Negotiable Starting Point

If your AI vendor's terms of service allow them to use your inputs as training data, every prompt containing client information is a confidentiality breach. This isn't theoretical — consumer-grade AI tools typically default to training on user inputs unless you explicitly opt out.

The requirement: your contract must include an explicit no-training clause stating the vendor will not use your firm's inputs, outputs, or any derivative data to train, fine-tune, or improve their models or any third party's models. Don't accept vague language like "we may use anonymized data to improve our services." Anonymization of legal documents is unreliable — case details, party names, and fact patterns can be re-identified. The ACC template provides specific contract language. If a vendor won't sign a no-training clause, they're telling you something about their business model. Walk away.

Data Residency and Processing Location

Where does your client data go when it enters the AI system? Most law firms can't answer this question for their current AI tools. The Texas Bar guidance requires firms to understand data residency — the physical location where data is stored and processed.

This matters for three reasons. First, data sovereignty: some client matters involve data that can't leave certain jurisdictions (particularly relevant for EU clients under GDPR and the AI Act). Second, government access: data stored in certain countries may be subject to government surveillance or compelled disclosure without your knowledge. Third, subpoena risk: data stored on vendor servers in the US may be discoverable in litigation against the vendor. Your checklist should require: primary storage location, processing locations, whether data transits through additional jurisdictions, backup storage locations, and the vendor's response to foreign government data requests.

Breach Notification and Incident Response

When (not if) your AI vendor has a security incident, how quickly will you know? Most standard AI vendor agreements have no breach notification timeline at all, or they default to 72 hours — a window borrowed from GDPR's regulator-notification requirement, and far too slow for client data involving privileged communications.

The Texas Bar checklist requires firms to evaluate: notification timeline (demand 24 hours maximum for incidents involving legal data), what constitutes a reportable incident, whether the vendor distinguishes between breaches affecting your data versus their general systems, the vendor's incident response plan, and your firm's right to conduct independent forensic assessment. The ACC template adds: whether the vendor carries cyber liability insurance, the coverage amount, and whether your firm is named as an additional insured. If the vendor's breach exposes your client's privileged information, you need to know who's paying for the fallout.

Privilege Preservation and Ethical Walls

This is the evaluation category most firms skip, and it's the most dangerous gap. If your AI vendor's system processes data from multiple law firms, how does it prevent cross-contamination of privileged information?

The checklist items: Does the vendor maintain logical separation between client tenants? Can one firm's data influence outputs for another firm's queries (even through model fine-tuning or retrieval databases)? Does the vendor's architecture support ethical walls within your firm — so conflict-screened attorneys can't access AI outputs from restricted matters? What happens to your data when you terminate the contract — is it actually deleted, or merely "deactivated"? Demand written certification of data deletion upon termination, with a specific timeline and method. The ACC template requires vendors to confirm they can provide audit logs showing who accessed what data and when — critical for privilege disputes.

Malpractice Insurance Alignment and Liability

Here's the question nobody asks during vendor selection: does your professional liability insurance actually cover errors caused by this AI tool? Most firms discover the answer is "unclear" only after something goes wrong.

Your vendor evaluation must include a malpractice alignment check. Review your current professional liability policy for AI exclusions — some policies exclude claims arising from "automated decision-making" or "technology-assisted services." Contact your insurer and ask specifically whether AI-assisted legal work is covered. Then evaluate the vendor's own liability provisions: What does the vendor's limitation of liability look like? Most AI vendor agreements cap liability at 12 months of fees paid — trivial compared to a malpractice claim. Does the vendor indemnify you for errors in their output? Almost never. Document this gap. When your malpractice carrier asks what due diligence you performed before adopting an AI tool, this checklist is your answer.
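The five categories above reduce to a pass/fail scorecard with one blocking rule: any unmet item stops adoption. A minimal sketch of that logic, where the item names and the example vendor's responses are illustrative assumptions (not language from the Texas Bar checklist or the ACC template):

```python
# Hypothetical scorecard: one representative item per category from the
# article. Item names and descriptions are illustrative assumptions.
REQUIREMENTS = {
    "no_training_clause": "Explicit no-training clause covering inputs, outputs, and derivative data",
    "data_residency_disclosed": "Primary, processing, transit, and backup locations documented",
    "breach_notice_24h": "Contractual breach notification within 24 hours for legal data",
    "tenant_isolation": "Logical separation between client tenants; certified deletion on termination",
    "liability_reviewed": "Liability cap, indemnification, and malpractice coverage gap documented",
}

def evaluate(vendor_responses: dict) -> list:
    """Return descriptions of unmet requirements; empty list means the vendor passes."""
    return [desc for key, desc in REQUIREMENTS.items()
            if not vendor_responses.get(key, False)]

# Example: a vendor that won't sign a no-training clause fails immediately,
# no matter how strong the rest of its answers are.
gaps = evaluate({
    "no_training_clause": False,
    "data_residency_disclosed": True,
    "breach_notice_24h": True,
    "tenant_isolation": True,
    "liability_reviewed": True,
})
print(gaps)
```

The point of the structure is the default: any item the vendor hasn't affirmatively satisfied counts as a gap, which mirrors the article's posture that unanswered questions are unmeasured risk.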

The Bottom Line: AI vendor due diligence isn't a tech procurement exercise — it's an ethical obligation under ABA Opinion 512 and multiple state bar guidance documents. If you can't answer basic questions about how your AI vendor handles client data, you haven't met the competence or confidentiality bar. The Texas Bar checklist and ACC template give you the framework. Use them before your next renewal.

AI-Assisted Research. This piece was researched and written with AI assistance, reviewed and edited by Manu Ayala. For deeper takes and the perspective behind the research, follow me on LinkedIn or email me directly.