Most legal AI vendors will tell you anything to close the deal — here's the 10-point checklist that cuts through the pitch. Security, accuracy, and training data transparency are the three non-negotiables. Everything else is important but secondary. If a vendor can't give you clear answers on those three, walk away regardless of the demo.
The legal AI market is flooded with vendors, and most managing partners don't have the technical background to separate genuine capability from polished demos. The vendor knows this. They'll show you the best-case scenario in the demo and bury the limitations in the fine print. This checklist gives you the specific questions that reveal what a tool actually does versus what the sales team says it does.
The Three Non-Negotiables: Security, Accuracy, Training Data
Security. Ask specifically: "Is our data used to train your model? Is our data stored? For how long? Where are your servers? Do you have SOC 2 Type II certification? Can we get a BAA for HIPAA compliance?" If they hedge on any of these, that's your answer.

Accuracy. Ask for error rates on specific tasks: "What's your hallucination rate on case citations?" If they can't give you a number, they haven't measured it.

Training data. Ask: "What legal data was your model trained on? Are you licensed to use it? If I draft a document in your system, does it become training data for other firms' outputs?"

These three lines of questioning eliminate roughly half of vendors immediately, because they can't answer them clearly.
The 10-Point Evaluation Checklist
1. Data security and privacy — SOC 2, encryption, data residency, training data isolation.
2. Accuracy and hallucination rates — measured, documented, and specific to legal tasks.
3. Training data transparency — what data, whose data, licensed or scraped.
4. Integration — does it work with your existing tools (Clio, NetDocuments, Microsoft 365)?
5. Compliance — HIPAA BAA available? State bar ethics guidance followed? Court-specific requirements met?
6. Support and training — a dedicated legal support team, not generic tech support.
7. Pricing transparency — per-user? Per-matter? Hidden overage charges?
8. Bias and fairness — tested for bias in legal outcomes? Documented?
9. Data retention — how long are your inputs stored? Can you delete them?
10. Liability — who's responsible when the AI is wrong? What does the contract say about indemnification?
Red Flags That Should Kill the Deal
"Our AI never hallucinates" — every AI hallucinates. If they claim otherwise, they're either lying or don't understand their own product. No SOC 2 certification — this is table stakes for any tool handling legal data. Vague pricing — "contact us for pricing" often means they'll charge whatever they think you'll pay. No trial period — if they won't let you test it on real (redacted) work before committing, the demo was better than the product. Training on your data by default — some tools use your inputs to improve their model unless you opt out. This should be opt-in, not opt-out. No legal team on staff — if the company doesn't employ lawyers who understand legal workflows, the product was built by engineers guessing what lawyers need.
How to Run a Meaningful Pilot
Don't evaluate AI tools based on demos. Run a 30-day pilot with these parameters:

1. Select 3-5 specific tasks the tool claims to handle (e.g., contract review, brief drafting, research).
2. Assign 2-3 attorneys with different skill levels to use it.
3. Measure a baseline — how long do these tasks take without AI?
4. Track pilot metrics — time to complete with AI, quality of output (rated by a senior attorney), number of corrections needed, and user satisfaction.
5. Calculate ROI — hours saved x billing rate vs. subscription cost.

A tool that saves 1 hour per attorney per day at a $300/hour billing rate is worth $6,000/month per attorney (assuming roughly 20 working days). If the subscription is $500/month, that's a 12x ROI. If it saves only 15 minutes per day, the value drops to about $1,500/month, still a positive return at $500/month, but with far less margin for error if the tool underperforms.
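The ROI arithmetic above can be sketched as a quick calculator. This is an illustrative sketch, not vendor data: the 20-workdays-per-month figure is an assumption implied by the $6,000 example, and the rates are the ones from the text.

```python
# Illustrative ROI sketch for a legal AI pilot.
# All inputs are assumptions for the example, not vendor figures.

def monthly_roi(hours_saved_per_day, billing_rate, subscription_cost,
                workdays_per_month=20):
    """Return (monthly value of attorney time saved, ROI multiple vs. subscription)."""
    value = hours_saved_per_day * billing_rate * workdays_per_month
    return value, value / subscription_cost

# 1 hour/day saved at a $300/hour billing rate vs. a $500/month seat
value, roi = monthly_roi(1.0, 300, 500)
print(f"${value:,.0f}/month saved, {roi:.0f}x ROI")   # $6,000/month saved, 12x ROI

# 15 minutes/day saved: still positive, but a much thinner margin
value, roi = monthly_roi(0.25, 300, 500)
print(f"${value:,.0f}/month saved, {roi:.0f}x ROI")   # $1,500/month saved, 3x ROI
```

Running the same calculation across your pilot attorneys with their measured time savings, rather than the vendor's claimed savings, is the point of tracking a baseline first.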
Questions to Ask During the Sales Process
Ask the salesperson: "What's the most common reason firms cancel after the trial period?" (reveals real limitations). "Can I talk to a firm my size in my practice area that's been using this for 6+ months?" (reveals real-world experience). "What's on your roadmap for the next 12 months?" (reveals whether the current product meets your needs or they're selling futures).

Ask the technical team: "What model are you built on? How do you fine-tune it?" (reveals technical depth). "What happens to my data if I cancel?" (reveals data practices).

Ask your own team after the pilot: "Would you use this every day? What's annoying about it? What's missing?" The attorneys who'll actually use the tool daily should have veto power.
The Bottom Line: Security, accuracy, and training data transparency are the three non-negotiables — if a legal AI vendor can't give you clear, specific answers on those, walk away.
AI-Assisted Research. This piece was researched and written with AI assistance, reviewed and edited by Manu Ayala. For deeper takes and the perspective behind the research, follow me on LinkedIn or email me directly.
