
AI in Legal Practice: Current Temperature Check

Nick Thayer · Mar 26 · 5 min read

Written by Nick Thayer, Lex Tecnica Law Clerk


The legal profession is split. Some lawyers are using AI for drafting, translation, timekeeping, and citation checks. Others are cautious because the tools can still be wrong and the lawyer remains responsible. Both views are reasonable. AI can help, but it does not remove the duty to verify.



A quick look back to make sense of today

When calculators arrived, schools argued about them until rules and training caught up. The same thing happened with personal computers in law offices. AI is following that pattern. We are in the stage where firms, courts, and bars are setting expectations, and where adoption depends on culture and policy as much as it does on features.


What I’m seeing across firms right now


1) Useful, but not an automatic time saver. AI speeds up first drafts of memos, motions, letters, emails, and summaries. It is also good at translation and tone adjustment. Lawyers still need to verify law, facts, and citations. The biggest relief today shows up in administrative work, not final legal analysis.


2) Accuracy is the line that matters. No tool can guarantee perfect accuracy. If a system can hallucinate a case or misstate a fact, the lawyer must check it. Vendors generally will not accept responsibility for legal errors in your work. That is the core reason many lawyers treat AI as a helpful assistant that always needs supervision.


3) Cost and contracts require discipline. Many tools are expensive. Product cycles are fast. Firms that avoid lock-in, run short pilots, and demand clear roadmaps are getting better outcomes. One-year terms are the norm for a reason.


4) Culture makes or breaks adoption. Fear often comes from loss of perceived control. Lawyers get comfortable when they see exactly how a tool supports specific tasks. Adoption rises when leaders expect usage, pair skeptics with power users, and make success visible in weekly meetings.


5) Clear roles for custom and vendor tools. Some firms build or tailor tools on top of models from OpenAI, Anthropic, or Google because vendor pricing does not fit their use cases. Others mix vendor products with internal connectors and ask vendors to maintain the integrations. The common thread is to pick partners who will still be here in 18 months and who will adjust the product to your workflow.


6) Real, simple entry points. Translation is a clean way to start because most practice groups need it and the risks are manageable. Timekeeping assistance is another because it reduces daily drag and improves billing hygiene. Email summarization and meeting-note cleanup are similar quick wins.


7) In-house is moving faster. Corporate legal teams use AI to do a strong first pass on research and memos, then escalate fewer and better-framed issues to outside counsel. Local models are an option for teams that want everything to stay on their own machines, though they require more technical support.


8) Policy beats hype. Every firm needs an AI usage policy that names approved tools, bans unverified AI citations, explains data handling, and requires training. This also reduces “shadow AI,” where people use public tools without safeguards.



Where the help shows up today


  • Drafting and formatting. Turn bullets into a memo, a memo into a client email, or a rough paragraph into a clean outline. The lawyer still edits and checks authorities.

  • Citation and record checks. Tools can hyperlink sources, flag missing or suspect citations, and point to record support. Treat this as a second set of eyes.

  • Administrative relief. Time capture, time-entry narratives, email and meeting summaries, and task lists. This reduces write-offs and daily time reconstruction.

  • Matter organization. Timelines, issue lists, and quick briefs that help a team align before deep work begins.

  • Training and onboarding. Concrete examples tied to a practice group’s daily work demystify AI and lower resistance.


Why the time savings are limited today


  • Verification is mandatory. If a tool can be wrong, you have to check it. That review narrows the raw time savings, especially on high-stakes filings.

  • Ethics do not change. Competence, confidentiality, communication, and fees still apply. Lawyers must know what a tool retains, whether chats are used for training, and how to disclose use to clients or courts if required.

  • Billing model friction. Hourly work can hide or punish efficiency unless pricing changes. Fixed-fee and portfolio work see clearer gains.

  • Tool sprawl and fatigue. Juggling too many tools with unclear value burns time. Short pilots, clear goals, and real usage metrics protect energy and budget.

  • Adoption needs structure. Buying licenses does not create usage. Teams adopt when leaders assign specific tasks to be done with AI and then share wins and lessons.


What cautious lawyers are asking for

  • Accuracy guarantees that matter. Until a vendor accepts responsibility for errors in legal work product, lawyers will verify everything. Vendors rarely accept that level of risk and likely never will, so verification remains the lawyer's job.

  • Transparent data handling. Opt-outs from model training, clear retention limits, and firm-level controls.

  • Audit trails. Source links, citation maps, and record pointers that make it fast to check the work.

  • Pricing that matches use. Costs that scale with outcomes or seats that actually get used.


A simple playbook firms can run now


  1. Remove fear with real demos. Show exactly how the tool supports a task a lawyer already does. Explain what it cannot do.

  2. Map workflows by practice area. Pick one low-nuance, high-volume step per group as the first target.

  3. Write and sign a policy. Approved tools, data rules, verification rules, and disclosure guidance.

  4. Start with volunteers. Pilot with lawyers who want the tools. Track usage and one concrete outcome, like “20 percent faster to first draft of X.”

  5. Prevent shadow AI. Make approved tools easy to reach and safe by default.

  6. Align incentives. Recognize people who adopt and teach. Expect some attrition from those who refuse.

  7. Keep contracts to a year. Reassess with real usage data.

  8. Work with partners, not just vendors. Ask for configuration, integrations, and access to someone technical.

  9. Share wins weekly. Put prompts, outputs, and measured results on the agenda.

  10. Price like a business. Charge in a way that covers technology costs and is clear to clients.


Notes from recent rooms that reflect this moment


  • Many lawyers want to use AI but do not see big time savings beyond drafting because they still verify everything.

  • The risk of sanctions or reputational harm outweighs any promise of speed if you do not verify.

  • Some firms are building custom tools or asking vendors to integrate via API when pricing or features do not fit.

  • Adoption improves when firms state clear expectations, appoint power users, and treat AI as a normal part of the job.

  • Translation is a strong starting point because every practice group runs into it.

  • Innovation burnout is real. The market is crowded and will consolidate. Short contracts help.

  • In-house teams are saving money with strong first passes and better internal memos before sending work outside.

  • Lawyers are unlikely to lose jobs to AI, but they can lose work to lawyers who use it well.


What the next 12 to 24 months likely bring


  • More policies, more training, and fewer unknowns about data handling.

  • Better citation tracing and record linking that shorten the review pass.

  • Clearer pricing models and firmer expectations from clients about how AI influences fees.

  • Consolidation among vendors, which reinforces the value of short contracts.

  • Continued responsibility resting with lawyers for what gets filed and told to clients.


Bottom line


Treat AI as an assistant that writes quickly, organizes well, and needs supervision. Use it for repeatable tasks where it already performs. Set policy, prevent shadow use, and price your services so the economics make sense. The standard in legal practice is still zero tolerance for fabricated law and facts. Until tools consistently meet that standard, verification is not optional. That said, I believe the point where we can place more trust in these outputs is approaching.



This article has been reviewed and approved for legal accuracy by Kathleen Ireland, Esq., and Scott Whitworth, Esq. It is intended for informational purposes only and does not constitute legal advice.
