Avoiding Sanctions: Verification, Disclosure, and Compliance
The 2026 sanctions wave is real and growing. In the first quarter of 2026 alone, US courts imposed more than $145,000 in AI hallucination sanctions, including a $110,000 penalty in Oregon and Nebraska's first license suspension over AI-generated filings. By February 2026, US judicial opinions involving lawyers accused of filing AI-hallucinated content were running at roughly 30 to 50 cases per month. These are not theoretical risks.
This lesson covers what is actually getting lawyers sanctioned, how to avoid it, and what the current state of disclosure rules looks like across federal and state courts.
What You'll Learn
- The exact failure patterns that lead to sanctions in 2026
- Standing orders and disclosure rules across federal and state courts
- A verification and disclosure compliance checklist
- How to recover if you discover a hallucination after filing
What Actually Gets Lawyers Sanctioned
The 2025 and early 2026 sanctions cases share a small number of fact patterns. Knowing them is half the battle.
Pattern 1: Fabricated case citations. The original Avianca case set the template in 2023, and it has been repeated dozens of times. A lawyer asks ChatGPT or a similar tool for cases on a topic; the tool invents plausible-sounding case names and citations; the lawyer files them without verifying. Sanctions follow.
Pattern 2: Misquoted real cases. A real case exists, but the quoted text does not appear in it, or the case stands for something different from what the brief claims. These misquotes are sanctionable too, even though the case is real.
Pattern 3: Boilerplate certifications. Some lawyers add boilerplate certifications that AI was not used or that everything was verified, while the actual filing contains hallucinations. Courts treat these false certifications as an aggravating factor.
Pattern 4: Failure to investigate after a warning. A judge questions a citation in court; the lawyer assures the court it is real; later it turns out to be hallucinated. The failure to investigate after being put on notice multiplies the sanction.
Pattern 5: Discovery responses with invented data. A 2026 trend: AI-drafted discovery responses citing facts or documents that do not exist in the record. These trigger Rule 26 and Rule 11 problems simultaneously.
Federal Court Disclosure Rules
The federal landscape in 2026 is patchwork. There is no uniform federal rule yet, but proposals are pending.
Standing orders. A growing number of federal district judges have standing orders that require either (a) certification that AI was not used, (b) certification that all AI-generated content was verified, or (c) disclosure of any AI use in the filing. Check the standing order for every judge before every filing. Westlaw and Bloomberg Law both maintain searchable databases of these orders.
Rule 11. Rule 11 has been the workhorse of AI sanctions. Signing and submitting a filing certifies that its factual contentions have evidentiary support and its legal contentions are warranted by existing law. Fabricated citations violate both prongs.
Rule 26(g). Discovery filings carry their own certification requirement. AI-generated interrogatory responses or document requests with invented citations or factual claims trigger Rule 26(g) sanctions.
Local rules. Many districts have added local rules in 2025 and 2026 requiring AI use disclosure or verification certification. Read your local rules. They change frequently.
State Court Rules
State court rules vary significantly. Highlights as of mid-2026:
- California. No statewide rule, but multiple state bar opinions require verification and discourage AI use without competence. Some county-level courts have local AI rules.
- New York. State bar guidance requires AI competence and verification. Several appellate divisions have rejected briefs with AI-generated false citations.
- Texas. State bar has published AI ethics guidance. Some federal districts in Texas have aggressive standing orders.
- Florida. State supreme court has issued ethics guidance on AI use.
The pattern across states: required competence, required verification, no blanket prohibitions. Disclosure requirements vary.
The UK Picture
For UK-based readers, the picture is also evolving. The Solicitors Regulation Authority and the Bar Standards Board have issued guidance emphasizing competence, supervision, and client confidentiality. UK courts have begun issuing warnings, and in some cases costs orders, against parties filing AI-hallucinated content. The Civil Procedure Rules do not yet have an explicit AI disclosure rule, but Practice Direction obligations to certify accuracy apply to AI-generated content the same as any other.
The practical effect for UK litigators is the same as for US litigators: verify everything, document your process, and disclose where required.
The Compliance Checklist
Here is a single checklist that, used consistently, would have prevented virtually every sanctions case from 2025 through mid-2026.
Before drafting.
- Confirm the AI tool you are about to use is in your firm's approved list.
- Confirm the matter sensitivity matches the tool tier.
- Note the tool, version, and date in your matter notes (this is your audit trail).
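The audit-trail note above can be as simple as an append-only log. As a minimal sketch (the `log_ai_use` helper, the field names, and the file name are all hypothetical, not a required schema or any firm's actual tooling), one JSON line per AI-assisted task is easy to write and easy to produce later if a court asks:

```python
import json
from datetime import date

# Hypothetical helper: append one audit-trail entry per AI-assisted task.
# The field names (date, tool, version, task) are illustrative only.
def log_ai_use(log_path, tool, version, task):
    entry = {
        "date": date.today().isoformat(),
        "tool": tool,
        "version": version,
        "task": task,
    }
    with open(log_path, "a", encoding="utf-8") as f:
        f.write(json.dumps(entry) + "\n")
    return entry

# Example entry for a matter's log file (file name is illustrative).
entry = log_ai_use(
    "matter-1234-ai-log.jsonl",
    tool="Westlaw Precision AI",
    version="2026.1",
    task="initial research memo",
)
```

A dated log like this is exactly the kind of contemporaneous record that supports a good-faith showing if a filing is later questioned.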
During drafting.
- Use grounded tools (Westlaw Precision AI, Lexis+ Protégé, Clearbrief) for anything involving authority.
- For any ungrounded LLM output, use [CITATION NEEDED] placeholders rather than asking for citations.
- Mark draft sections with comments showing which were AI-assisted.
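The [CITATION NEEDED] placeholder convention above only works if nothing slips through to the filed version. A small pre-filing scan can act as a mechanical gate; this sketch (the function name and the sample draft text are hypothetical) flags every line where a placeholder survives:

```python
import re

# Hypothetical pre-filing gate: flag any unverified-citation placeholders
# left in a draft. The marker matches the [CITATION NEEDED] convention above.
PLACEHOLDER = re.compile(r"\[CITATION NEEDED\]")

def unresolved_placeholders(draft_text):
    """Return (line_number, line) pairs for every leftover placeholder."""
    return [
        (n, line.strip())
        for n, line in enumerate(draft_text.splitlines(), start=1)
        if PLACEHOLDER.search(line)
    ]

draft = """The duty to verify is well established. [CITATION NEEDED]
Courts have sanctioned fabricated citations repeatedly. See Mata v. Avianca."""
flags = unresolved_placeholders(draft)
# flags -> [(1, 'The duty to verify is well established. [CITATION NEEDED]')]
```

If the list is non-empty, the draft is not ready to file; every flagged line needs a verified citation substituted by hand.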
Before filing.
- Verify every citation in Westlaw, Lexis, or Bloomberg Law. Click the link, open the case, confirm it exists.
- Verify every quotation against the cited source. Use the find function in the case to confirm verbatim wording.
- Verify every record cite. Open the cited document and confirm the page.
- Run a citation verification tool (Clearbrief) across the entire filing.
- Check the judge's standing order and local rules for any AI disclosure requirement.
- Read the entire filing yourself. Sign the certification only after the read.
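To support the citation-by-citation verification steps above, it can help to first build a worklist of every case name in the filing. This is a rough sketch under stated assumptions: the regex is deliberately crude and over-inclusive, it only catches "X v. Y" patterns, and it builds a list for manual checking in Westlaw or Lexis; it is not itself a verifier:

```python
import re

# Crude "X v. Y" case-name pattern: one or more capitalized words,
# "v.", then one or more capitalized words. Over-inclusive by design.
CASE_NAME = re.compile(
    r"\b[A-Z][\w.&'-]*(?:\s+[A-Z][\w.&'-]*)*\s+v\.\s+[A-Z][\w.&'-]*(?:\s+[A-Z][\w.&'-]*)*"
)

def citation_worklist(text):
    """Return a deduplicated, sorted list of apparent case names to verify by hand."""
    return sorted(set(CASE_NAME.findall(text)))

brief = "Plaintiff relies on Mata v. Avianca and Smith v. Jones for this point."
worklist = citation_worklist(brief)
# worklist -> ['Mata v. Avianca', 'Smith v. Jones']
```

Each item on the worklist still gets the full manual treatment: open the case in a grounded database, confirm it exists, and confirm it says what the brief claims.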
After filing.
- Preserve the audit trail: tool name, version, date, prompts, drafts.
- If a court raises a citation question, investigate immediately. Do not assure the court without checking.
When You Discover a Hallucination After Filing
If you discover after filing that your brief contains a hallucinated citation or quote, you have a small window to limit damage.
The right move:
- Stop and investigate. Confirm the problem. Identify how it got through your verification process.
- Notify opposing counsel. Professional courtesy and an early signal of good faith.
- File a notice of correction. Withdraw the offending citation. State plainly what happened and what process change has been made.
- Notify the court directly if appropriate. Some judges require this; some standing orders specifically address it.
- Document the corrective process. This becomes evidence of good faith if the court considers sanctions.
The lawyers who get the worst sanctions in 2026 are not the ones who made the original mistake. They are the ones who doubled down or failed to investigate. Self-correction within hours of discovery is the single most important factor in mitigation.
The Disclosure Conversation with Clients
Many clients in 2026 ask their lawyers explicitly: "Are you using AI on my matter?" The professional answer is yes, here is how, and here is how we control the risks.
A useful client-facing summary:
- We use specific legal-grade tools (name them) for research, document review, and drafting assistance.
- All AI-generated content is verified by a human attorney before any filing.
- We do not enter privileged or sensitive matter content into consumer chatbots.
- We maintain audit trails of our AI use.
- We comply with all court disclosure rules.
Clients who hear this answer almost universally become more comfortable, not less.
Key Takeaways
- The 2026 sanctions wave is large and growing; verify everything before filing.
- Five fact patterns drive most sanctions: fabricated citations, misquoted real cases, false boilerplate certifications, failure to investigate after notice, and AI-generated discovery responses.
- Federal and state rules are a patchwork; always check the judge's standing order and local rules.
- Use a 14-step compliance checklist on every filing, every time.
- If you discover a hallucination after filing, self-correct immediately and document the process. Self-correction is the single biggest mitigation factor.

