AI Ethics, HIPAA & PHI Safety for Dentists
This is the most important lesson in the course. The legal and ethical risks of using AI in a dental practice come down to one thing: protecting patient health information. Get this right and AI is a transformative tool. Get it wrong and you face HIPAA fines, board complaints, malpractice exposure, and damaged patient trust. The principles below are non-negotiable.
What You'll Learn
- The HIPAA framework as it applies to AI tools in a dental practice
- A practical de-identification checklist
- Which AI tools have signed BAAs and which have not
- A simple office policy you can adopt this week
- The ethical considerations beyond HIPAA — informed consent, AI-generated content disclosure, and patient communication boundaries
HIPAA in 90 Seconds
The Health Insurance Portability and Accountability Act (HIPAA) governs protected health information (PHI) — anything in a patient's record that, alone or combined with other data, identifies the patient. The 18 HIPAA identifiers include: name, address, email, phone, DOB, SSN, MRN, account number, full-face photos, and any other unique identifying number or code.
If you (a "covered entity") share PHI with a third party (a "business associate"), the business associate must sign a Business Associate Agreement (BAA) with you. The BAA legally obligates them to safeguard the PHI to HIPAA standards.
The implication for AI: consumer ChatGPT, Claude.ai, Gemini, and Perplexity have NOT signed a BAA with you. Pasting PHI into them is a HIPAA violation. There are no exceptions.
What "PHI" Actually Means in Daily Dental Use
Examples of PHI you must NOT paste into a non-BAA AI tool:
- "Mary Smith, DOB 4/12/1962, MRN 8745..."
- "John Garcia at 245 Maple St..."
- "[patient phone] 555-7843"
- A photo of the patient's face from your intraoral camera
- A radiograph file with embedded patient metadata
- A treatment plan PDF with the patient's name in the header
- An insurance card image
- A medical history form with the patient's identifying details
Examples of de-identified information that IS safe to paste:
- "A 58-year-old female patient presents with..."
- "Patient A, no PMH of significance, on metformin and lisinopril..."
- "Tooth #19, recurrent decay under MOD amalgam, distolingual cusp undermined..."
- A radiograph image stripped of metadata (and the patient is not visually identifiable in it — a bitewing typically is not)
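Stripping metadata can be automated rather than done by hand. Below is a minimal sketch using the Pillow imaging library (an assumption — your imaging software may have its own export-without-metadata option, and true DICOM radiographs need a DICOM-aware tool such as pydicom instead):

```python
from PIL import Image

def strip_image_metadata(src_path: str, dst_path: str) -> None:
    """Re-save an image keeping only pixel data, dropping EXIF and other metadata."""
    img = Image.open(src_path)
    clean = Image.new(img.mode, img.size)  # a fresh image carries no metadata
    clean.putdata(list(img.getdata()))     # copy the pixels only
    clean.save(dst_path)

# Usage (hypothetical file names):
# strip_image_metadata("bitewing_export.jpg", "bitewing_clean.jpg")
```

Note what this does and does not do: it removes embedded metadata, but it cannot judge whether the image itself is identifying — that remains your call under the checklist below.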
A Practical De-Identification Checklist
Before you hit paste, run this checklist (literally — print it and tape it next to the AI-using workstation):
- ☐ No name in the text
- ☐ No DOB; if age is stated, no age over 89 (HIPAA Safe Harbor requires ages 90+ to be aggregated into "90 or older")
- ☐ No MRN, account number, SSN
- ☐ No address; no ZIP code beyond the first 3 digits (and only when that 3-digit area contains more than 20,000 people); no email, no phone
- ☐ No identifiable photo
- ☐ No combination of pieces that uniquely identifies (e.g., "the only patient in town on this rare medication")
- ☐ No insurance subscriber ID
- ☐ No exact dates (year alone is acceptable under Safe Harbor; avoid month and day)
If all eight boxes are ticked, you may paste into your AI tool.
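For teams that paste often, a tiny pre-flight script can catch the most mechanical slips before text leaves the office. A minimal sketch in Python — the patterns below are illustrative assumptions to tune for your own data, and this is a guardrail on top of the checklist, not a substitute for it (names and free-text identifiers will sail right past a regex):

```python
import re

# Illustrative patterns only — a regex can flag obvious PHI, never prove its absence.
PHI_PATTERNS = {
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "phone": re.compile(r"\b(?:\d{3}[-.])?\d{3}[-.]\d{4}\b"),
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.\w{2,}\b"),
    "exact date": re.compile(r"\b\d{1,2}/\d{1,2}/\d{2,4}\b"),
    "ZIP+4": re.compile(r"\b\d{5}-\d{4}\b"),
}

def flag_possible_phi(text: str) -> list[str]:
    """Names of patterns that matched; empty list means 'nothing obvious', NOT 'safe'."""
    return [name for name, pat in PHI_PATTERNS.items() if pat.search(text)]
```

A match means stop and de-identify; an empty result still means run the printed checklist.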
Tools That Have BAA Options
The list of AI tools with HIPAA-compliant BAA options is growing. As of 2026, the following typically offer BAA arrangements:
- ChatGPT Enterprise (OpenAI) — verify with OpenAI sales
- ChatGPT Team / Business — BAA may be available; verify with OpenAI
- Claude for Enterprise (Anthropic) — verify with Anthropic for healthcare customers
- Microsoft Copilot for Healthcare and Life Sciences — included in many Microsoft 365 healthcare plans with BAA
- Google Workspace with Healthcare Industry Plan — BAA available
- Dental-specific AI tools (Bola AI, ClickNotes, AutoChartAI, Pearl, Overjet, VideaHealth, Diagnocat, Modento, Dental Intelligence, Weave, etc.) — many sign BAAs by default; always confirm in writing
Always verify in writing. A vendor saying "we are HIPAA-compliant" in a sales conversation is not the same as you holding a signed BAA. Insist on the document.
A Simple Office Policy You Can Adopt This Week
Print this. Sign it. Have every team member sign it. Keep it in the office handbook.
AI Use Policy — [Practice Name]
- Approved AI tools for our office are: [list — e.g., ChatGPT Plus for de-identified work, Bola AI under signed BAA for clinical voice-to-SOAP, etc.].
- No team member shall paste any patient-identifying information into any AI tool that is not on the approved-with-BAA list.
- Before pasting any clinical or patient-related information into any AI tool, the team member shall run the de-identification checklist.
- AI-generated chart notes, narratives, and patient communications must be reviewed by a licensed clinician (for clinical content) or designated team member (for non-clinical) before being signed, sent, or filed.
- The office maintains a list of signed BAAs with AI vendors at: [location].
- Violations of this policy will be documented and addressed per the office's progressive discipline policy.
- This policy is reviewed annually.
Signed: [Owner], [Date]
Ethical Considerations Beyond HIPAA
Patient consent for AI-assisted documentation. Best practice is a one-line addition to your standard patient intake disclosure: "We use AI tools to help draft documentation and communications. All clinical content is reviewed and signed by your dentist." You don't need patient consent to use AI on de-identified data, but transparency builds trust.
AI-generated content in marketing. Patients increasingly recognize and resent obvious AI-written content. Edit AI drafts. Add the personal touches. Never publish raw AI output as if it were a real human voice.
Clinical responsibility. AI can draft a SOAP note, but the dentist signs the chart. AI can draft a narrative, but the dentist is responsible for the accuracy of the claim. AI can draft a treatment plan explanation, but the dentist is responsible for the recommendation and the informed consent.
Patient relationship. Do not let AI replace warmth at the chair. The 30-second human conversation with the patient before the procedure is more healing than a 5-page handout.
What to Do If You've Already Made a Mistake
If you (or a team member) realize that PHI was pasted into a non-BAA AI tool:
- Do not panic. Document what was pasted and when, and to which tool.
- Notify the practice owner / privacy officer immediately.
- Consult your practice attorney or an HHS HIPAA resource on whether the disclosure rises to the level of a reportable breach. Some minor incidents involving small amounts of PHI sent to a non-BAA tool may only need to be logged internally rather than reported externally — but that is a legal judgment, not one to make on your own.
- Update the office policy and retrain.
- Move forward more carefully.
A Final Word
AI is a profoundly useful tool for dentists. The dentists who win with it are the ones who treat the HIPAA boundary as inviolable from day one. There is no shortcut. Every prompt, every paste, every workflow runs through the de-identification check or the BAA-signed-tool gate. Build the habit early. The rest of the gains follow.
Key Takeaways
- Consumer ChatGPT, Claude.ai, Gemini, and Perplexity have NOT signed BAAs — never paste PHI into them
- Use the 8-point de-identification checklist before every paste
- Maintain a written list of signed BAAs with AI vendors and a written office AI policy
- For PHI workflows, use ChatGPT Enterprise/Business, Claude for Enterprise, Microsoft Copilot Healthcare, or BAA-signed dental-specific tools — always verify in writing
- Be transparent with patients, edit AI drafts to keep them human, and remember that clinical responsibility stays with the dentist

