Data Privacy, Compliance, and Client PII in Advisory AI
You work in one of the most heavily regulated corners of the economy, and you hold some of the most sensitive data your clients possess: Social Security numbers, account numbers, balances, health and family situations, estate plans. Using AI well means being deliberate about what you put into it. This lesson gives you the rules and the habits.
What You'll Learn
- What counts as client PII and confidential information you must protect
- How consumer AI tools handle your data, and what "enterprise" tiers change
- A practical framework for anonymizing prompts
- The compliance obligations that apply to AI-assisted work in an advisory firm
What You're Protecting
In an advisory practice, "sensitive" is broader than just account numbers. Treat all of the following as confidential:
- Direct identifiers: name, address, phone, email, Social Security or tax ID, date of birth
- Account data: account numbers, custodian, balances, holdings, transaction history
- Financial situation: income, net worth, debts, business interests, compensation
- Life details: marital status, divorce, health conditions, disabilities, children's situations, inheritances, anything shared in confidence during planning conversations
- Internal information: your firm's models, fee schedules, compliance notes, and business plans
You are bound by privacy rules (in the US, Regulation S-P and your firm's privacy policy, plus state laws), your client agreements, and your fiduciary duty of confidentiality. Pasting any of the above into a tool your firm has not vetted is the AI-era version of leaving a client file open on a hotel-lobby table.
How Consumer AI Tools Handle Your Data
When you use the standard, free, or low-cost individual versions of ChatGPT, Claude, or Gemini, your prompts are sent to the provider's servers to generate a response. Historically, some providers used consumer conversations to help improve their models unless you opted out; policies vary and change, and many now offer settings to disable training on your chats. The safe assumption: anything you type into a consumer AI tool may be retained and could, in some configuration, be used to improve the service.
That changes with enterprise and business tiers — ChatGPT Enterprise/Team, Claude for Work/Team, Gemini in Google Workspace, Microsoft 365 Copilot. These typically come with contractual commitments that your data is not used for training, plus admin controls, retention settings, and sometimes the ability to keep data within your organization's tenant. Many broker-dealers and larger RIAs have signed agreements like these and have an approved tool. If your firm has one, use it — and only it — for anything touching client information. If your firm has not addressed AI yet, raise it with compliance before you put client data anywhere.
The Anonymization Habit
For everyday drafting on a consumer tool, the simplest safe practice is to strip identifiers before you paste and add them back after. The AI does not need to know who the client is to do good work.
Before (do not do this):
Draft a follow-up email for John Smith, SSN 123-45-6789, account #4456-7781 at Schwab, $1.43M balance, who lives at 14 Elm St...
After (do this):
Draft a follow-up email for a retired client, age 67, roughly $1.4M across IRA and taxable accounts, conservative allocation, who wants to increase monthly withdrawals from $5,000 to $6,000. We discussed the impact on plan longevity and agreed to revisit in six months.
Same useful output, none of the exposure. Then you paste the draft into your email and add the client's name yourself.
A quick checklist before you hit enter on any prompt:
- No names, addresses, phone numbers, or emails
- No Social Security or tax ID numbers
- No account numbers or custodian-specific identifiers
- Round or generalize dollar figures ("about $1.4M," not "$1,432,118.04")
- Generalize anything that could re-identify someone (a niche profession in a small town, a specific employer plus a specific role)
- For documents: redact the cover page and any account numbers before uploading
When in doubt, don't paste it. The time you save is not worth a privacy incident.
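If your firm builds internal tooling, the checklist above can be partially automated as a pre-paste screen. The sketch below is a minimal, hypothetical example in Python: it redacts US-format SSNs, account numbers, phone numbers, and emails with simple patterns, and rounds exact dollar figures. The patterns are illustrative assumptions, not a vetted redaction product, and a script like this supplements human review rather than replacing it (it will not catch names, addresses, or re-identifying context).

```python
import re

def scrub(prompt: str) -> str:
    """Best-effort redaction of common US identifiers before a prompt
    leaves your machine. A screen, not a guarantee: review the output."""
    # Social Security numbers (123-45-6789)
    prompt = re.sub(r"\b\d{3}-\d{2}-\d{4}\b", "[SSN]", prompt)
    # Hyphenated account numbers (4456-7781) and long digit runs
    prompt = re.sub(r"\b\d{4}-\d{4}\b", "[ACCT]", prompt)
    prompt = re.sub(r"\b\d{7,}\b", "[ACCT]", prompt)
    # Phone numbers (555-123-4567)
    prompt = re.sub(r"\b\d{3}[-.]\d{3}[-.]\d{4}\b", "[PHONE]", prompt)
    # Email addresses
    prompt = re.sub(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b", "[EMAIL]", prompt)

    # Round exact dollar figures ("$1,432,118.04" -> "about $1.4M")
    def round_dollars(m: re.Match) -> str:
        amount = float(m.group(1).replace(",", ""))
        if amount >= 1e6:
            return f"about ${amount / 1e6:.1f}M"
        return f"about ${round(amount, -3):,.0f}"

    prompt = re.sub(r"\$([\d,]+(?:\.\d{2})?)", round_dollars, prompt)
    return prompt
```

Even with a script like this, spot-check every prompt before sending: pattern matching misses names, employers, and the small-town details that re-identify a client.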
Compliance Obligations Don't Disappear Because "AI Did It"
Regulators have been explicit: AI-assisted work is your work. A few principles to operate by:
- Supervision applies. An AI draft is like a draft from a junior employee — it must be reviewed and approved by someone responsible before it goes out. Your firm's supervisory procedures should cover AI use.
- Books-and-records rules still apply. If AI helped produce a client communication, the final version you sent must be retained like any other client communication. Many firms also want a record of material AI involvement.
- Marketing rules still apply. The SEC Marketing Rule governs advisor advertising regardless of how it was drafted. And "AI washing" — overstating your use of AI, or claiming AI-driven results you can't support — has drawn enforcement. Describe your AI use accurately or not at all.
- No giving the AI the wheel. AI does not make suitability or fiduciary determinations, does not select recommendations, and is not a substitute for licensed judgment. It assists; you decide and you sign.
- Vendor diligence. If you adopt an AI-powered tool (a notetaker, a planning assistant), that's a vendor relationship — review its data handling, security, and contractual terms the way you would any other.
A good rule of thumb: before using AI for a task, ask "would I be comfortable explaining exactly how I used it to my compliance officer and, if needed, an examiner?" If yes, proceed. If you hesitate, that's your answer.
Key Takeaways
- Protect a wide circle of client information — identifiers, account data, financial details, life circumstances, and your firm's internal data — not just account numbers.
- Assume anything typed into a consumer AI tool may be retained; enterprise/business tiers typically come with the contractual data protections regulated firms need. Use your firm's approved tool for anything touching client data, and if there isn't one, ask compliance first.
- Build the anonymization habit: strip names, SSNs, and account numbers, round dollar figures, generalize re-identifying details, and add specifics back yourself afterward.
- Compliance obligations — supervision, recordkeeping, the Marketing Rule, fiduciary judgment, vendor diligence — apply fully to AI-assisted work. "The AI did it" is not a defense.

