Data Security and Confidentiality: What NOT to Put in AI Tools
This is possibly the most important lesson in this course. The productivity gains from AI are real — but so are the risks of handling confidential financial data incorrectly. Getting this wrong can breach client confidentiality, violate GDPR, expose trade secrets, or create regulatory liability.
The Core Risk
When you type information into a public AI tool like ChatGPT or Claude, that information:
- Is transmitted to the AI provider's servers
- May be used to improve the model (in some configurations)
- Is subject to the provider's data retention policies
- Could theoretically be accessed if the provider experienced a security breach
Most consumer-facing AI tools are not designed to meet enterprise data security requirements by default.
What You Should NEVER Put Into Public AI Tools
Absolute no-list:
- Customer or client financial data with identifying information
- Unpublished financial results (material non-public information — MNPI)
- Individual employee salary or compensation data
- Supplier pricing terms that are confidential
- M&A information, deal details, or acquisition targets
- Personally identifiable information (PII) covered by GDPR
- Bank account details or payment information
- Passwords, API keys, or access credentials
- Confidential investor information
The Anonymisation Approach
You can often get the AI assistance you need while protecting confidentiality: anonymise the data first.
Instead of:
"Our revenue for Q3 was £12.4m, up from £10.8m last year..."
(This reveals your actual company performance)
Use:
"A company's revenue for Q3 was £X, up from £Y last year..."
(Replace actual figures with placeholders if the specific numbers are sensitive)
For client data:
Replace client names with generic descriptions ("a retail client," "Client A"). Replace specific figures with approximate ranges if the precise numbers aren't needed for your question.
For employee data:
Never include names, salaries, or personal details. Describe scenarios generically: "an employee earning in the £60k-£80k range" rather than naming the person.
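The substitution steps above can be partly automated. The sketch below is illustrative only: the regex for monetary amounts and the client-name lookup are assumptions you would need to extend for your own data, and no automated pass replaces a human review.

```python
import re

# Hypothetical lookup of names to scrub; maintain your own list.
KNOWN_NAMES = ["Acme Ltd", "Jane Smith"]

def anonymise(text: str) -> str:
    """Replace sensitive figures and names with placeholders before
    pasting text into a public AI tool. Patterns are illustrative,
    not exhaustive."""
    # Mask monetary amounts such as "£12.4m", "£10,800", or "£3bn"
    text = re.sub(r"£[\d,.]+(?:m|k|bn)?", "£X", text)
    # Replace known client or employee names with a generic label
    for name in KNOWN_NAMES:
        text = text.replace(name, "Client A")
    return text

print(anonymise("Acme Ltd's revenue for Q3 was £12.4m, up from £10.8m last year"))
```

Note that every amount collapses to the same placeholder here; if your question depends on the relationship between figures, substitute distinct placeholders (£X, £Y) by hand instead.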
Enterprise AI Tools: The Safer Alternative
Several AI providers offer enterprise versions with stronger data protections:
ChatGPT Team/Enterprise: Data is not used for model training by default. Zero data retention option available.
Claude Team/Enterprise: Similar protections, with data excluded from model training.
Microsoft Copilot for Microsoft 365: Data stays within your Microsoft 365 tenant and is subject to your organisation's security policies.
Google Workspace AI Features: Similarly governed by your organisation's Google Workspace agreement.
If your organisation has deployed one of these enterprise tools, prefer them for work with sensitive data.
Check Your Company Policy First
Before using any AI tool for work, find out:
- Does your company have an approved list of AI tools for work use?
- Are there categories of data that are explicitly prohibited?
- Is there an enterprise version of an AI tool available to you?
- Who should you contact if you're unsure?
Many large organisations and financial services firms have strict policies on this. Using a non-approved tool with client data could be a disciplinary or regulatory matter.
Red Flags in Your Own Workflow
Ask yourself these questions:
- "Would I be comfortable if my client saw exactly what I pasted into this tool?"
- "Would I be comfortable if this prompt appeared in a data breach?"
- "Is this information material and non-public?"
- "Does this information identify a specific individual?"
If the answer to either of the first two questions is no, or to either of the last two is yes, anonymise first or don't use AI for that specific task.
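Part of this check can be automated as a last line of defence. The sketch below is a hedged assumption, not an official tool: the pattern names and regexes are illustrative, and an empty result does not prove a prompt is safe, since no regex can detect MNPI or context-dependent confidentiality.

```python
import re

# Illustrative red-flag patterns; extend for your organisation's data.
RED_FLAGS = {
    "email address": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "UK sort code": re.compile(r"\b\d{2}-\d{2}-\d{2}\b"),
    "possible API key": re.compile(r"\b(?:sk|api|key)[-_][A-Za-z0-9]{16,}\b"),
}

def red_flags(prompt: str) -> list[str]:
    """Return the names of any red-flag patterns found in the prompt.
    Catches only obvious, machine-detectable identifiers."""
    return [name for name, pattern in RED_FLAGS.items() if pattern.search(prompt)]

print(red_flags("Pay into 12-34-56 and email jane@example.com"))
```

A hook like this could sit in an internal tool between the user and the AI provider, blocking or warning before anything is transmitted.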
The Practical Rule
A simple way to think about it: If the data would require special handling in an email or document (encryption, NDA, restricted distribution), apply the same caution before putting it into an AI tool.
Your Turn
Review the last 5 prompts you've sent to an AI tool (or imagine the type of prompts you'd typically send). For each one, ask: was there confidential information that I should have anonymised? Build this check into your workflow.