# Privacy & What Not to Share
ChatGPT is a powerful tool, but it's important to understand what happens with your data and what information you should keep private. This lesson covers privacy best practices for safe and responsible use.
## Understanding ChatGPT's Data Handling

### What Happens to Your Conversations
By default, OpenAI may use your conversations to:
- Improve model performance
- Train future models
- Analyze for safety and abuse prevention
Important controls:
- You can opt out of data training in settings
- ChatGPT Team and Enterprise plans have stronger privacy by default
- Temporary chat mode prevents saving to your history
### Where Your Data Goes
| Plan | Training Data Usage | History Retention |
|---|---|---|
| Free | May be used (can opt out) | 30 days minimum |
| Plus | May be used (can opt out) | 30 days minimum |
| Team | Not used for training | Controlled by admin |
| Enterprise | Not used for training | Full control |
## What Never to Share

### Personal Identifiers
Never share:
- Social Security numbers
- Government ID numbers
- Passport numbers
- Driver's license numbers
- Credit card numbers
- Bank account numbers
Even if you need help with a financial calculation, use placeholder numbers:
Bad: "Help me calculate taxes on my SSN 123-45-6789"
Good: "Help me understand how to calculate self-employment tax on $75,000 income"
### Authentication Credentials

Never share:
- Passwords
- API keys
- Access tokens
- Private encryption keys
- Two-factor authentication codes
- Security questions and answers
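If you want help with code that touches a secret, strip the secret before pasting. A minimal sketch of the pattern (the environment-variable name is made up for illustration):

```python
import os

# Never paste this into ChatGPT:
#   api_key = "sk-live-4f9a..."        # real secret hard-coded in source

# Paste this instead: the secret lives in the environment, so the code
# you share contains no credential at all.
api_key = os.environ.get("EXAMPLE_API_KEY", "YOUR_API_KEY_HERE")

headers = {"Authorization": f"Bearer {api_key}"}
```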
### Sensitive Business Information

Be cautious with:
- Trade secrets
- Unreleased product details
- Confidential strategies
- Internal financial data
- M&A discussions
- Legal matters in progress
### Private Information About Others

Don't share:
- Other people's personal details without consent
- Private conversations
- Health information about others
- Identifying details about clients or customers
### Regulated Data
If you work with protected data:
- HIPAA - Protected health information
- FERPA - Student education records
- GDPR - EU personal data
- PCI DSS - Payment card data
- SOC 2 - Customer data under audit controls
Check with your organization before using ChatGPT with any regulated data.
## Safe Alternatives

### Use Anonymized Data

Instead of real data, strip or replace anything identifying before you share it.
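A minimal sketch of the idea in Python. The field names are invented for illustration, and note that hashing is pseudonymization rather than true anonymization, but it keeps raw identifiers out of the chat:

```python
import hashlib

def anonymize(record: dict) -> dict:
    """Replace identifying fields with non-reversible stand-ins."""
    anon = dict(record)
    # Hash the name so rows stay distinguishable without naming anyone.
    anon["name"] = "user_" + hashlib.sha256(record["name"].encode()).hexdigest()[:8]
    anon.pop("email", None)                  # drop fields you don't need
    anon["zip"] = record["zip"][:3] + "XX"   # coarsen quasi-identifiers
    return anon

row = {"name": "Jane Doe", "email": "jane@example.com", "zip": "94110", "spend": 420}
print(anonymize(row))
# {'name': 'user_<hash>', 'zip': '941XX', 'spend': 420}
```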
### Use Placeholders

For examples and templates, swap real values for obvious stand-ins:

**Bad:** "Write a payment reminder to our client Jane Doe at jane.doe@acmecorp.com"

**Good:** "Write a payment reminder template addressed to [CLIENT NAME] at [CLIENT EMAIL]"
### Describe Instead of Sharing

Instead of pasting sensitive content, describe its shape and ask for general guidance:

"I have a two-page settlement letter that needs a firmer closing paragraph. Without seeing the letter, what makes the closing paragraph of a formal legal letter effective?"
### Use Representative Examples

Instead of real data:

"I have a dataset with customer ages (25-65), income levels ($30K-$150K), and purchase history. Here's a synthetic sample that represents the patterns: [made-up but realistic data]"
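If you need more than a sentence or two of sample data, generate a synthetic set rather than sampling real records. A quick sketch (the column names and ranges simply mirror the description above):

```python
import random

random.seed(7)  # reproducible fake data

# Synthetic rows that mimic the real dataset's shape, not its contents.
synthetic = [
    {
        "age": random.randint(25, 65),
        "income": random.randrange(30_000, 150_001, 1_000),
        "purchases_last_year": random.randint(0, 24),
    }
    for _ in range(5)
]

for row in synthetic:
    print(row)
```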
## Privacy Settings and Controls

### Opting Out of Training
- Go to Settings
- Navigate to "Data controls"
- Toggle "Improve the model for everyone" off
Note: This prevents your conversations from being used to train future models, but they may still be reviewed for safety.
### Using Temporary Chat
For sensitive one-off conversations:
- Start a new chat
- Enable "Temporary chat" mode
- Conversation won't be saved to your history
### Managing Chat History
- Delete individual conversations (click ... and delete)
- Use Settings > Data controls to clear all history
- Export your data if you want a backup
### Account Security
Protect your ChatGPT account:
- Use a strong, unique password
- Enable two-factor authentication if available
- Don't share your login credentials
- Log out on shared devices
## Organizational Considerations

### Personal Account vs. Work Account
Consider whether personal ChatGPT accounts are appropriate for work tasks:
| Use Case | Personal Account | Work Account |
|---|---|---|
| Learning and practice | Yes | Yes |
| Personal projects | Yes | Yes |
| Client work | Maybe with care | Better option |
| Confidential business | No | With policies |
| Regulated data | No | Only with approval |
### Creating Usage Policies

If you're setting policies for your organization, cover at least:
- Which AI tools, plans, and account types are approved
- What data classifications may and may not be shared
- How to request exceptions and report incidents
- Training, so people understand the reasons behind the rules
## Privacy Checklist Before Sharing
Before pasting anything into ChatGPT, ask:
- Does this contain personal identifiers? (names, SSN, etc.)
- Are there authentication secrets? (passwords, API keys)
- Is this confidential business information?
- Does this involve other people's private information?
- Is this regulated data? (HIPAA, GDPR, etc.)
- Would I be comfortable if this were public?
If you answer "yes" to any of these, anonymize, generalize, or don't share. A simple automated pre-check, like the sketch below, can also catch the most obvious slips.
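Here is a minimal pre-check sketch in Python. The patterns are illustrative and nowhere near exhaustive (a dedicated secret scanner or DLP tool catches far more), but even this flags careless pastes:

```python
import re

# Illustrative patterns only; a real pre-check should cover many more cases.
PATTERNS = {
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "credit card number": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
    "email address": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "possible API key": re.compile(r"\b(?:sk|pk|key|token)[-_][A-Za-z0-9]{16,}\b"),
}

def precheck(text: str) -> list[str]:
    """Return a warning for each pattern that looks sensitive."""
    return [f"Possible {label} found" for label, rx in PATTERNS.items()
            if rx.search(text)]

print(precheck("Reach me at jane@example.com, SSN 123-45-6789."))
# ['Possible SSN found', 'Possible email address found']
```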
## When You've Made a Mistake
If you accidentally shared sensitive information:
- Delete the conversation immediately
- Change compromised credentials (passwords, API keys)
- Report to relevant parties (security team, affected individuals)
- Learn from it - Set up better habits
## Exercise: Practice Privacy-Safe Prompting

Rewrite this unsafe prompt to be privacy-safe:

"My employee John Smith (SSN 123-45-6789, salary $85,000) has missed three deadlines this quarter. Draft a written performance warning for him."

(Hint: it contains a personal identifier, confidential compensation data, and private information about another person.)
## Key Takeaways
- Never share personal identifiers, credentials, or regulated data
- Anonymize data when you need help with real scenarios
- Use placeholders instead of actual sensitive values
- Know your settings - Opt out of training if desired
- Consider your context - Personal vs work accounts matter
- When in doubt, don't share - Ask yourself if you'd be okay with it being public
Privacy-aware usage protects you, your organization, and others. Make it a habit to pause and think before pasting anything sensitive.

