The Short Answer
No consumer AI product is HIPAA compliant. Not ChatGPT Plus. Not Claude Pro. Not Gemini. Not the free version of anything.
HIPAA compliance for AI tools requires two things: a signed Business Associate Agreement (BAA) with the vendor, and an implementation that meets HIPAA's administrative, physical, and technical safeguard requirements. Consumer AI products offer neither.
If you're a healthcare provider, dental practice, or any organization that handles Protected Health Information (PHI), you need to be on an Enterprise tier or a dedicated healthcare platform. Here's exactly what's available today.
HIPAA-Eligible AI Platforms
These platforms offer BAAs and have the compliance infrastructure to support HIPAA-covered use cases:
At a glance: four major platforms offer BAAs today; zero-data-retention (ZDR) options are available on Azure and Vertex; and a signed BAA is required before any PHI touches any AI system.
Azure OpenAI Service
Microsoft's Azure OpenAI Service is covered under the Microsoft BAA. You get access to GPT-4, GPT-4o, and other models through Azure's compliance framework. Key features for healthcare:
- BAA available as part of Microsoft's standard enterprise agreements
- Zero-data-retention option (prompts and completions are not stored)
- SOC 1/2/3, ISO 27001, HITRUST certified
- Virtual network isolation and customer-managed encryption keys
- Audit logging for all API calls
Azure OpenAI is probably the most mature HIPAA-eligible AI platform today, mostly because it inherits Azure's deep healthcare compliance infrastructure. If your practice already runs Microsoft 365, this is the natural starting point.
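As a concrete starting point, here's what a raw request to the Azure OpenAI chat completions endpoint looks like, built with Python's standard library only. The resource name, deployment name, and API version are placeholders — substitute your own, and pull the key from a secrets store rather than hard-coding it:

```python
import json
import urllib.request

# Placeholder values -- substitute your own Azure resource, deployment,
# and API version; none of these are real endpoints.
RESOURCE = "my-practice-resource"   # hypothetical Azure OpenAI resource name
DEPLOYMENT = "gpt-4o"               # hypothetical deployment name
API_VERSION = "2024-02-01"          # check current Azure docs for the version you need

url = (
    f"https://{RESOURCE}.openai.azure.com/openai/deployments/"
    f"{DEPLOYMENT}/chat/completions?api-version={API_VERSION}"
)

payload = {
    "messages": [
        {"role": "system", "content": "You draft patient-friendly explanations."},
        # Only de-identified text belongs here, even with a BAA in place.
        {"role": "user", "content": "Explain a routine crown procedure in plain language."},
    ],
}

request = urllib.request.Request(
    url,
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Content-Type": "application/json",
        "api-key": "<your-key-from-key-vault>",  # never hard-code keys in production
    },
    method="POST",
)
# urllib.request.urlopen(request) would actually send it; omitted here
# so the sketch stays self-contained.
```

The same payload shape works through the official `openai` SDK's `AzureOpenAI` client; the point is that the endpoint lives on your own Azure resource, inside your compliance boundary, not on api.openai.com.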
Google Vertex AI
Google Cloud's Vertex AI platform is covered under Google Cloud's BAA. You get Gemini models and other AI capabilities within Google Cloud's compliance boundary. See our Gemini for Business guide for the full breakdown.
- BAA available through Google Cloud agreements
- HIPAA, HITRUST, SOC 2 certified
- Zero-data-retention configuration available
- Customer-managed encryption (CMEK)
- VPC Service Controls for network isolation
Amazon Bedrock
AWS's managed AI service gives you access to multiple foundation models (Claude, Llama, Titan, and others) through AWS's compliance framework.
- BAA available through AWS Business Associate Addendum
- HIPAA eligible when configured correctly
- Data is not used for model training
- PrivateLink support for network isolation
- AWS CloudTrail audit logging
ChatGPT Enterprise
OpenAI's Enterprise tier offers a BAA for HIPAA-covered use cases. This is the only ChatGPT tier with a BAA. Not Plus, not Team. See our ChatGPT for Business guide for the full tier comparison.
- BAA available (must be requested and signed separately)
- Data excluded from model training
- SOC 2 Type II certified
- SSO/SCIM for identity management
- Dedicated instance option for data isolation
What a BAA Actually Covers
A Business Associate Agreement is a legal contract between a healthcare provider (the "covered entity") and a vendor that handles PHI on their behalf (the "business associate"). In plain terms:
What the Vendor Agrees To
Business Associate obligations
- ✓ Protect PHI per HIPAA standards: encryption, access controls, breach notification, training
- ✓ Use PHI only for contracted services
- ✓ Share liability: vendor is on the hook for breaches
- ✓ Notify you of a breach without unreasonable delay (HIPAA caps it at 60 days from discovery)
Without a BAA
What you're risking
- ✗ Processing PHI through any AI tool is a HIPAA violation, full stop
- ✗ No legal framework governing how vendor handles patient data
- ✗ No breach notification requirements
- ✗ No financial liability for the vendor if data is exposed
Common HIPAA AI Mistakes
These are real scenarios we've seen in healthcare practices. Most are outright HIPAA violations; all of them put patient data or patient safety at risk:
REAL VIOLATIONS WE'VE SEEN
Pasting patient names into free ChatGPT: "Summarize this patient's treatment history." The patient's name, diagnosis, and treatment details have now left your control and, under consumer terms, may be used for model training.
Uploading medical records to AI tools: Dragging a PDF of lab results into Claude or ChatGPT to get a summary. Even on paid tiers without a BAA, this is non-compliant.
Using AI-generated clinical responses without review: AI can hallucinate medical information. Every AI-generated response involving patient care must be reviewed by a qualified professional.
Sharing patient data in AI-generated emails: Using AI to draft patient communications and including PHI in the prompt.
Screenshot sharing: Taking screenshots of AI conversations that contain PHI and sharing them via text or non-encrypted channels.
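A cheap guardrail against the first two mistakes is a client-side screen that refuses to forward a prompt when it spots likely identifiers. The patterns below are illustrative only — a short regex list is nowhere near real de-identification and will miss most PHI — but even a naive screen catches obvious copy-paste accidents:

```python
import re

# Illustrative patterns only -- a real PHI filter needs far more than regexes
# (named-entity recognition, dictionaries, human review).
PHI_PATTERNS = [
    re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),                # SSN-style number
    re.compile(r"\b(?:DOB|date of birth)\b", re.I),      # date-of-birth label
    re.compile(r"\bMRN[:#]?\s*\d+", re.I),               # medical-record-number label
    re.compile(r"\b(?:patient|pt\.?)\s+[A-Z][a-z]+\b"),  # "patient Smith"-style name
]

def looks_like_phi(prompt: str) -> bool:
    """Return True if the prompt trips any of the naive PHI patterns."""
    return any(p.search(prompt) for p in PHI_PATTERNS)

def guarded_send(prompt: str) -> str:
    """Block obviously risky prompts before they reach a non-BAA AI tool."""
    if looks_like_phi(prompt):
        raise ValueError("Possible PHI detected -- route to a BAA-covered platform.")
    return prompt  # in practice, hand off to your AI client here
```

A filter like this belongs on top of, not instead of, a BAA-covered platform — it reduces accidents; it doesn't create compliance.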
Implementation Checklist
If you've decided to bring AI into a healthcare setting, here's the process we recommend:
1. Select a HIPAA-eligible platform. Choose from the platforms listed above; don't try to make a consumer product work.
2. Execute a BAA. Sign the Business Associate Agreement with the vendor before any PHI touches the system.
3. Conduct a risk assessment. Document what data will be processed, by whom, and for what purpose. Cover AI-specific use cases.
4. Configure the platform for HIPAA. Enable audit logging, encryption at rest and in transit, access controls, and data retention policies.
5. Create an AI acceptable use policy. Define what can and cannot be entered into AI tools. See our AI Policy Template.
6. Train all staff and document it. HIPAA requires evidence that training occurred. Cover AI-specific scenarios.
7. Implement monitoring. Review audit logs regularly for unauthorized access or misuse.
8. Review quarterly. AI capabilities and compliance requirements change fast. Your policy and implementation should keep pace.
Industry Spotlight: Dental Practices
Dental practices are one of our core client segments at Gladiator IT, so we'll speak directly to that context.
Most dental practices are small businesses: 5-30 employees, one to three locations, running on a mix of dental practice management software (Dentrix, Eaglesoft, Open Dental) and general business tools (Microsoft 365 or Google Workspace).
The practical recommendation for most dental practices: use a Team-tier AI tool for general business tasks with de-identified data. If you need AI that touches patient information, invest in a BAA-covered platform.
The AI use cases we see most in dental:
Safe Without BAA
Use de-identified data on Team tier
- ✓ Treatment plan explanations: draft patient-friendly procedure descriptions
- ✓ Front desk templates: appointment confirmations, recall reminders, onboarding
- ✓ General business tasks: marketing copy, internal communications
Requires BAA Platform
Any use involving patient identifiers
- ✓ Insurance claim drafting: narratives with patient identifiers
- ✓ Clinical note summarization: patient records for referrals or case reviews
- ✓ Personalized patient comms: automated follow-ups with patient context
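For the "Safe Without BAA" column above, here's a toy scrubber showing what "de-identified" means in practice. Real de-identification under HIPAA's Safe Harbor method requires removing all 18 identifier categories; these two regex rules are only a sketch of the concept:

```python
import re

# Toy scrubber -- HIPAA's Safe Harbor method requires removing 18 identifier
# categories; these two rules only illustrate the idea.
def scrub(text: str) -> str:
    text = re.sub(r"\b[A-Z][a-z]+ [A-Z][a-z]+\b", "[NAME]", text)  # naive full-name match
    text = re.sub(r"\b\d{1,2}/\d{1,2}/\d{2,4}\b", "[DATE]", text)  # slash-style dates
    return text

note = "John Doe, seen 04/12/2024, tolerated the procedure well."
```

Running `scrub(note)` replaces the name and date with placeholders before the text goes anywhere near a Team-tier tool; anything this crude misses is exactly why identifier-heavy work belongs on a BAA-covered platform.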
BOTTOM LINE
For a broader view of compliance requirements beyond HIPAA, see our AI Compliance Guide. Or take our AI Readiness Assessment to get a recommendation tailored to your practice. Need implementation help? Schedule a discovery call with our team.