HIPAA and AI: What Clinical Users Need to Know

HIPAA

Feb 23, 2025

Sophia Carter

Here are key considerations for HIPAA-aligned use of AI in clinical documentation and workflow support.

- HIPAA still applies fully to AI tools

Using AI with protected health information (PHI) does not change your HIPAA obligations. Any AI system that accesses, uses, or discloses PHI must follow the same rules on permitted uses and disclosures as any other technology. This applies whether the tool supports documentation, summarization, or clinical operations.

- You must have a Business Associate Agreement (BAA) with AI vendors

If an AI tool will interact with PHI on your behalf, the vendor must sign a BAA and commit to HIPAA safeguards such as encryption, access controls, and audit logging. Without a BAA, using an AI service for PHI is a compliance risk.

- AI does not change the minimum necessary rule

HIPAA’s minimum necessary standard still applies: even when using AI, systems should access only the PHI strictly needed for the task at hand. Broad ingestion of PHI for model training or processing without limits increases compliance risk.
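As a rough illustration of the minimum necessary principle, a system that passes records to an AI service can whitelist only the fields the task actually requires. This is a minimal sketch; the field names and record shape are hypothetical, and a real implementation would be driven by your organization's data-governance policies.

```python
# Hypothetical example: strip a record down to only the fields needed
# for a note-summarization task before it ever reaches an AI service.

ALLOWED_FIELDS = {"visit_date", "chief_complaint", "assessment", "plan"}

def minimum_necessary(record: dict) -> dict:
    """Return a copy of the record containing only task-relevant fields."""
    return {k: v for k, v in record.items() if k in ALLOWED_FIELDS}

record = {
    "patient_name": "Jane Doe",   # not needed for summarization
    "ssn": "000-00-0000",         # never needed
    "visit_date": "2025-02-01",
    "chief_complaint": "persistent cough",
    "assessment": "likely viral bronchitis",
    "plan": "supportive care; follow up in 2 weeks",
}

filtered = minimum_necessary(record)
print(filtered)  # direct identifiers are gone; clinical content remains
```

The key design choice is an allow-list rather than a block-list: anything not explicitly approved for the task is excluded by default.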

- Governance and oversight of AI use are increasingly encouraged

Recent national guidance emphasizes that organizations should define clear policies about which AI tools are approved and how they are used, with responsibility assigned for evaluation and monitoring. This helps ensure safe, compliant deployment of AI across clinical and operational areas.

- OCR and regulators are watching discrimination and bias issues

Beyond privacy, federal guidance cautions that AI tools used in clinical support roles must be assessed for discriminatory impacts and biased outputs. This adds another compliance lens when evaluating AI systems for clinical workflows.

- Security requirements are being strengthened

Although proposed Security Rule changes are not finalized, HIPAA-related guidance continues to elevate expectations for technical security, including encryption, access management, logging, and incident response. This affects any system handling PHI, including AI that processes or stores patient data.

What This Means for Clinicians Using AI

Be deliberate about PHI access:
Before using any AI tool that sees or generates content based on patient data, confirm a BAA is in place and that the tool’s technical safeguards (encryption, role-based access, logs) meet HIPAA requirements.

Limit scope of AI use:
Where possible, use de-identified data, or restrict the PHI passed into AI systems to what is essential for the task (e.g., summaries or document organization rather than raw PHI ingestion).
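The de-identification step above can be sketched as a redaction pass applied to free text before it is sent to an AI service. These patterns are purely illustrative: real HIPAA de-identification (for example, the Safe Harbor method) covers 18 categories of identifiers, and simple regexes are not a substitute for a vetted de-identification process.

```python
import re

# Toy redaction pass; patterns are illustrative, not exhaustive.
PATTERNS = {
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "PHONE": re.compile(r"\b\d{3}-\d{3}-\d{4}\b"),
    "MRN": re.compile(r"\bMRN[:\s]*\d+\b"),
    "DATE": re.compile(r"\b\d{2}/\d{2}/\d{4}\b"),
}

def redact(text: str) -> str:
    """Replace matched identifiers with bracketed labels."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[{label}]", text)
    return text

note = "Pt MRN: 445821 seen 02/01/2025, callback 555-867-5309."
print(redact(note))
```

A production workflow would typically use a dedicated de-identification tool or service rather than hand-rolled patterns, but the principle is the same: identifiers are removed before the text leaves your controlled environment.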

Document governance:
Have written policies about who approves AI tools, how they are monitored, and what reviews are conducted to ensure ongoing compliance and security.

Keep clinical judgment central:
AI should assist with organization, synthesis, and workflow support; final interpretation and decisions remain under clinician control.