Compliance · 5 min read

AI Voice and HIPAA: What Every Practice Should Verify

If you run a dental, chiropractic, medical, or mental health practice, your AI voice receptionist handles Protected Health Information (PHI) from the first word. "HIPAA compliant" is a phrase every vendor uses. Here's what actually has to be true under the hood, and the red flags that should make you walk.

Not legal advice. This guide is a practical checklist for practice owners evaluating AI voice vendors. For a compliance review specific to your practice, work with your HIPAA officer or a qualified healthcare attorney.
01

Get the BAA in writing before you share any PHI

HIPAA requires a Business Associate Agreement (BAA) with any vendor that touches PHI on your behalf. For an AI voice agent, that includes call audio, transcripts, and any patient details the agent collects.

Ask for the BAA during evaluation, not after signing. If the vendor hesitates or says "we're working on that," treat it as a hard stop.

02

Confirm encryption — at rest and in transit

Call audio and transcripts must be encrypted both while being transmitted and while stored. Minimums worth asking about:

  • TLS 1.2 or higher for all network traffic
  • AES-256 for stored recordings and transcripts
  • Encryption keys managed by the vendor and never shared across customers or with third parties
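
If you want to spot-check the in-transit half yourself, a short Python sketch can report what TLS version a vendor's endpoint actually negotiates. The hostname is a placeholder you'd swap for the vendor's portal or API domain; this checks only the web endpoint you point it at, not the vendor's internal systems.

```python
import socket
import ssl

# Minimal sketch: a client context that refuses anything below TLS 1.2,
# the floor suggested in the checklist above.
ctx = ssl.create_default_context()
ctx.minimum_version = ssl.TLSVersion.TLSv1_2

def negotiated_tls_version(host: str, port: int = 443) -> str:
    """Connect to an endpoint and report the TLS version actually negotiated."""
    with socket.create_connection((host, port), timeout=5) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            return tls.version()  # a string such as "TLSv1.3"

# Usage (hypothetical hostname):
#   negotiated_tls_version("portal.example-vendor.com")
```

If the handshake fails outright with this context, the endpoint is offering something older than TLS 1.2, which is worth raising with the vendor.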
03

Confirm your data is not training a model

Many general-purpose AI services use customer data to improve their models by default. For PHI, this is a non-starter. Ask the vendor to state explicitly, in writing, that call audio and transcripts are not used for model training — and that any third-party LLM provider they use (OpenAI, Anthropic, Google, etc.) is bound by the same restriction through their enterprise agreement.

04

Ask about retention and deletion

HIPAA doesn't dictate exact retention periods, but your own policy should. Ask:

  • How long are call recordings retained by default?
  • Can retention be shortened on your request?
  • How is a patient's right-to-delete request honored across recordings and transcripts?
  • When the contract ends, what happens to your call history — deleted or returned?
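
The retention question is ultimately a simple rule the vendor should be able to enforce mechanically. A hedged sketch of what that rule looks like, with a hypothetical 90-day window (your own HIPAA program sets the real number):

```python
from datetime import datetime, timedelta, timezone
from typing import Optional

RETENTION_DAYS = 90  # hypothetical default; replace with your practice's policy

def is_past_retention(recorded_at: datetime,
                      now: Optional[datetime] = None) -> bool:
    """True when a recording's age exceeds the configured retention window."""
    now = now or datetime.now(timezone.utc)
    return now - recorded_at > timedelta(days=RETENTION_DAYS)
```

Whatever the vendor's actual implementation, the point of the questions above is that this window should be configurable per practice, and expired or deletion-requested recordings should be purged, not merely hidden.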
05

Audit the access controls

Who inside the vendor's organization can listen to your calls? The answer should be "a small number of named engineers under an audited access policy," not "whoever on support has a login." Ask for:

  • Role-based access controls
  • Access logging (who listened to what, when)
  • SSO/MFA required for admin accounts
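
The shape of a good answer to "who can listen?" combines all three bullets: deny by default, check a named-role allowlist, and write an audit record on every access. A sketch under those assumptions, with all names hypothetical:

```python
import json
from datetime import datetime, timezone

ALLOWED_ROLES = {"compliance-engineer"}  # hypothetical named-role allowlist

def listen_to_recording(user: str, roles: set, recording_id: str) -> str:
    """Deny by default; on success, return an audit-log line (who, what, when)."""
    if not roles & ALLOWED_ROLES:
        raise PermissionError(f"{user} may not access call recordings")
    return json.dumps({
        "user": user,
        "recording": recording_id,
        "at": datetime.now(timezone.utc).isoformat(),
    })
```

A vendor doesn't need to share code, but they should be able to show you the equivalent: the role list, and the access log.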
06

Verify the integration chain

Your AI receptionist probably writes into your CRM and scheduling system. Every hop is a BAA surface. Make sure the CRM (GHL, HubSpot, etc.) is also under BAA if it stores PHI, and that data flowing between systems is encrypted end to end.

07

Red flags — walk away

  • "We're SOC 2 compliant" offered as a substitute for a BAA. SOC 2 attests to security controls; it does not satisfy HIPAA or replace a BAA.
  • Recordings stored on a generic cloud bucket with no vendor BAA in place.
  • Vague language about training data ("we may use anonymized data to improve the service").
  • No way to delete a specific caller's data on request.
  • Contract terms that assign HIPAA liability back to you for the vendor's systems.

How Nova handles it

BAA available on request. Recordings and transcripts are encrypted at rest and in transit. No training on your call data. Retention configurable per practice. If you want the specifics in writing before evaluating Nova, book a call.