This is an illustrative scenario based on common patterns we see across SMBs. Not a real client engagement. Names, figures, and details are constructed to reflect typical situations in this industry.
Common starting state
A 25-person specialty healthcare practice—physicians, nurse practitioners, front-desk staff, and a billing team—is in a position that many practices don't recognize as precarious. They're not using AI in any formal sense. But their EHR has added AI-powered note assistance. Their scheduling software has a chatbot. One of the physicians uses an AI transcription tool for patient notes that they found independently and pay for personally.
The practice is HIPAA-covered. That means any tool that touches protected health information (PHI)—which is most of what they do—requires a Business Associate Agreement (BAA) with the vendor. A BAA is a legal contract that says, in essence, the vendor promises to handle PHI according to HIPAA requirements and will notify you if there's a breach.
The EHR vendor has a BAA. The scheduling software does not. The AI transcription tool the physician is using personally: no BAA, no documentation, no review—and it processes patient dictations in full.
Risks identified
Healthcare AI audits surface a specific class of problem: the tools are often less dangerous than the process gaps around them.
Unsanctioned tool with PHI access. The physician's personal transcription tool is the highest-severity finding in this scenario. PHI is flowing to a vendor with no BAA, and under HIPAA the absence of a BAA is itself a violation, regardless of whether any data was misused.
EHR vendor AI features without review. The AI note assistance the EHR added is likely covered under the existing BAA—but "likely" isn't a compliance answer. The practice needs to verify their current BAA covers AI features and data processing, not just the original product scope. Vendors add AI features to existing products faster than they update their BAA language.
Scheduling chatbot with appointment data. Patient appointment data—name, date, reason for visit, contact info—is PHI. If the scheduling software's chatbot collects or processes that data without a BAA, it's the same exposure as the transcription tool.
Staff training gap. Front-desk staff interact with scheduling and patient contact systems daily. They have not received any guidance on what AI tools are in use, what data they process, or what to do if something seems wrong.
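The findings above all reduce to the same check: does a tool touch PHI, and if so, is there a signed BAA that covers its AI features? A minimal sketch of that inventory in Python follows; the tool names, field names, and `findings` helper are illustrative, not drawn from any real audit tooling.

```python
# Illustrative AI-tool inventory for a HIPAA compliance review.
# All names and fields are hypothetical examples, not a real audit record.

TOOLS = [
    {"name": "EHR note assistant",     "touches_phi": True, "has_baa": True,  "baa_covers_ai": False},
    {"name": "Scheduling chatbot",     "touches_phi": True, "has_baa": False, "baa_covers_ai": False},
    {"name": "Personal transcription", "touches_phi": True, "has_baa": False, "baa_covers_ai": False},
]

def findings(tools):
    """Flag any tool that handles PHI without a signed BAA that covers its AI features."""
    return [t["name"] for t in tools
            if t["touches_phi"] and not (t["has_baa"] and t["baa_covers_ai"])]

print(findings(TOOLS))  # all three tools in this scenario are flagged
```

Note that the EHR assistant is flagged even though a BAA exists: until the practice verifies the BAA's scope covers AI processing, it cannot be marked compliant.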
What we'd typically recommend
HIPAA requires specific, documented controls. The recommendations here aren't optional enhancements—they're baseline requirements.
- Get a BAA or remove the tool. For each AI-adjacent tool in the stack, the practice needs one of two things: a signed BAA from the vendor, or confirmation that the tool doesn't touch PHI. For the personal transcription tool, the physician either signs up for an enterprise plan that includes a BAA, or stops using it for patient dictations. There's no third option.
- Review existing BAA scope. Pull the EHR vendor BAA and confirm it explicitly covers AI features. If it doesn't, request an updated one. This is a standard ask—EHR vendors have done thousands of these.
- Audit the scheduling vendor. Contact the vendor in writing, ask whether their chatbot processes PHI, and ask for their BAA. Document the response. If they don't offer a BAA, the chatbot feature needs to be turned off or replaced.
- Identify safe AI workflows. Not everything in a healthcare practice touches PHI. Administrative drafting—grant summaries, marketing copy, staff meeting agendas, supply vendor emails—can use AI tools freely. The practice should identify these explicitly so staff know where AI assistance is appropriate.
- Update training with a PHI-boundary focus. A brief, practical briefing: what is PHI, what tools are approved for workflows that touch it, what tools are approved for workflows that don't, and how to report if something seems wrong. This doesn't need to be a full HIPAA refresher—it needs to be specific to AI.
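The recommendations above collapse to one decision rule a staff briefing can teach: a workflow that touches PHI may use an AI tool only when a BAA is in place; a workflow with no PHI is fine either way. A minimal sketch, with a hypothetical function name:

```python
def ai_allowed(touches_phi: bool, vendor_has_baa: bool) -> bool:
    """Decision rule: PHI workflows require a BAA; non-PHI workflows don't."""
    return (not touches_phi) or vendor_has_baa

# Examples mirroring the scenario:
ai_allowed(touches_phi=False, vendor_has_baa=False)  # administrative drafting: allowed
ai_allowed(touches_phi=True,  vendor_has_baa=False)  # personal transcription tool: not allowed
ai_allowed(touches_phi=True,  vendor_has_baa=True)   # BAA-covered EHR feature: allowed
```

The rule is deliberately binary: there is no "probably fine" state, which matches the point above that "likely" isn't a compliance answer.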
Outcome to expect
The two safe workflows identified in this scenario are: AI drafting for administrative communications (vendor emails, scheduling reminders written by staff rather than auto-generated, internal memos) and AI assistance for literature research that uses no patient data. Both are low-risk, genuinely time-saving, and don't require BAAs because no PHI flows through them.
The three avoided risks are the transcription tool, the scheduling chatbot, and any use of general-purpose AI tools for clinical documentation without a BAA in place. Each of those could have resulted in a reportable HIPAA incident.
The timeline for getting to a documented compliant position in a 25-person practice is typically four to six weeks: two weeks to complete the vendor reviews, one week to draft and disseminate the policy, and two to three weeks for vendors to return BAA paperwork, with the paperwork wait typically overlapping the policy work.
The practical outcome isn't that the practice uses more AI—it's that they use AI in the places where it's safe to do so, with documentation that shows they thought it through.