AI and HIPAA: How to Save Time Without Risking Your Practice

by Jessica Freeman

TLDR: The Golden Rule of AI & HIPAA: “If you wouldn’t post it on a billboard, don’t paste it into ChatGPT.” Unless you have a signed Business Associate Agreement (BAA) with an AI vendor, you must assume that everything you type into the chat box is being read, stored, and used to train the system. When in doubt, keep client data out.

If you run a private practice, you know the drill: HIPAA (the Health Insurance Portability and Accountability Act) is the federal law designed to protect Protected Health Information (PHI). But in 2026, complying with HIPAA isn’t just about locking filing cabinets or using a secure email server.

We are living in the age of AI. Tools like ChatGPT and automated schedulers are revolutionizing how we run businesses, promising to save us hours of admin time. But there is a friction point: HIPAA was built for a world of paper records and locked doors, and AI does not play by those rules.

As a web designer who works exclusively with private practice owners, I see the backend of many businesses. I’ve watched the shift from simple contact forms to complex AI integrations, and I’ve seen how easy it is to accidentally cross a line you didn’t know was there.

Before you integrate that new “time-saving” tool, you need to understand the new rules of the road. Here is everything you need to know about navigating the intersection of AI and HIPAA without putting your license—or your clients—at risk.

HIPAA and Why It Matters in the Age of AI

PHI isn’t just a diagnosis. It’s any piece of information—a name, an IP address, or even a session date—that can be linked to an individual’s health status.

HIPAA was built for a “deterministic” world (where input A always leads to output B). But artificial intelligence is “probabilistic”: it guesses, it predicts, and, crucially, it learns.

When you introduce AI into your practice, you are introducing a system that devours data to get smarter. If that data is your client’s private history, you are walking a regulatory tightrope. And with the Security Rule updates the government proposed in January 2025, the direction is clear: encryption and strict access controls are no longer optional “addressable” features; they are being treated as requirements.

Is ChatGPT HIPAA compliant for private practices?

No. Using the free version of ChatGPT to summarize your client notes is a HIPAA violation.

Just because a tool is popular doesn’t mean it’s safe. Most general-use AI platforms (like standard ChatGPT, Claude, or Gemini) state in their terms of service that they use input data to train their models. If you feed a client’s trauma or health history into a public chatbot, that data potentially becomes part of the AI’s brain, technically accessible to others.

To be compliant, you must verify that the AI vendor will sign a Business Associate Agreement (BAA). This is a legally binding contract where the vendor promises to protect PHI and—most importantly—promises not to use your data to train their public models. If they won’t sign a BAA, they aren’t for you.

Using AI in a Clinical Setting

Because I build websites for health practitioners and therapists, I see the backend of many systems, and the risks I see aren’t usually malicious; they are accidental.

  • Data Leakage and “Shadow AI”: This happens when well-meaning staff use unapproved AI tools to speed up work. For example, using a random browser extension to record a telehealth session. If that extension sends audio to a non-compliant server, you’ve breached HIPAA.
  • Hallucinations impacting Integrity: AI “hallucinates,” meaning it confidently invents facts. If an AI scribe invents a diagnosis that you didn’t say, and that gets into the Electronic Health Record (EHR), you have corrupted the integrity of the medical record. This creates liability for malpractice and false claims.
  • The “Mosaic Effect”: You might think, “I removed the client’s name, so it’s safe.” Not necessarily. AI is powerful enough to cross-reference anonymized data with public datasets (like voter rolls or social media) to re-identify individuals. This is called the Mosaic Effect, and it renders simple “name scrubbing” insufficient.

Common Mistakes Private Practice Owners Make with AI

1. The “Pixel War” Trap: Using Tracking Pixels on Social Media and Websites

Can tracking pixels really violate HIPAA? Yes. The FTC and the Office for Civil Rights (OCR) have recently cracked down on this. If you have a Meta (Facebook) Pixel or Google Analytics tracker on your site, it is collecting your visitors’ IP addresses. An IP address seems harmless on its own, but the government argues that if it is captured while a user is on a page like “Treatment for Postpartum Depression,” that combination reveals a health condition.

This is a huge issue in web design right now. Recent class-action lawsuits (like the one against Kaiser Permanente) and FTC rulings have determined that transmitting a user’s IP address alongside their intent to seek healthcare (e.g., visiting an “Oncology” page or a “Schedule a Consultation” page) counts as a PHI disclosure.

The Fix: You need to switch to “server-side tracking” or use HIPAA-compliant analytics tools that anonymize data before it reaches Facebook or Google (see the sketch after “The Rule” below).

The Rule: If you are a covered entity (which most private practices are), you cannot use standard tracking pixels on pages where health services are researched or booked unless that tracker signs a BAA (which Facebook and Google generally will not do for standard analytics).
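
To make “server-side tracking” concrete, here is a minimal sketch of the idea in TypeScript (Node 18+), forwarding a bare-bones event to Google’s GA4 Measurement Protocol. The /track endpoint, the environment variable names, and the choice to send a random client ID are my own illustrative assumptions, not a turnkey compliance solution; have your actual setup vetted before you rely on it.

```ts
// Minimal sketch: a first-party endpoint that forwards page events to
// Google Analytics WITHOUT exposing the visitor's IP address.
// Assumes Node 18+ (global fetch) and the GA4 Measurement Protocol.
import http from "node:http";
import { randomUUID } from "node:crypto";

// Placeholder values; substitute your own GA4 credentials.
const GA_MEASUREMENT_ID = process.env.GA_MEASUREMENT_ID ?? "G-XXXXXXXXXX";
const GA_API_SECRET = process.env.GA_API_SECRET ?? "your-api-secret";

http
  .createServer(async (req, res) => {
    if (req.method === "POST" && req.url === "/track") {
      let body = "";
      for await (const chunk of req) body += chunk;
      const { eventName } = JSON.parse(body); // e.g., "page_view"

      // Google sees YOUR server's IP here, never the visitor's.
      // A random client_id per event sacrifices session stitching
      // in exchange for not keeping a persistent identifier.
      await fetch(
        `https://www.google-analytics.com/mp/collect?measurement_id=${GA_MEASUREMENT_ID}&api_secret=${GA_API_SECRET}`,
        {
          method: "POST",
          body: JSON.stringify({
            client_id: randomUUID(),
            events: [{ name: eventName, params: {} }],
          }),
        }
      );
      res.writeHead(204);
      res.end();
    } else {
      res.writeHead(404);
      res.end();
    }
  })
  .listen(3000);
```

The design choice doing the work here is that the visitor’s browser only ever talks to your domain; what (if anything) gets forwarded to a third party, and with which identifiers, is entirely under your control.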

2. Trusting “Anonymization” Too Much

Inputting session notes into an AI and asking it to “rewrite this without names” is risky. The moment you input the data, the breach has likely already occurred if the platform saves chat history for training.

3. Ignoring the “Draft-Only” Rule

Some practitioners let AI write emails or notes and send them without review. If the AI exhibits bias (a violation of Section 1557 of the ACA) or hallucinates a medication, you are liable. AI should only ever produce a draft.

4. The “Browser Extension” Blindspot

This is the most common violation I see. You might be careful not to paste notes into ChatGPT, but do you have an AI writing assistant (like a grammar checker or “co-pilot”) installed as a Chrome extension? Many of these extensions have permission to “read and change data on all websites.” If you have your EHR (like SimplePractice or TherapyNotes) open in one tab and that extension is active, it may be reading the patient data on your screen to offer “writing suggestions.” Use a separate browser (or an “Incognito” window without extensions) strictly for clinical work.

5. The “Silence” Trap (Lack of Transparency)

Some practitioners use AI scribes or chatbots but are afraid to tell their clients because they fear pushback. If a client finds out later that a machine was listening to their trauma narrative without their explicit consent, the trust is broken forever. Furthermore, new state laws (like in California and Utah) and updated ethical codes are increasingly mandating that you must disclose when AI is being used. Update your informed consent forms to include a specific “AI Technology” clause and discuss it verbally during intake.

6. The “Small Fish” Fallacy

“I’m just a solo practitioner; the OCR (Office for Civil Rights) isn’t looking at me.” Automated enforcement doesn’t care how small you are. The “Pixel” lawsuits often target specific technologies, not just specific companies. If a plaintiff’s attorney runs a script that detects a tracking pixel on 1,000 different therapy websites, you get swept up in the net regardless of your size. Treat your solo practice data with the same rigor as a hospital. Security by obscurity is not a strategy.
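
You can run the same kind of scan on yourself before anyone else does. Assuming you’re comfortable opening your browser’s DevTools console on your own site, this snippet lists loaded scripts that match a few well-known tracker domains; the domain list is a starting point, not an exhaustive audit.

```ts
// Paste into the DevTools console while viewing your own website.
// Lists script URLs that match a few common tracker domains.
const trackers = Array.from(document.scripts)
  .map((script) => script.src)
  .filter((src) =>
    /facebook\.net|googletagmanager\.com|google-analytics\.com|doubleclick\.net/.test(src)
  );
console.log(trackers.length ? trackers : "No common trackers detected");
```

If anything shows up on a page where clients research or book care, revisit the fix from Mistake #1.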

What counts as a HIPAA violation when using AI?

It’s not just about “hacking.” Any of the following counts as a violation:

  1. Inputting PHI: You paste a client’s email or symptoms into a non-BAA tool.
  2. Lack of Integrity: The AI “hallucinates” (invents) a symptom in a client’s record, and you fail to catch it before saving. HIPAA requires you to maintain accurate records; an unverified AI note is a compliance breach.
  3. Unintentional Training: You use a tool that claims to be private but actually uses your data to train its “foundation model,” effectively teaching the AI about your client’s trauma.

What AI tools can healthcare providers use safely?

You don’t have to ban AI from your private practice to be safe. In fact, ignoring AI might mean you’re working harder than you need to. The trick is to treat AI like a “Digital Intern”—one that is incredibly smart but has not signed a confidentiality agreement.

You can safely give this intern general tasks, creative projects, and public-facing work. You just never, ever let them see a patient chart.

Here is a practical breakdown of how you can use AI right now to save hours every week, risk-free.

1. Marketing & Content Creation (The “Green Light” Zone)

This is the safest place to use tools like ChatGPT, Claude, or Jasper because you are dealing with public information, not private health data.

  • Brainstorming & Ideation: Stare at a blank screen no more. Ask AI: “I am a dietitian focusing on intuitive eating. Give me 10 blog post titles that address holiday eating anxiety without sounding judgmental.”
  • SEO & Meta Data: As a web designer, I see so many practitioners skip this because it’s boring. Paste your (generic) blog post into AI and ask: “Write a 160-character meta description for this post that includes the keyword ‘anxiety therapy in Chicago’.”
  • Social Media Captions: You can describe an image you took (e.g., “a photo of a calm office with a plant”) and ask AI to write an engaging Instagram caption about the importance of creating a safe space.
  • The “Frankenstein” Rule: If you want to write a case study, do not upload a real client’s story and ask AI to disguise it. Instead, ask AI to generate a fictional scenario: “Create a persona of a 30-year-old struggling with work-life balance and burnout, and list 3 common coping mechanisms they might try.” This keeps your real clients completely out of the equation.

2. Operational Efficiency (The “Backend” Zone)

Running a private practice is 50% clinical work and 50% unbilled admin. AI can handle the unbilled part if you keep the data generic.

  • Polishing Difficult Emails: Need to enforce your cancellation policy but afraid of sounding harsh? Draft the email without the client’s name or specific details.
    • Prompt: “Rewrite this email to sound firm but empathetic: ‘You missed your appointment again and I have to charge you the full fee per my policy.'”
    • Result: You get a professional script you can copy-paste into your secure email system, adding the client’s name after you leave the AI tool.
  • Excel & Spreadsheet Formulas: If you are trying to track expenses or analyze your referral sources in Excel but hate math, ask AI (see the example after this list).
    • Prompt: “Write an Excel formula that calculates the percentage of total inquiries that came from ‘Psychology Today’.”
  • Policy Generation: Need a new “Social Media Policy” for your practice website? AI can draft a robust template in seconds. You just need to review it to ensure it matches your actual boundaries.
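
For the referral-source prompt above, the AI will typically hand back something like the formula below. It assumes (my assumption, for illustration) that each inquiry is a row and the referral source lives in column A; adjust the range to match your sheet, then format the cell as a percentage.

```
=COUNTIF(A2:A500,"Psychology Today")/COUNTA(A2:A500)
```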

3. Clinical Support (The “Reference” Zone)

You can use AI as a super-powered search engine or a creative clinical partner, provided you are pulling information out rather than putting client info in.

  • Psychoeducation Metaphors: Stuck on how to explain a complex concept?
    • Prompt: “Explain the ‘Window of Tolerance’ using a metaphor that would make sense to a teenage gamer.”
    • Prompt: “Give me 3 simple analogies to explain how insulin resistance works to a client who loves gardening.”
  • Worksheet & Resource Creation: Need a handout for a session in 10 minutes?
    • Prompt: “Create a checklist of 5 grounding techniques for panic attacks that can be done in a public place.”
    • Prompt: “Draft a weekly meal planning template that focuses on adding nutrients rather than restricting calories.”
  • Intervention Ideas: If you are feeling stuck with a general presentation (e.g., “grief that feels like anger”), ask AI for interventions. Just ensure you don’t input who the client is.

4. Clinical Documentation (The “Proceed with Caution” Zone)

This is the holy grail of time-saving: AI scribes that listen to your sessions and write your SOAP notes. This is also the only category where you MUST spend money; free consumer tools here are not HIPAA-compliant.

To do this safely, you must move away from “open” tools (like the public ChatGPT) and use specialized “closed” tools designed for healthcare.

  • The “BAA” Mandate: You must use a tool that will sign a Business Associate Agreement (BAA). This is non-negotiable.
  • Zero Data Retention (ZDR): Look for tools that explicitly promise ZDR. This means the AI processes the audio in real-time to create the text, and then poof—the audio recording is permanently deleted. It never sits on a server where a hacker could find it.
  • The Workflow:
    1. Turn on the HIPAA-compliant app (e.g., Heidi Health, Freed, DeepScribe) at the start of the session (with client consent).
    2. The AI listens and transcribes.
    3. The AI formats the note into SOAP or DAP format.
    4. You review and edit the note for accuracy (correcting any hallucinations).
    5. You copy the final note into your EHR (SimplePractice, Jane, etc.) and delete it from the AI app if it doesn’t auto-delete.

How to Use AI Responsibly in Private Practice

If you want to sleep well at night, follow this “Responsible Use” framework:

  1. The BAA Test: Does the vendor sign a BAA? If no, no PHI touches it.
  2. Zero Training Clause: Ensure your contract explicitly states that your data will not be used to train the vendor’s foundation models.
  3. Human-in-the-Loop (HITL): Never let AI be the final decision-maker. You must review every output.
  4. Transparency: New laws, like those popping up in California, require you to disclose if a chatbot is AI. Be honest with your clients: “This chat is automated.”

Questions About AI and HIPAA

Can I use ChatGPT to write client notes if I don’t use names?

Generally, no. Unless you are on the Enterprise plan with a signed BAA and have configured “Zero Data Retention,” standard ChatGPT uses your inputs for training. Even without names, the clinical context can be re-identified.

Is it safe to use AI chatbots on my website?

It is risky. If the chatbot is provided by a third party that records the conversation for their own “product improvement,” you could be liable under wiretapping laws (CIPA) or HIPAA. Ensure the chatbot vendor signs a BAA and does not retain the data.

What kind of AI tools actually support HIPAA compliance?

Look for “Enterprise” versions of major platforms (Microsoft Azure, AWS) or specific Health-Tech SaaS products designed for practitioners (e.g., AI Scribes like DeepScribe or Heidi Health) that openly advertise their security protocols and BAA availability.

What happens if I accidentally input PHI into a non-compliant AI tool?

This is a data breach. You may need to follow your breach notification protocols, which could involve notifying the client and the Office for Civil Rights (OCR), depending on the scale and risk of compromise.

Use AI Smarter, Not Riskier

AI is an incredible asset for small business owners. It can give you back hours of your week and help you compete with larger organizations. But in healthcare, trust is your currency.

Don’t let the allure of efficiency blind you to the reality of data privacy. Focus your AI use on content creation and business operations, and be incredibly picky about the tools you let near your client data.

Jessica Freeman is a Web Designer and SEO Strategist exclusively for private practice owners. With a background and degree in design, she helps therapists, dietitians, and practitioners stop chasing clients and start attracting them. Jess doesn’t just build “pretty” websites; her websites are designed to rank on Google and fill your client roster. When she’s not auditing websites or geeking out over conversion rates, you can find her drinking Diet Dr Pepper and reading the latest thriller novel on the couch.

