Mental Health Privacy: HIPAA, On-Device AI, and Therapy Confidentiality

Mental health care is built on a single, sacred promise: what you say here stays here. Therapists, psychologists, psychiatrists, and counselors spend years building trust with their clients. That trust enables vulnerable disclosures that would never happen in any other professional relationship—admissions of trauma, suicidal ideation, substance abuse, relationship struggles, and deeply personal fears.

Now, a growing number of mental health professionals are adopting AI transcription tools to streamline their clinical documentation. The appeal is obvious: instead of spending 15–20 minutes after each session writing progress notes, an AI tool can generate a transcript, extract key themes, and even draft clinical summaries. But here's the problem—most of these tools send your patient's most intimate disclosures to the cloud.

And that changes everything.

The Unique Sensitivity of Therapy Data

Not all meeting data is created equal. A sales pipeline review or a product roadmap sync is sensitive in a business context, but therapy session data occupies an entirely different category. According to a Wired investigation into mental health app privacy practices, the data generated in therapy contexts is among the most sensitive information that exists—more personal than financial records, more revealing than medical histories.

Consider what a single therapy session might contain:

  1. Disclosures of trauma or abuse
  2. Suicidal ideation and safety concerns
  3. Substance use the patient has confided to no one else
  4. Relationship and family struggles, often naming third parties
  5. Deeply personal fears, in the patient's own words

Now imagine that entire transcript sitting on a cloud server operated by a third-party AI company, accessible to engineers, potentially used for model training, and subject to data breaches. The thought alone should make every clinician reconsider their workflow.

HIPAA's Special Protection for Psychotherapy Notes

Federal law recognizes the extraordinary sensitivity of therapy content. Under the HIPAA Privacy Rule's provisions for mental health, psychotherapy notes receive heightened protection that goes beyond standard Protected Health Information (PHI).

Specifically, psychotherapy notes are defined as a therapist's personal notes documenting the contents of a counseling session—and they are kept separate from the rest of the medical record. Under HIPAA:

  1. Disclosing psychotherapy notes requires the patient's specific, separate authorization, even for most treatment, payment, and operations uses that a general consent would otherwise cover
  2. A health plan cannot condition enrollment or payment on obtaining that authorization
  3. The notes must actually be kept separate from the rest of the medical record to qualify for this heightened protection

⚠️ The Cloud Compliance Gap

When a therapist uses a cloud-based AI transcription service, the raw audio and resulting transcript are transmitted to and processed on remote servers. This means a third-party business associate now has access to content that HIPAA treats with the highest level of protection. Most cloud transcription services do not have Business Associate Agreements (BAAs) specific to psychotherapy notes—and many explicitly exclude mental health content from their compliance guarantees.

How Cloud Transcription Services Handle Your Data

Let's look at what actually happens when you use popular cloud transcription tools in a clinical setting.

Audio Upload and Retention

Services like Otter.ai and Fireflies.ai require audio to be uploaded to their cloud infrastructure for processing. Otter.ai's privacy policy states that they collect and store "audio recordings, transcriptions, and related content" and may retain this data even after you delete your account, for legal and business purposes.

For a therapist, this means your patient's raw audio—their voice describing trauma, their tone when discussing suicidal thoughts—exists on servers you don't control, subject to policies you didn't negotiate.

Data Usage and Model Training

Many cloud AI services reserve the right to use uploaded content to improve their models. As TechCrunch has reported, the fine print in AI company terms of service frequently grants broad rights to use "content" for "service improvement"—a euphemism for training data. Your patient's therapy session could literally become training data for a commercial AI product.

Third-Party Access

Fireflies.ai's privacy policy discloses that data may be shared with "service providers, business partners, and affiliates." In a therapy context, this means entities the patient has never heard of—and never consented to—may process their most intimate disclosures.

As we explored in our article on workplace surveillance and AI transcription, the gap between what users expect and what cloud services actually do with data is enormous. In a clinical context, that gap becomes an ethical chasm.

Ethical Obligations Beyond the Law

Legal compliance is the floor, not the ceiling. Mental health professionals are bound by ethical codes that impose obligations above and beyond what HIPAA requires.

APA Ethics Code

The American Psychological Association's Ethical Principles (Standard 4.01) states that psychologists must take "reasonable precautions to protect confidential information." Using a cloud service that retains, analyzes, or potentially exposes therapy content raises serious questions about whether "reasonable precautions" have been met.

Informed Consent Challenges

If a therapist uses a cloud transcription tool, genuine informed consent would require explaining to the patient:

  1. Their raw audio will be transmitted to a third-party company
  2. That company may retain the audio and transcript indefinitely
  3. The data may be used to train AI models
  4. Engineers and employees may access the content for quality assurance
  5. The data is subject to subpoena and government requests
  6. A data breach could expose everything discussed in therapy

How many patients would consent to therapy under those conditions? The mere act of disclosing these risks could undermine the therapeutic alliance—the foundational trust that makes therapy effective.

The Breach Scenario: When Therapy Data Leaks

Data breaches in healthcare are not hypothetical. According to the HHS Breach Portal, hundreds of healthcare data breaches affecting 500+ individuals are reported every year. In 2024 alone, over 133 million healthcare records were compromised.

Now imagine a breach at a cloud transcription service that processes therapy sessions. The leaked data wouldn't be billing codes or appointment dates—it would be raw transcripts of people's deepest psychological vulnerabilities. The consequences are devastating: blackmail and extortion, public exposure of stigmatized conditions, damage to careers and relationships, and the collapse of patients' trust in treatment itself.

This isn't abstract. In 2020, the Finnish psychotherapy center Vastaamo suffered a data breach that exposed entire therapy session notes for tens of thousands of patients. Hackers individually blackmailed patients, threatening to publish their therapy transcripts. Multiple patients died by suicide. It remains one of the most devastating privacy breaches in history.

On-Device Transcription: The Only Ethical Path

There is a way to capture the productivity benefits of AI transcription without the privacy catastrophe: on-device processing.

When transcription happens entirely on your device—never leaving your iPhone, iPad, or Mac—the threat model collapses. There is no cloud server to breach, no third-party with access, no training data pipeline consuming your patient's words, and no data retention policy to worry about.

🛡️ How Basil AI Protects Therapy Notes

Basil AI processes all audio using Apple's on-device Speech Recognition framework—the same technology that powers Siri's local processing. Here's what that means for mental health professionals:

  1. Audio and transcripts never leave your iPhone, iPad, or Mac
  2. There is no cloud server to breach and no third party with access
  3. No patient data enters a model-training pipeline
  4. There is no vendor retention policy to audit, because no vendor ever holds the data
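For developers curious how local-only recognition is enforced, Apple's Speech framework exposes a flag that makes a recognition request fail rather than fall back to Apple's servers. The sketch below is illustrative, not Basil AI's actual source; the function name and file URL are assumptions:

```swift
import Speech

// Hedged sketch: transcribing an audio file with recognition pinned
// to the device (iOS 13+ / macOS 10.15+). Requires prior speech
// recognition authorization via SFSpeechRecognizer.requestAuthorization.
func transcribeLocally(audioFileURL: URL) {
    guard let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US")),
          recognizer.supportsOnDeviceRecognition else {
        print("On-device recognition is unavailable for this locale or device")
        return
    }

    let request = SFSpeechURLRecognitionRequest(url: audioFileURL)
    // The critical line: with this set, the request errors out instead
    // of silently sending audio to Apple's servers.
    request.requiresOnDeviceRecognition = true

    recognizer.recognitionTask(with: request) { result, error in
        if let result = result, result.isFinal {
            print(result.bestTranscription.formattedString)
        }
    }
}
```

Because `requiresOnDeviceRecognition` is a hard constraint rather than a preference, audio physically cannot leave the device through this API once it is set.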

Practical Workflows for Mental Health Professionals

Here's how therapists and counselors can integrate on-device AI transcription into their clinical workflow without compromising confidentiality:

1. Session Documentation

Start Basil AI at the beginning of a session. The app transcribes in real time on your device. After the session, review the transcript, extract clinically relevant content for your progress notes, and delete the raw transcript. The clinical note goes into your EHR; the raw audio and transcript never exist anywhere but your device.

2. Supervision and Consultation

Trainees and early-career therapists often record sessions for clinical supervision. With on-device transcription, the supervisor can review transcripts without either party worrying about cloud exposure. As we discussed in our article on privacy in educational settings, supervision shares many of the same confidentiality dynamics.

3. Group Therapy

Group sessions introduce additional complexity because multiple patients' data is intertwined. Cloud processing of group therapy audio would expose every participant's disclosures to every risk simultaneously. On-device processing eliminates this concern entirely.

4. Assessment and Intake

Comprehensive psychological assessments and intake interviews can run 2–4 hours. Basil AI's 8-hour recording capability handles these extended sessions easily, and the AI-generated summaries help clinicians draft assessment reports faster.

The State Licensing Board Risk

Beyond HIPAA and ethical codes, mental health professionals face practice-ending consequences from state licensing boards for confidentiality breaches. A licensing complaint triggered by improper use of cloud transcription tools could result in:

  1. A formal investigation and a public disciplinary record
  2. Mandatory remedial ethics training
  3. Suspension or revocation of your license to practice

No productivity tool is worth your license. If you use AI transcription in clinical practice, the only defensible approach is one where patient data never leaves your possession.

The Future of Clinical Documentation

The mental health field is moving toward AI-assisted documentation—that trend is irreversible. The question is not whether AI will help clinicians write notes, but whether that AI respects the fundamental confidentiality that makes therapy possible.

On-device AI represents the convergence of productivity and ethics. You get real-time transcription, intelligent summaries, and documentation assistance—all without a single byte of patient data touching a cloud server.

For mental health professionals, the choice should be clear: your patients trusted you with their most vulnerable moments. That trust deserves technology that is worthy of it.

Protect Your Patients' Confidentiality

Basil AI processes everything on your device. No cloud. No third parties. No risk to the therapeutic relationship. Try the privacy-first AI note-taker built for professionals who handle sensitive information.