Therapy is built on one non-negotiable foundation: confidentiality. A patient who walks into a session trusts that their darkest fears, traumas, and vulnerabilities stay between them and their therapist. For decades, this trust was upheld by locked filing cabinets and strict ethical codes.
Now, a wave of AI transcription tools promises to revolutionize clinical documentation. Record the session, get instant notes, save hours on paperwork. It sounds transformative—and it is. But there’s a problem that most therapists haven’t considered: where does that audio go?
According to a Wired investigation into AI therapy tools, many cloud-based services retain patient audio and transcripts on remote servers, often with vague data retention policies. For a profession bound by some of the strictest confidentiality standards of any field, this is a crisis hiding in plain sight.
The Unique Sensitivity of Therapy Session Data
Not all meeting transcripts are created equal. A product standup or marketing brainstorm may contain confidential information, but a therapy transcript holds something far more dangerous if exposed:
- Diagnoses and treatment plans that could affect employment, insurance, and custody battles
- Admissions of substance abuse that carry legal consequences in some jurisdictions
- Suicidal ideation disclosures that are among the most sensitive data imaginable
- Descriptions of abuse or trauma that could re-traumatize patients if leaked
- Relationship conflicts that could be weaponized in divorce proceedings
A breach of this data doesn’t just violate privacy—it can destroy lives. And yet, therapists are adopting cloud AI transcription tools at a staggering rate without understanding what happens to that audio once it leaves their device.
HIPAA: The Law Is Clear, Even If Vendors Aren’t
The HIPAA Privacy Rule establishes national standards for the protection of individually identifiable health information. Psychotherapy notes receive even stronger protections than standard medical records under HIPAA—they cannot be disclosed even to the patient’s insurance company without explicit authorization.
Many popular transcription services either refuse to sign business associate agreements (BAAs) or offer them only on enterprise plans that cost thousands per year. And even with a BAA in place, a data breach at the vendor still exposes your patients. As the HHS Breach Notification Rule makes clear, therapists are ultimately responsible for safeguarding their patients’ information regardless of which vendors they use.
What Cloud Transcription Services Actually Do With Your Audio
Let’s trace what happens when a therapist hits “record” on a typical cloud-based AI transcription tool:
- Audio capture: The raw audio file is created on the device
- Cloud upload: The audio is transmitted to remote servers (often AWS, Google Cloud, or Azure)
- Processing: The audio is processed by AI models on those servers
- Storage: Both the audio and resulting transcript are stored on cloud infrastructure
- Retention: Data may be retained for days, months, or indefinitely depending on the vendor
- Training: Some vendors reserve the right to use your data to improve their AI models
Consider Otter.ai’s privacy policy, which grants them broad rights to process and store your content on their servers. Or Fireflies.ai’s privacy policy, which similarly involves cloud storage and third-party processing infrastructure. For a therapist, every one of these steps represents a point where a patient’s most intimate disclosures could be exposed, subpoenaed, or stolen.
A TechCrunch report on healthcare data breaches found that breaches affecting healthcare organizations reached record levels, with cloud infrastructure being a primary attack vector. Mental health data is particularly valuable on the black market because it contains such deeply personal information.
The Ethical Dimension: Beyond Legal Compliance
For therapists, HIPAA compliance is the floor, not the ceiling. The APA Ethics Code and equivalent codes for counselors, social workers, and psychiatrists all impose duties of confidentiality that go beyond what the law requires.
“Psychologists have a primary obligation and take reasonable precautions to protect confidential information.” — APA Ethical Principles of Psychologists, Standard 4.01
“Reasonable precautions” in 2026 means understanding where your digital tools send patient data. A therapist who uses a cloud AI transcription service without fully understanding its data practices isn’t just risking a HIPAA fine—they’re violating the ethical foundation of the therapeutic relationship.
As we explored in our article on cloud risks in remote work settings, VPNs and encryption don’t eliminate the fundamental problem: if data exists on someone else’s server, it’s vulnerable.
Informed Consent Complications
Most therapists who record sessions obtain written consent from their patients. But does that consent cover sending audio to a cloud server operated by a third-party technology company? In most cases, patients assume their therapist is the only person who will hear the recording. They don’t expect their session to be processed by an AI model running on Amazon’s servers.
True informed consent for cloud AI transcription would require telling patients: “I’m going to send a recording of everything you say to a company’s servers, where it will be processed by AI, stored for an undefined period, and potentially used to train future AI models.” How many patients would consent to that?
The On-Device Solution: Privacy by Architecture
There’s a fundamental difference between “privacy by policy” and “privacy by architecture.” Cloud services promise privacy through policies and contracts. On-device processing delivers privacy through engineering: the data simply never leaves your device.
- Audio is captured and transcribed entirely on your iPhone, iPad, or Mac
- Apple’s Speech framework performs all recognition locally, directly on the device
- No audio or transcript ever touches a cloud server
- Transcripts are saved to Apple Notes via your private iCloud account
- You can delete everything instantly—and it’s actually deleted
- 8-hour continuous recording covers even extended therapy sessions
With on-device processing, the informed consent conversation becomes simple: “I’m recording this session on my device. The recording and transcript stay on my device. No third party will ever have access.”
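For readers curious how this guarantee can be enforced in software rather than in a privacy policy, Apple’s Speech framework exposes an explicit on-device mode. The sketch below is illustrative only — it is not Basil AI’s actual implementation, and the function name and locale are assumptions — but the `requiresOnDeviceRecognition` flag it sets is the real API that prevents audio from ever reaching Apple’s servers:

```swift
import Speech

// Minimal sketch (not the app's actual code): transcribe an audio file
// using Apple's Speech framework, restricted to on-device recognition.
func transcribeLocally(audioURL: URL) {
    // Not every device/locale pair supports on-device recognition;
    // check before making any promises to the user.
    guard let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en_US")),
          recognizer.supportsOnDeviceRecognition else {
        print("On-device recognition is unavailable on this device")
        return
    }

    let request = SFSpeechURLRecognitionRequest(url: audioURL)
    // The key line: with this flag set, the framework refuses to use
    // Apple's servers — audio and transcript never leave the device.
    request.requiresOnDeviceRecognition = true

    recognizer.recognitionTask(with: request) { result, error in
        if let result = result, result.isFinal {
            print(result.bestTranscription.formattedString)
        } else if let error = error {
            print("Recognition failed: \(error.localizedDescription)")
        }
    }
}
```

This is the engineering meaning of “privacy by architecture”: the no-cloud guarantee is a property of the code path itself, not a clause in a vendor contract.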
Real-World Scenarios Where On-Device Matters
Scenario 1: The Couples Therapy Session
During a couples therapy session, one partner discloses an affair. This information, if leaked from a cloud server, could end up in divorce proceedings, custody battles, or workplace harassment claims. With on-device transcription, the therapist has complete control over this sensitive disclosure.
Scenario 2: The Adolescent Patient
A 16-year-old patient discusses their gender identity with their therapist. Given the patchwork of state laws governing minors’ healthcare privacy, storing this information on a cloud server creates legal and safety risks that are difficult to quantify. On-device processing ensures the minor’s disclosures remain protected.
Scenario 3: The Court-Ordered Evaluation
A therapist conducting a forensic evaluation records sessions for accuracy. If those recordings are stored on a cloud server, opposing counsel could potentially subpoena the cloud provider directly, bypassing the therapist’s ability to assert privilege. On-device storage keeps the therapist in control of the privilege assertion.
These scenarios illustrate why our article on compliance in regulated industries applies equally to mental health professionals—and perhaps even more urgently.
Practical Workflow: Using Basil AI for Therapy Notes
Here’s how a therapist can use on-device AI transcription ethically and effectively:
- Before the session: Obtain written consent for recording, specifying that all processing happens on-device with no cloud transmission
- During the session: Launch Basil AI and say “Hey Basil” to begin recording. The app runs quietly in the background
- After the session: Review the transcript and AI-generated summary on your device. Edit as needed
- Documentation: Export key notes to Apple Notes for integration with your clinical documentation workflow
- Deletion: Delete the recording immediately after extracting your notes, or retain it under your control on your encrypted device
This workflow saves therapists an estimated 15–30 minutes per session on documentation—time that can be redirected to patient care—without any compromise on confidentiality.
What to Look for in Any AI Tool for Therapy
If you’re a mental health professional evaluating AI transcription tools, ask these questions:
- Where does processing happen? If the answer is “our secure cloud,” that’s a red flag. Secure servers still get breached.
- Will you sign a BAA? If not, the tool is not HIPAA-compliant. If yes, understand that a BAA allocates legal responsibility but does not eliminate the risk of a breach.
- What’s your data retention policy? “We delete after 30 days” means your patient’s deepest secrets sit on a server for a month.
- Is my data used for model training? If yes, your patient’s disclosures become training data for AI.
- Can I truly delete everything? Cloud deletion is often soft deletion. On-device deletion is real deletion.
The Future of AI in Mental Health Documentation
AI-assisted clinical documentation is inevitable. The administrative burden on therapists is unsustainable, contributing to burnout rates that threaten the profession. But the path forward must be privacy by design, not privacy by promise.
Apple’s investment in on-device AI through Apple Intelligence signals that the industry is moving toward local processing. This is exactly the direction mental health technology needs to go: powerful AI capabilities that never compromise patient confidentiality.
Therapists don’t need to choose between efficiency and ethics. On-device AI transcription delivers both.