Is Your Therapy App Actually Private? What BetterHelp, Talkspace, and AI Mental Health Apps Do With Your Data
A guide to understanding why therapy apps and AI mental health chatbots operate outside HIPAA, what BetterHelp, Talkspace, Woebot, and similar apps collect and share, how to audit your settings, and what to look for if you want stronger legal protection for your mental health data.
Why therapy apps sit outside the law that most people assume protects them
Most people assume that disclosing depression, anxiety, or trauma history inside a health-related app carries the same legal protections as disclosing it to a doctor. It does not. Understanding where the legal line actually falls determines what protections you have and who can enforce them.
- HIPAA applies to covered entities: licensed healthcare providers, hospitals, health insurers, and their direct business associates. The legal test is whether an organization provides healthcare services and transmits health information electronically in connection with standard transactions such as insurance billing
- A licensed therapist in private practice who bills your insurance: HIPAA-covered. A mental health app you download from the App Store: not covered, regardless of how health-focused its branding is or how sensitive the information it collects
- The phrase "HIPAA compliant" used in app marketing almost always refers to the infrastructure the company uses (encrypted servers, secure data storage), not the company's legal obligations. A company can store data on HIPAA-grade infrastructure and still share your mental health disclosures with advertisers. These are entirely separate things
- If an app connects you with a licensed therapist who bills your insurance, that specific therapist-patient relationship involves a covered entity. The app platform itself may still not be, and which part of the service your data flows through determines what protections apply
- In March 2023, the FTC announced a $7.8 million settlement with BetterHelp, the largest online therapy platform in the US. The FTC found that BetterHelp had collected sensitive mental health information from users, including information indicating depression, anxiety, and prior treatment history, and shared it with Facebook and Snapchat for advertising targeting
- BetterHelp had explicitly told users that their health data would never be shared for advertising. The settlement included a permanent ban on sharing health data for advertising and required BetterHelp to notify users and provide refunds to affected members
- The case illustrated the structural problem: consumer apps with no HIPAA obligation rely entirely on their own privacy policies rather than legal requirements. When a policy is violated, the FTC is the enforcement mechanism, not HIPAA. Regulatory action takes years, and enforcement comes after harm has already occurred
- BetterHelp has updated its privacy practices since the settlement. The underlying issue remains: any consumer app outside HIPAA can make and revise privacy promises, and users have no legal remedy equivalent to HIPAA's patient rights
What these apps actually collect and where it goes
The data that mental health apps collect goes beyond what users typically expect. Intake questionnaires, usage patterns, and session content each create separate data streams, and AI chatbots operate under their own additional set of data practices.
- Intake questionnaires on most platforms ask about current symptoms, diagnoses, medications, trauma history, and prior treatment before you are matched with a therapist. This is among the most sensitive personal data that exists, and it is collected before you have agreed to work with a specific provider
- On text-based platforms (BetterHelp, Talkspace), your therapy sessions are stored as text transcripts on company servers. Depending on each platform's data retention policy, this content may persist after you cancel your account
- Usage metadata is collected separately from session content: how often you open the app, session frequency, which features you use, how long sessions last, whether you use mood tracking or journaling tools. This behavioral data can indicate patterns about your mental health independent of what you actually wrote
- Before the FTC settlement, BetterHelp's app embedded Facebook's tracking SDK, the in-app counterpart of the web Pixel, which automatically sent device identifiers and app usage behavior to Facebook when users engaged with the app. This is standard advertising industry practice that most users are unaware of; a hypothetical sketch of such a payload follows this list
- AI therapy chatbots (Woebot, Wysa, Youper, and others) are not therapy, are not conducted by licensed therapists, and are not HIPAA-covered under any circumstances. They are software products that simulate therapeutic conversation using natural language processing
- Everything you type into an AI mental health chatbot is stored on that company's servers. Depending on the platform, this data is used for product improvement, published research, and training the underlying AI models. Woebot Health has published peer-reviewed research using aggregated user conversation data
- Woebot's privacy policy states that conversation data may be used for research and product development with opt-in consent for identifiable data. Users can request deletion. There is no option to use the app in a mode where no conversation data reaches Woebot's servers
- Wysa operates under comparable policies. Crisis Text Line, a text-based crisis support service, drew significant attention in 2022 when it was revealed that anonymized conversation data had been shared with its for-profit spinoff Loris.ai to train customer service conversation models. The practice was subsequently ended, but it illustrated that even crisis and peer support services are not immune to this pattern
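To make the metadata point concrete, below is a sketch of the kind of event an embedded advertising SDK typically reports home. Every field name here is illustrative rather than taken from any specific SDK's documentation, but the overall shape, a persistent device identifier attached to behavioral events, is the industry pattern.

```python
# Hypothetical analytics/advertising event, sketched in Python. Field names
# are illustrative, not from any real SDK; the pattern is what matters.
import json

example_event = {
    "advertising_id": "38400000-8cf0-11bd-b23e-10b96e40000d",  # persistent device ad identifier
    "app_id": "com.example.mentalhealthapp",                   # names the app category by itself
    "event_name": "session_started",                           # behavior, not content
    "event_time": "2023-03-01T22:14:09Z",
    "feature_used": "mood_tracking",
    "sessions_last_7_days": 11,
}

print(json.dumps(example_event, indent=2))
```

Note what is absent: none of your written words. The device identifier plus the event stream is enough for an ad network to infer that this device belongs to someone using a mental health app, which is exactly the kind of inference at issue in the FTC complaint.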
Audit and change your settings in the app you use now
Regardless of whether you plan to switch apps, the steps below reduce what your current platform holds and shares. Start with your privacy settings, then request a data export to see exactly what the company has stored under your account.
- BetterHelp privacy settings: go to Account Settings, then Privacy. Review which categories of data sharing are enabled and turn off any related to marketing or advertising. Post-FTC, BetterHelp added explicit controls for advertising data use
- BetterHelp data export: go to Account Settings, then Privacy, then Request My Data. You will receive an export of what BetterHelp holds under your account. Review it; the contents may be more extensive than you expect, including intake questionnaire responses and session transcripts. A short script for taking inventory of an export follows this list
- Delete your account: go to Account Settings, then Close Account. BetterHelp's post-FTC policy states that personal data is deleted following account closure, subject to any legal retention requirements. California residents can invoke CCPA rights for deletion; EU users can invoke GDPR Article 17
- Note: deleting your account does not necessarily mean all session records are removed immediately. Certain records may be retained for legal or compliance purposes. The data export request, submitted before deletion, is the best way to document what was held
- Talkspace: go to Settings, then Account, then Privacy Settings to review data sharing controls. Talkspace operates as a technology platform connecting users with licensed therapists. The therapist relationship may carry HIPAA obligations, but Talkspace's own platform data practices are governed by its privacy policy, not by HIPAA
- AI chatbots (Woebot, Wysa, Youper): go to each app's Settings, then Privacy or Data. Most offer a deletion request via in-app form or email to a privacy contact. Woebot: Settings, then Privacy, then Delete My Data. Wysa: contact privacy@wysa.io with a deletion request (an adaptable request template follows this list)
- Device permissions: mental health apps rarely need access to your contacts, location, or camera. On iPhone, go to Settings, then the app name, and review each permission. On Android, go to Settings, then Apps, then the app name, then Permissions. Revoke anything not required for the app to function; a scripted Android check follows this list
- Turn off any in-app analytics or feedback sharing that is not required for core functionality. These toggles are usually in a Privacy or About section within the app's settings
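Once a data export arrives, the fastest way to grasp its scope is a file inventory. Export formats vary by platform and are not publicly documented, so the sketch below assumes nothing about structure: it just walks the unzipped folder, printing each file's size and flagging formats likely to contain transcripts or questionnaire answers. The folder path is a placeholder; point it at wherever you unzipped the export.

```python
from pathlib import Path

EXPORT_DIR = Path("~/Downloads/my_data_export").expanduser()  # placeholder: your unzip location
TEXT_LIKE = {".json", ".csv", ".txt", ".html", ".xml"}        # formats likely to hold readable content

for path in sorted(EXPORT_DIR.rglob("*")):
    if path.is_file():
        flag = "  <- likely readable content" if path.suffix.lower() in TEXT_LIKE else ""
        print(f"{path.relative_to(EXPORT_DIR)}  ({path.stat().st_size:,} bytes){flag}")
```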
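For platforms that take deletion requests by email, a written request that names the statute and asks for written confirmation creates a paper trail. Below is a minimal template, assuming one of the CCPA or GDPR rights mentioned above applies to you; adapt the statute line to your jurisdiction, or cite the app's own privacy policy deletion promise if neither does.

```python
from datetime import date

# Adaptable deletion request. The statute placeholder assumes CCPA or GDPR
# applies; edit it, the email address, and the data categories to fit your case.
TEMPLATE = """\
Subject: Data deletion request for account {email}

To the privacy team,

I request deletion of all personal data associated with my account
({email}), including intake questionnaire responses, session or
conversation transcripts, and usage analytics.

I make this request under {statute}. Please confirm in writing which
records will be deleted, which records (if any) will be retained for
legal or compliance purposes, and the retention period that applies
to anything retained.

Date: {sent}
"""

print(TEMPLATE.format(
    email="you@example.com",                        # your account email
    statute="the California Consumer Privacy Act",  # or "Article 17 of the GDPR"
    sent=date.today().isoformat(),
))
```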
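On Android, granted permissions can also be audited from a computer rather than by tapping through settings menus. The sketch below shells out to adb, whose `dumpsys package` command lists each runtime permission with a `granted=true` or `granted=false` marker. It requires adb installed and USB debugging enabled; the package name is a placeholder, and the exact output format can vary across Android versions.

```python
import subprocess

PACKAGE = "com.example.mentalhealthapp"  # hypothetical; find yours with:
                                         #   adb shell pm list packages

# `dumpsys package <name>` prints runtime permissions with granted=true/false
# markers; the parsing below matches current output but may vary by OS version.
out = subprocess.run(
    ["adb", "shell", "dumpsys", "package", PACKAGE],
    capture_output=True, text=True, check=True,
).stdout

SENSITIVE = ("LOCATION", "CONTACTS", "CAMERA", "RECORD_AUDIO", "SMS", "CALL_LOG")
for line in out.splitlines():
    line = line.strip()
    if "granted=true" in line and any(tag in line for tag in SENSITIVE):
        print("Granted:", line.split(":", 1)[0])  # revoke in Settings > Apps > Permissions
```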
How to choose an app with stronger protections, or move to actual therapy
If what you are disclosing warrants strong privacy protection, consumer apps are the wrong tool. This section covers what to look for in an app's privacy policy if you intend to keep using one, and the alternatives that carry legal obligations rather than just policy promises.
- The four questions that matter: Does the company share your data with third-party advertisers? Does the company sell your data? Can you delete your account and have your data removed? How long is data retained after account deletion?
- Avoid apps whose privacy policy does not answer these questions directly. Vague language such as "we may share with trusted partners for business purposes" is a meaningful red flag, not standard boilerplate to ignore (a simple scan for such phrases is sketched after this list)
- Look for apps that explicitly state no data sharing with advertisers, no sale of personal data, a specific data retention period, and a clear deletion process. Apps subject to GDPR (typically European companies, or any company serving EU users) often have more precise deletion policies because EU law requires it
- Check the privacy policy date. A policy last updated in 2019 or 2020 has not been revised to reflect post-BetterHelp-FTC industry standards and is likely inadequate. Look for a policy revised in 2023 or later
- A licensed therapist in private practice or a group practice using a HIPAA-compliant platform (SimplePractice, Therapy Brands, TherapyNotes) is a covered entity under HIPAA. Your session notes, diagnoses, and treatment records are protected health information with full legal backing
- Under HIPAA, a licensed provider cannot share, sell, or disclose your records without your written authorization, with narrow exceptions for safety or legal requirements. A consumer app has no equivalent legal constraint
- If cost is a barrier: many employers offer Employee Assistance Programs (EAPs) that cover 6 to 8 free sessions per year with HIPAA-covered providers. Open Path Collective at openpathcollective.org connects clients with licensed therapists at reduced rates of $30 to $80 per session
- Telehealth therapy through a licensed practice is HIPAA-covered if the provider uses a platform with a Business Associate Agreement (BAA). Ask your provider which video platform they use and whether it has a BAA with their practice. This single question confirms whether the session is legally protected
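If you want to triage a privacy policy quickly before reading it in full, a keyword scan surfaces the clauses worth slowing down for. The sketch below searches a saved plain-text copy of a policy for common hedging phrases; the phrase list is an illustrative starting point rather than an authoritative test, and a match means "read this clause carefully", not "this app is unsafe".

```python
import re
from pathlib import Path

# Illustrative hedging phrases, not an authoritative test.
RED_FLAGS = [
    r"trusted partners",
    r"may (?:share|disclose|sell)",
    r"business purposes",
    r"advertising partners",
    r"improve our (?:services|products)",  # often covers model training
    r"as long as (?:necessary|needed)",    # no concrete retention period
]

policy = Path("privacy_policy.txt").read_text(encoding="utf-8")  # save the policy as plain text first

for pattern in RED_FLAGS:
    for match in re.finditer(pattern, policy, flags=re.IGNORECASE):
        start = max(0, match.start() - 60)
        snippet = " ".join(policy[start : match.end() + 60].split())
        print(f"[{pattern}] ...{snippet}...")
```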
The BetterHelp FTC case confirmed what privacy researchers had been documenting for years: mental health apps operate outside the legal framework most people assume protects their most sensitive disclosures. The audit steps above are worth completing now, regardless of whether you plan to switch. Requesting a data export from your current app tells you what the company actually holds under your account. And if the content of what you are disclosing requires strong privacy protection, the previous section outlines the path to options that carry legal obligations rather than policy promises. The practical dividing line is simple: a licensed therapist with a HIPAA-compliant practice is legally bound. A consumer app is not, and the only enforcement mechanism when a policy is violated is regulatory action that comes years after the fact.