{"id":55395,"date":"2026-03-10T11:33:22","date_gmt":"2026-03-10T15:33:22","guid":{"rendered":"https:\/\/www.kaspersky.com\/blog\/?p=55395"},"modified":"2026-03-10T11:36:08","modified_gmt":"2026-03-10T15:36:08","slug":"mental-health-apps-issues-2026","status":"publish","type":"post","link":"https:\/\/www.kaspersky.com\/blog\/mental-health-apps-issues-2026\/55395\/","title":{"rendered":"Brain drain: vulnerabilities in mental health apps"},"content":{"rendered":"<p>In February 2026, the cybersecurity firm Oversecured published a report that makes you want to factory reset your phone and move into a remote cabin in the woods. Researchers <a href=\"https:\/\/www.bleepingcomputer.com\/news\/security\/android-mental-health-apps-with-147m-installs-filled-with-security-flaws\/\" target=\"_blank\" rel=\"noopener nofollow\">audited<\/a> 10 popular Android mental health apps \u2014 ranging from mood trackers and AI therapists to tools for managing depression and anxiety \u2014 and uncovered\u2026 1575 vulnerabilities! Fifty-four of those flaws were classified as critical. Given the download stats on Google Play, as many as 15 million people could be affected. The real kicker? Six out of the ten apps tested explicitly promised users that their data was \u201cfully encrypted and securely protected\u201d.<\/p>\n<p>We\u2019re breaking down this scandalous \u201cbrain drain\u201d: what exactly could leak, how it\u2019s happening, and why \u201canonymity\u201d in these services is usually just a marketing myth.<\/p>\n<h2>What was found in the apps<\/h2>\n<p>Oversecured is a mobile app security firm that uses a specialized scanner to analyze APK files for known vulnerability patterns across dozens of categories. 
In January 2026, researchers ran ten mental health monitoring apps from Google Play through the scanner \u2014 and the results were, shall we say, \u201cspectacular\u201d.<\/p>\n<table width=\"605\">\n<tbody>\n<tr>\n<td rowspan=\"2\" width=\"229\">App Type<\/td>\n<td rowspan=\"2\" width=\"75\">Installs<\/td>\n<td colspan=\"4\" width=\"301\">Security vulnerabilities<\/td>\n<\/tr>\n<tr>\n<td width=\"75\">High-severity<\/td>\n<td width=\"75\">Medium-severity<\/td>\n<td width=\"75\">Low-severity<\/td>\n<td width=\"75\">Total<\/td>\n<\/tr>\n<tr>\n<td width=\"229\">Mood &amp; habit tracker<\/td>\n<td width=\"75\">10M+<\/td>\n<td width=\"75\">1<\/td>\n<td width=\"75\">147<\/td>\n<td width=\"75\">189<\/td>\n<td width=\"75\">337<\/td>\n<\/tr>\n<tr>\n<td width=\"229\">AI therapy chatbot<\/td>\n<td width=\"75\">1M+<\/td>\n<td width=\"75\">23<\/td>\n<td width=\"75\">63<\/td>\n<td width=\"75\">169<\/td>\n<td width=\"75\">255<\/td>\n<\/tr>\n<tr>\n<td width=\"229\">AI emotional health platform<\/td>\n<td width=\"75\">1M+<\/td>\n<td width=\"75\">13<\/td>\n<td width=\"75\">124<\/td>\n<td width=\"75\">78<\/td>\n<td width=\"75\">215<\/td>\n<\/tr>\n<tr>\n<td width=\"229\">Health &amp; symptom tracker<\/td>\n<td width=\"75\">500k+<\/td>\n<td width=\"75\">7<\/td>\n<td width=\"75\">31<\/td>\n<td width=\"75\">173<\/td>\n<td width=\"75\">211<\/td>\n<\/tr>\n<tr>\n<td width=\"229\">Depression management tool<\/td>\n<td width=\"75\">100k+<\/td>\n<td width=\"75\">0<\/td>\n<td width=\"75\">66<\/td>\n<td width=\"75\">91<\/td>\n<td width=\"75\">157<\/td>\n<\/tr>\n<tr>\n<td width=\"229\">CBT-based anxiety app<\/td>\n<td width=\"75\">500k+<\/td>\n<td width=\"75\">3<\/td>\n<td width=\"75\">45<\/td>\n<td width=\"75\">62<\/td>\n<td width=\"75\">110<\/td>\n<\/tr>\n<tr>\n<td width=\"229\">Online therapy &amp; support community<\/td>\n<td width=\"75\">1M+<\/td>\n<td width=\"75\">7<\/td>\n<td width=\"75\">20<\/td>\n<td width=\"75\">71<\/td>\n<td width=\"75\">98<\/td>\n<\/tr>\n<tr>\n<td 
width=\"229\">Anxiety &amp; phobia self-help<\/td>\n<td width=\"75\">50k+<\/td>\n<td width=\"75\">0<\/td>\n<td width=\"75\">15<\/td>\n<td width=\"75\">54<\/td>\n<td width=\"75\">69<\/td>\n<\/tr>\n<tr>\n<td width=\"229\">Military stress management<\/td>\n<td width=\"75\">50k+<\/td>\n<td width=\"75\">0<\/td>\n<td width=\"75\">12<\/td>\n<td width=\"75\">50<\/td>\n<td width=\"75\">62<\/td>\n<\/tr>\n<tr>\n<td width=\"229\">AI CBT chatbot<\/td>\n<td width=\"75\">500k+<\/td>\n<td width=\"75\">0<\/td>\n<td width=\"75\">15<\/td>\n<td width=\"75\">46<\/td>\n<td width=\"75\">61<\/td>\n<\/tr>\n<tr>\n<td width=\"229\"><strong>Total<\/strong><\/td>\n<td width=\"75\"><strong>14.7M+<\/strong><\/td>\n<td width=\"75\"><strong>54<\/strong><\/td>\n<td width=\"75\"><strong>538<\/strong><\/td>\n<td width=\"75\"><strong>983<\/strong><\/td>\n<td width=\"75\"><strong>1575<\/strong><\/td>\n<\/tr>\n<\/tbody>\n<\/table>\n<div class=\"wp-caption aligncenter\">\n<p class=\"wp-caption-text\">Vulnerabilities found in the 10 tested mental health apps. <a href=\"https:\/\/www.bleepingcomputer.com\/news\/security\/android-mental-health-apps-with-147m-installs-filled-with-security-flaws\/\" target=\"_blank\" rel=\"noopener nofollow\">Source<\/a><\/p>\n<\/div>\n<h2>The anatomy of the flaws<\/h2>\n<p>The discovered vulnerabilities are diverse, but they all boil down to one thing: giving attackers access to data that should be under lock and key.<\/p>\n<p>For starters, one of the vulnerabilities allows an attacker to access any internal activity (screen) of the app \u2014 even those never intended for external eyes. This opens the door to hijacking authentication tokens and user session data. Once an attacker has those, they could essentially gain access to a user\u2019s therapy records.<\/p>\n<p>Another issue is insecure local data storage with read permissions granted to any other app on the device. 
In other words, that random flashlight app or calculator on your smartphone could potentially read your cognitive behavioral therapy (CBT) logs, personal notes, and mood assessments.<\/p>\n<p>The researchers also found unencrypted configuration data baked right into the APK installation files. This included backend API endpoints and hardcoded URLs for Firebase databases.<\/p>\n<p>Furthermore, several apps were caught using the cryptographically weak <em>java.util.Random<\/em> class to generate session tokens and encryption keys.<\/p>\n<p>Finally, most of the tested apps lacked root\/jailbreak detection. On a rooted device, any third-party app with root privileges could gain total access to every bit of locally stored medical data.<\/p>\n<p>Shockingly, of the 10 apps analyzed, only four received updates in February 2026. The rest haven\u2019t seen a patch since November 2025, and one hasn\u2019t been touched since September 2024. Going 18 months without a security patch is a lifetime in this industry \u2014 especially for an app housing mood journals, therapy transcripts, and medication schedules.<\/p>\n<p>Here\u2019s a quick reminder of just how dangerous the misuse of this type of data gets. In 2024, the tech world was rocked by a <a href=\"https:\/\/www.kaspersky.com\/blog\/cve-2024-3094-vulnerability-backdoor\/50873\/\" target=\"_blank\" rel=\"noopener nofollow\">sophisticated attack on XZ Utils<\/a>, a critical component found in virtually every operating system based on the Linux kernel. The attacker successfully pressured the maintainer into handing over code commit permissions by exploiting the developer\u2019s public admission of burnout and a lack of motivation to carry on with the project. Had the attack been completed, the damage would have been mind-boggling given that roughly 80% of the world\u2019s servers run on Linux.<\/p>\n<h2>What could leak?<\/h2>\n<p>What do these apps collect and store? 
It\u2019s the kind of stuff you\u2019d likely only share with a trusted clinician: therapy session transcripts, mood logs, medication schedules, self-harm indicators, CBT notes, and various clinical assessment scales.<\/p>\n<p>As far back as 2021, complete medical records were <a href=\"https:\/\/capsuletech.com\/blog\/stolen-patient-records-a-hot-commodity-on-the-dark-web\" target=\"_blank\" rel=\"noopener nofollow\">selling on the dark web for US$1000 each<\/a>. For comparison, a stolen credit card number goes for anywhere between US$5 and US$30. Medical records contain a full identity package: name, address, insurance details, and diagnostic history. Unlike a credit card, you can\u2019t exactly \u201creissue\u201d your medical history. Furthermore, medical fraud is notoriously difficult to spot. While a bank might flag a suspicious transaction in hours, a fraudulent insurance claim for a phantom treatment can go unnoticed for years.<\/p>\n<h2>We\u2019ve seen this movie before<\/h2>\n<p>The Oversecured study isn\u2019t just an isolated horror story.<\/p>\n<p>Back in 2020, Julius Kivim\u00e4ki <a href=\"https:\/\/www.bbc.com\/news\/articles\/c97znd00q7mo\" target=\"_blank\" rel=\"noopener nofollow\">hacked the database of the Finnish psychotherapy clinic Vastaamo<\/a>, making off with the records of 33\u00a0000 patients. When the clinic refused to cough up a \u20ac400\u00a0000 ransom, Kivim\u00e4ki began sending direct threats to patients: \u201cPay \u20ac200 in Bitcoin within 24 hours, or else your records go public\u201d. Ultimately, he leaked the entire database onto the dark web anyway. At least two people died by suicide, and the clinic was forced into bankruptcy. Kivim\u00e4ki was eventually sentenced to six years and three months in prison, marking a record-breaking trial in Finland for the sheer number of victims involved.<\/p>\n<p>In 2023, the U.S. 
Federal Trade Commission (FTC) slapped the <a href=\"https:\/\/www.securityweek.com\/betterhelp-customers-begin-receiving-refund-notices-from-7-8m-data-privacy-settlement-ftc-says\/\" target=\"_blank\" rel=\"noopener nofollow\">online therapy giant BetterHelp<\/a> with a US$7.8 million fine. Despite stating on their sign-up page that your data was strictly confidential, the company was caught funneling user info \u2014 including mental health questionnaire responses, emails, and IP addresses \u2014 to Facebook, Snapchat, Criteo, and Pinterest for targeted advertising. After the dust settled, 800\u00a0000 affected users received a grand total of\u2026 US$10 each in compensation.<\/p>\n<p>By 2024, the FTC set its sights on the <a href=\"https:\/\/thehackernews.com\/2024\/04\/ftc-fines-mental-health-startup.html\" target=\"_blank\" rel=\"noopener nofollow\">telehealth firm Cerebral<\/a>, tagging them with a US$7 million fine. Through <a href=\"https:\/\/www.kaspersky.com\/blog\/web-beacons-explained-and-how-to-stop-them\/47281\/\" target=\"_blank\" rel=\"noopener nofollow\">tracking pixels<\/a>, Cerebral leaked the data of 3.2 million users to LinkedIn, Snapchat, and TikTok. The haul included names, medical histories, prescriptions, appointment dates, and insurance info. And the cherry on top? The company sent promotional postcards (sans envelopes) to 6000 patients, which effectively broadcast that the recipients were undergoing psychiatric treatment.<\/p>\n<p>In September 2024, security researcher Jeremiah Fowler <a href=\"https:\/\/www.wired.com\/story\/confidant-health-therapy-records-database-exposure\/\" target=\"_blank\" rel=\"noopener nofollow\">stumbled upon an exposed database belonging to Confidant Health<\/a>, a provider specializing in addiction recovery and mental health services. The database contained audio and video recordings of therapy sessions, transcripts, psychiatric notes, drug test results, and even copies of driver\u2019s licenses. 
In total, 5.3 terabytes of data, 126\u00a0000 files, or 1.7 million records were sitting there without a password.<\/p>\n<h2>Why anonymity is an illusion<\/h2>\n<p>Developers love to drop the line: \u201cWe never share your personal data with anyone.\u201d Technically, that might be true \u2014 instead, they share \u201canonymized profiles\u201d. The catch? De-anonymizing that data isn\u2019t exactly rocket science anymore. Recent <a href=\"https:\/\/arxiv.org\/abs\/2602.16800\" target=\"_blank\" rel=\"noopener nofollow\">research<\/a> highlights that using LLMs to strip away anonymity has become a routine reality.<\/p>\n<p>Even the \u201canonymization\u201d process itself is often a mess. A <a href=\"https:\/\/techpolicy.sanford.duke.edu\/data-brokers-and-the-sale-of-americans-mental-health-data\/\" target=\"_blank\" rel=\"noopener nofollow\">study by Duke University<\/a> revealed that data brokers are openly hawking the mental health data of Americans. Out of 37 brokers surveyed, 11 agreed to sell data linked to specific diagnoses (like depression, anxiety, and bipolar disorder), demographic parameters, and in some cases, even names and home addresses. Prices started as low as US$275 for 5000 aggregated records.<\/p>\n<p>According to the <a href=\"https:\/\/www.mozillafoundation.org\/en\/blog\/shady-mental-health-apps-inch-toward-privacy-and-security-improvements-but-many-still-siphon-personal-data\/\" target=\"_blank\" rel=\"noopener nofollow\">Mozilla Foundation<\/a>, by 2023, 59% of popular mental health apps failed to meet even the most basic privacy standards, and 40% had actually become less secure than the previous year. These apps allowed account creation via third-party services (like Google, Apple, and Facebook), featured suspiciously brief privacy policies that glossed over data collection details, and employed a clever little loophole: some privacy policies applied strictly to the company\u2019s website, but not the app itself. 
In short, your clicks on the site were \u201cprotected\u201d, but your actions within the app were fair game.<\/p>\n<h2>How to protect yourself<\/h2>\n<p>Cutting these apps out of your life entirely is, of course, the most foolproof option \u2014 but it\u2019s not the most realistic one. Besides, there\u2019s no guarantee you can actually nuke the data already collected \u2014 even if you delete your account. We previously covered the <a href=\"https:\/\/www.kaspersky.com\/blog\/how-to-remove-yourself-from-data-brokers-people-search-sites\/54209\/\" target=\"_blank\" rel=\"noopener nofollow\">grueling process of scrubbing your info from data broker databases<\/a>; it\u2019s possible, but prepare for a headache. So, how can you stay safe?<\/p>\n<ul>\n<li><strong>Check permissions before you hit \u201cInstall\u201d.<\/strong> In Google Play, navigate to <em>App description \u2192 About this app \u2192 Permissions<\/em>. A mood tracker has no business asking for access to your camera, microphone, contacts, or precise GPS location. If it does, it\u2019s not looking out for your well-being \u2014 it\u2019s harvesting data.<\/li>\n<li><strong>Actually read the privacy policy. <\/strong>We get it \u2014 nobody reads these multi-page manifestos. But when a service is vacuuming up your most intimate thoughts, it\u2019s worth a skim. Look for the red flags: does the company share data with third parties? Can you manually delete your records? Does the policy explicitly cover the app itself, or just the website? You can always feed the policy text into an AI and ask it to flag any privacy deal-breakers.<\/li>\n<li><strong>Check the last updated date.<\/strong> An app that hasn\u2019t seen an update in over six months is likely a playground for unpatched vulnerabilities. 
Remember: six out of the 10 apps Oversecured tested hadn\u2019t been touched in months.<\/li>\n<li><strong>Disable everything non-essential in your phone\u2019s privacy settings.<\/strong> Whenever prompted, always select \u201cask not to track\u201d. When an app pleads with you to enable a specific type of tracking \u2014 claiming it\u2019s for \u201cinternal optimization\u201d \u2014 it\u2019s almost always a marketing ploy rather than a functional necessity. After all, if the app truly won\u2019t work without a certain permission, you can always go back and toggle it on later.<\/li>\n<li><strong>Don\u2019t use \u201cSign in with\u2026\u201d services.<\/strong> Authenticating via Facebook, Apple, Google, or Microsoft creates additional identifiers and gives companies a golden opportunity to link your data across different platforms.<\/li>\n<li><strong>Treat everything you type like a public social media post.<\/strong> If you wouldn\u2019t want a random stranger on the internet reading it, you probably shouldn\u2019t be typing it into an app with over 150 vulnerabilities that hasn\u2019t seen a patch since the year before last.<\/li>\n<\/ul>\n<blockquote><p>What else you should know about privacy settings and controlling your personal data online:<\/p>\n<ul>\n<li><a href=\"https:\/\/www.kaspersky.com\/blog\/geolocation-data-broker-leak\/53050\/\" target=\"_blank\" rel=\"noopener nofollow\">Geolocation data brokers: what they do and what happens when they leak<\/a><\/li>\n<li><a href=\"https:\/\/www.kaspersky.com\/blog\/how-to-remove-yourself-from-data-brokers-people-search-sites\/54209\/\" target=\"_blank\" rel=\"noopener nofollow\">Why data brokers build dossiers on you, and how to stop them doing so<\/a><\/li>\n<li><a href=\"https:\/\/www.kaspersky.com\/blog\/deleting-digital-footprints\/54591\/\" target=\"_blank\" rel=\"noopener nofollow\">How to disappear from the internet<\/a><\/li>\n<li><a 
href=\"https:\/\/www.kaspersky.com\/blog\/minimizing-digital-footprints-2025\/53762\/\" target=\"_blank\" rel=\"noopener nofollow\">How to shrink your digital footprint<\/a><\/li>\n<li><a href=\"https:\/\/www.kaspersky.com\/blog\/disable-mobile-app-ad-tracking\/53096\/\" target=\"_blank\" rel=\"noopener nofollow\">How smartphones build a dossier on you<\/a><\/li>\n<\/ul>\n<\/blockquote>\n<input type=\"hidden\" class=\"category_for_banner\" value=\"premium-generic\">\n","protected":false},"excerpt":{"rendered":"<p>We&#8217;re diving into why mental health apps have become a headache for their users, and how to minimize the risks of medical data leaks.<\/p>\n","protected":false},"author":2775,"featured_media":55396,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"_acf_changed":false,"footnotes":""},"categories":[1788],"tags":[105,109,920,961,1429,43,4261,268],"class_list":{"0":"post-55395","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-privacy","8":"tag-android","9":"tag-apps","10":"tag-health","11":"tag-leaks","12":"tag-medicine","13":"tag-privacy","14":"tag-telehealth","15":"tag-vulnerabilities"},"hreflang":[{"hreflang":"x-default","url":"https:\/\/www.kaspersky.com\/blog\/mental-health-apps-issues-2026\/55395\/"},{"hreflang":"en-in","url":"https:\/\/www.kaspersky.co.in\/blog\/mental-health-apps-issues-2026\/30257\/"},{"hreflang":"en-ae","url":"https:\/\/me-en.kaspersky.com\/blog\/mental-health-apps-issues-2026\/25335\/"},{"hreflang":"ar","url":"https:\/\/me.kaspersky.com\/blog\/mental-health-apps-issues-2026\/13272\/"},{"hreflang":"en-gb","url":"https:\/\/www.kaspersky.co.uk\/blog\/mental-health-apps-issues-2026\/30129\/"},{"hreflang":"es-mx","url":"https:\/\/latam.kaspersky.com\/blog\/mental-health-apps-issues-2026\/29052\/"},{"hreflang":"es","url":"https:\/\/www.kaspersky.es\/blog\/mental-health-apps-issues-2026\/31933\/"},{"hreflang":"it","url"
:"https:\/\/www.kaspersky.it\/blog\/mental-health-apps-issues-2026\/30538\/"},{"hreflang":"ru","url":"https:\/\/www.kaspersky.ru\/blog\/mental-health-apps-issues-2026\/41416\/"},{"hreflang":"tr","url":"https:\/\/www.kaspersky.com.tr\/blog\/mental-health-apps-issues-2026\/14381\/"},{"hreflang":"fr","url":"https:\/\/www.kaspersky.fr\/blog\/mental-health-apps-issues-2026\/23733\/"},{"hreflang":"pt-br","url":"https:\/\/www.kaspersky.com.br\/blog\/mental-health-apps-issues-2026\/24823\/"},{"hreflang":"de","url":"https:\/\/www.kaspersky.de\/blog\/mental-health-apps-issues-2026\/33287\/"},{"hreflang":"ru-kz","url":"https:\/\/blog.kaspersky.kz\/mental-health-apps-issues-2026\/30383\/"},{"hreflang":"en-au","url":"https:\/\/www.kaspersky.com.au\/blog\/mental-health-apps-issues-2026\/36013\/"},{"hreflang":"en-za","url":"https:\/\/www.kaspersky.co.za\/blog\/mental-health-apps-issues-2026\/35671\/"}],"acf":[],"banners":"","maintag":{"url":"https:\/\/www.kaspersky.com\/blog\/tag\/privacy\/","name":"privacy"},"_links":{"self":[{"href":"https:\/\/www.kaspersky.com\/blog\/wp-json\/wp\/v2\/posts\/55395","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.kaspersky.com\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.kaspersky.com\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.kaspersky.com\/blog\/wp-json\/wp\/v2\/users\/2775"}],"replies":[{"embeddable":true,"href":"https:\/\/www.kaspersky.com\/blog\/wp-json\/wp\/v2\/comments?post=55395"}],"version-history":[{"count":3,"href":"https:\/\/www.kaspersky.com\/blog\/wp-json\/wp\/v2\/posts\/55395\/revisions"}],"predecessor-version":[{"id":55398,"href":"https:\/\/www.kaspersky.com\/blog\/wp-json\/wp\/v2\/posts\/55395\/revisions\/55398"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.kaspersky.com\/blog\/wp-json\/wp\/v2\/media\/55396"}],"wp:attachment":[{"href":"https:\/\/www.kaspersky.com\/blog\/wp-json\/wp\/v2\/media?parent=55395"}],"wp:term":[{"tax
onomy":"category","embeddable":true,"href":"https:\/\/www.kaspersky.com\/blog\/wp-json\/wp\/v2\/categories?post=55395"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.kaspersky.com\/blog\/wp-json\/wp\/v2\/tags?post=55395"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}