Mental health apps have become increasingly popular in recent years, especially due to the rise of telehealth during the coronavirus pandemic.
However, there is a problem: Data privacy is being compromised in the process.
“Data is extremely profitable in the digital space,” Darrell West, a senior fellow at the Brookings Institution, told Yahoo Finance. “That’s how companies are making their money. Many large companies derive a large portion of their revenue from advertising. People want to target their ads to specific people and specific problems. So the risk is that if you have a mental health condition related to depression, there will be companies trying to market medication to people who suffer that way.”
In 2023 the Federal Trade Commission ordered mental health platform BetterHelp, owned by Teladoc (TDOC), to pay $7.8 million to consumers for sharing their mental health data with Facebook (META) and Snapchat (SNAP) for advertising purposes after it had previously promised to keep the information private.
Cerebral, a telehealth startup, admitted last year that it disclosed sensitive patient information to companies such as Google (GOOG, GOOGL), Meta, TikTok, and other third-party advertisers. This information included patient names, dates of birth, insurance information, and patient responses to mental health self-assessments via the app.
And mental health app Talkspace’s privacy policy specifically states that the company can use inferences about you, gathered when you take its signature intake questionnaire, which asks about things like your gender identity, sexual orientation, and whether you suffer from depression, for marketing purposes, including customized ads.
Mental health app policies looked like a ‘money grab’
Overall, according to the Mozilla Foundation’s Privacy Not Included online buyer’s guide, only two of the 27 mental health apps available to users met Mozilla’s privacy and security standards in 2023: PTSD Coach, a free self-help app created by the US Department of Veterans Affairs, and Wysa, an app that offers an AI chatbot and chat sessions with live therapists.
Mozilla began evaluating these apps in 2022 after their use surged during the height of the coronavirus pandemic.
“We were concerned that the companies might not be prioritizing privacy where privacy seemed to be critical,” Privacy Not Included program director Jen Caltrider told Yahoo Finance.
The vast majority of apps fell short of Mozilla’s privacy and security standards in both 2022 and 2023 because of how they managed user data, tracked users, secured private information, or used artificial intelligence.
“It seemed like a money grab, taking advantage of vulnerable people in a bad situation, and it felt really sick,” Caltrider said.
Telehealth is a ballooning industry
A report by Grand View Research estimated that the global telehealth market was valued at approximately $101.2 billion in 2023 and is projected to grow by 24.3% on a compound annual basis from 2024 to 2030.
North America accounts for the largest share of the global market at 46.3%, although other regions are also rapidly adopting telehealth.
Mental health apps are also projected to grow significantly between 2024 and 2030, with Grand View Research estimating a compound annual growth rate of 15.2% after the global mental health app market reached $6.2 billion in 2023.
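Those compound annual growth figures can be sanity-checked with the standard CAGR formula. The sketch below applies each reported rate to its 2023 base for the seven years through 2030; treating the rate as running from the 2023 base is an approximation, since the reports quote growth "from 2024 to 2030."

```python
def project_cagr(base_value: float, annual_rate: float, years: int) -> float:
    """Project a value forward under compound annual growth:
    future = base * (1 + rate) ** years."""
    return base_value * (1 + annual_rate) ** years

# Grand View Research figures cited above: $6.2B mental health app
# market in 2023, growing at a 15.2% CAGR through 2030.
mental_health_2030 = project_cagr(6.2, 0.152, 2030 - 2023)
print(f"Implied 2030 mental health app market: ~${mental_health_2030:.1f}B")

# Telehealth overall: $101.2B in 2023, growing at a 24.3% CAGR.
telehealth_2030 = project_cagr(101.2, 0.243, 2030 - 2023)
print(f"Implied 2030 telehealth market: ~${telehealth_2030:.1f}B")
```

Under these assumptions, the mental health app market would reach roughly $16–17 billion and the overall telehealth market several hundred billion dollars by 2030.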
This also means there are many more opportunities for personal data to be disclosed. A December 2022 study of 578 mental health apps published in the Journal of the American Medical Association found that 44% shared the data they collected with third parties.
“I sit on both sides of the fence,” Diane O’Connell, an attorney and president of Sorting It Out Inc., told Yahoo Finance. “On one hand is the convenience: [mental health apps] have provided greater access to mental and physical health care. But on the other hand, private health information being hackable is also a concern.”
Someone who uses one of these mental health apps to seek help for depression or anxiety may start seeing ads for antidepressants, even if they’ve never expressed interest in taking medication.
Legal loopholes
Data brokers exploiting mental health data is nothing new. A February 2023 report from Duke University found that of 37 data brokers contacted by researchers about mental health data, 26 responded, and 11 firms were “ultimately willing and able to sell the requested mental health data.”
It is also completely legal.
President Clinton signed HIPAA – the Health Insurance Portability and Accountability Act – into law in 1996 as a way to “achieve a balance that permits important uses of information, while protecting the privacy of people who seek care and healing.” It is now considered America’s primary health care privacy law.
However, not all entities are bound by HIPAA, including many mental health apps. According to HIPAA Journal, the law applies to “the majority of healthcare workers, most health insurance providers, and employers who sponsor or co-sponsor employee health insurance plans.” Entities not required to comply with HIPAA include life insurers, most schools and school districts, many state agencies, most law enforcement agencies, and many municipal offices.
“HIPAA only applies to a conversation or information shared between a doctor and their patient,” Caltrider said. “A lot of these [mental health] apps, you’re not considered a patient in the same way. You’re a user of them. I think Talkspace is a good example of [how], once you become a client of Talkspace, they have a different privacy policy covering your interactions than before you became a client. They set it up so you have a relationship with an actual therapist rather than a coach once you’re a client.”
This often happens with talk therapy apps, Caltrider explained, adding that HIPAA doesn’t cover “the vast majority of what a lot of people are sharing with mental health apps.”
“People don’t understand that [HIPAA] only covers the communications with the health care provider, so these apps are not covered by that,” Caltrider said. Even if HIPAA applies to the conversation between you and a therapist, some of the metadata collected about your appointment times and the apps you use for video calls may not be covered by the law.
HIPAA protections also depend on the type of provider you are meeting with: a licensed therapist is considered a health care professional, but an emotional coach, professional coach, or volunteer is not.
‘The Napster Argument’
Another legal loophole that data brokers and mental health app providers can use is the Health Information Technology for Economic and Clinical Health (HITECH) Act of 2009, which expanded HIPAA guidelines to those considered “business associates of a covered entity,” according to the HIPAA Journal.
According to O’Connell, many private equity firms bought medical practices and hospital networks after Congress passed the Affordable Care Act. Because the investors who made these M&A deals were not in the medical industry, O’Connell explained, the HITECH Act did not apply to them: they were not considered business associates – a billing, health care, or health insurance company – under HIPAA.
“It’s created this confusion about how do you exchange data in these merger and acquisition deals when you’re not allowed to exchange personal health information with the company that’s actually trying to buy you?” said O’Connell.
That’s where terms and conditions come into play. O’Connell called it the “Napster argument,” referring to the former peer-to-peer file-sharing network that was permanently shut down in 2001 after numerous lawsuits over music copyright infringement.
“Napster wasn’t stealing music – it was just creating a platform for people to share it,” O’Connell said. “So you make these different arguments about how the regulations don’t apply, and then you create a fact pattern that fits your story until someone takes you to court, and then the judges make a decision.”
According to West, the main issue is that the US doesn’t have a national privacy law, which means that “there’s not a lot of regulation that governs behavior in this area, so there’s a wide range of companies out there. Some take privacy very seriously, and some don’t.”
“We’re not against mental health apps,” West said. “There are a lot of benefits. They bring medical services to a wider range of people because you don’t have to physically go to a doctor’s office.”
West added, “We want to make sure people are aware of the risks and have better protections built in. And people need to look at the privacy practices of the particular app they’re using to make sure the protections that the individual patient needs are there.”
—
Adriana Belmonte is a reporter and editor covering health care politics and policy for Yahoo Finance. You can follow her on Twitter @adrianambells and you can reach her at adriana@yahoofinance.com.