Dramatic growth in mental-health apps has created a risky industry

by Lily White

Customers’ “emotional data” can be hacked, and no one is checking if the apps work


WHEN CAROLINA ESCUDERO was severely depressed, going to a therapist’s office became hard to face. So she joined BetterHelp, a popular therapy app. She paid $65 each week but spent most of her time waiting for her assigned counsellor to respond. She got two responses in a month. “It was like texting an acquaintance who has no idea how to deal with mental illness,” she says. BetterHelp says its service does not claim to operate around the clock, that all its therapists have advanced degrees and “thousands of hours of hands-on clinical work”, and that users can easily switch therapists if scheduling proves difficult.

Helping people to deal with mental problems has rarely been more urgent. The incidence of depression and anxiety has soared in the pandemic—by more than 25% globally in 2020, according to the Lancet, a medical journal. That, combined with more people using online services, has led to a boom in mental-health apps. The American Psychological Association reckons 10,000-20,000 are available for download. But evidence is mounting that privacy risks to users are being ignored. No one is checking if the apps work, either.

Mental-health-tech firms raised nearly $2bn in equity funding in 2020, according to CB Insights, a data firm. Their products tackle problems ranging from general stress to serious bipolar disorder. Telehealth apps like BetterHelp or Talkspace connect users to licensed therapists. Also common are subscription-based meditation apps like Headspace. In October Headspace merged with Ginger, a therapy app, in a deal valuing the combined company at $3bn. Now that big companies are prioritising employees’ mental health, some apps are working with them to help entire workforces. One such app, Lyra, supports 2.2m employee users globally and is valued at $4.6bn.

Underneath, though, trauma lurks in some corners of the industry. In October 2020 hackers who had breached Vastaamo, a popular Finnish psychotherapy startup, began blackmailing some of its users. Vastaamo required therapists to back up patient notes online but reportedly did not anonymise or encrypt them. Threatening to publish details of extramarital affairs and, in some cases, thoughts about paedophilia on the dark web, the hackers reportedly demanded bitcoin ransoms from some 30,000 patients. Vastaamo has filed for bankruptcy, but the episode has left many Finns wary of telling doctors personal details, says Joni Siikavirta, a lawyer representing the company’s patients.

Other cases may arise. No universal standards for storing “emotional data” exist. John Torous of Harvard Medical School, who has reviewed 650 mental-health apps, describes their privacy policies as abysmal. Some share information with advertisers. “When I first joined BetterHelp, I started to see targeted ads with words that I had used on the app to describe my personal experiences,” reports one user. BetterHelp says it shares with marketing partners only device identifiers associated with “generic event names”, only for measurement and optimisation, and only if users agree. No private information, such as dialogue with therapists, is shared, it says.

As for effectiveness, the apps’ methods are notoriously difficult to evaluate. Woebot, for instance, is a chatbot that uses artificial intelligence to reproduce the experience of cognitive behavioural therapy. The product is marketed as clinically validated, based in part on a scientific study which concluded that humans can form meaningful bonds with bots. But the study was written by people with financial links to Woebot. Of its ten peer-reviewed reports to date, says Woebot, eight involved a lead investigator with no financial ties to the company. Any co-authors with financial ties are disclosed, it says.

Mental-health apps were designed to be used in addition to clinical care, not in lieu of it. With that in mind, the European Commission is reviewing the field. It is getting ready to promote a new standard that will apply to all health apps. A letter-based scale will rank safety, user-friendliness and data security. Liz Ashall-Payne, founder of ORCHA, a British startup that has reviewed thousands of apps, including for the National Health Service, says that 68% did not meet the firm’s quality criteria. Time to head back to the couch?

This article appeared in the Business section of the print edition under the headline “Psyber boom”
