The mental health app market has grown rapidly in recent years. It was valued at $5.2 billion in 2022 and is expected to grow at an annual rate of at least 15% between 2023 and 2030.
Mental health apps can be beneficial: in many cases they improve treatment outcomes and raise awareness of mental health conditions. But, like any technology, they carry risks, and privacy experts are warning users to be careful with their data.
As demand for mental health services continues to rise, research from the Mozilla Foundation shows that many app developers are failing to adequately protect user privacy.
The research, called “Privacy Not Included”, found that 59% of the mental health apps reviewed failed to sufficiently safeguard their users’ privacy and now carry a warning label.
The team said: “Our main goal is better protection for consumers, so we were encouraged to see that some apps made changes that amount to better privacy for the public. And sometimes all that had to be done to make those positive changes was to ask the companies to do better.
But the worst offenders are still letting consumers down in scary ways, tracking and sharing their most intimate information and leaving them incredibly vulnerable. The handful of apps that handle data responsibly and respectfully prove that it can be done right.”
The researchers found that a third of the apps had improved since 2022. Among those that had improved over the year were PTSD Coach, Wysa, Woebot, and Youper.
However, they also found that approximately 40% had actually gotten worse over the past year. The worst-performing app for data security and privacy was Replika: My AI Friend, which has 10 million downloads on Google Play and millions more on the Apple App Store.
The researchers said Replika was particularly poor for a number of reasons, including weak password requirements; the recording of voice messages, text messages, photos, and videos; and the sharing of users’ personal data with third-party advertisers.