Mozilla finds mental health apps fail 'spectacularly' at user security, data policies

An investigation into mental health and prayer apps has revealed a disturbing lack of concern for user security and privacy.

On Monday, Mozilla released the findings of a new study into these types of apps, which often deal with sensitive topics including depression, mental health awareness, anxiety, domestic violence, PTSD, and more, alongside religion-themed services.

According to Mozilla’s latest *Privacy Not Included guide, despite the deeply personal information these apps manage, they “routinely share data, allow weak passwords, target vulnerable users with personalized ads, and feature vague and poorly written privacy policies.”

In a study of 32 applications geared toward mental health and religion, the organization found that 25 of them did not meet Mozilla’s Minimum Security Standards.

These standards act as a benchmark for the *Privacy Not Included reports. Mismanagement or unauthorized sharing and sale of user data, vague data management policies, a lack of encryption, weak password requirements, no clear vulnerability management process, and other lax security practices can all downgrade a vendor's product in Mozilla's eyes.

If an app or service fails to meet these basic requirements, it is slapped with the “*Privacy Not Included” warning label.

The mental health and prayer-related apps have earned an accolade, though not one any developer would covet. The organization says:

“When it comes to protecting people’s privacy and security, mental health and prayer apps are worse than any other product category Mozilla researchers have reviewed over the past six years.”

The organization examined apps including Talkspace, BetterHelp, Calm, Glorify, 7 Cups, Wysa, Headspace, and Better Stop Suicide. Each app now has a dedicated page in the guide where users can review its privacy and security rating.

For example, Better Stop Suicide, a suicide prevention app, failed Mozilla’s test.

“Holy vague and messy privacy policy Batman! Better Stop Suicide’s privacy policy is bad,” Mozilla says. “Like, get a failing grade from your high school English teacher bad.”

While the app gathers some personal information and says that users can contact it with further queries, the developers did not respond to Mozilla’s attempts to reach them, and the privacy policy does not identify the “trusted partners” with whom data is shared.

Only two applications on the list, PTSD Coach and the AI chatbot Wysa, seemed to take data management and user privacy seriously.

“The vast majority of mental health and prayer apps are exceptionally creepy,” commented Jen Caltrider, Mozilla’s *Privacy Not Included lead. “They track, share, and capitalize on users’ most intimate personal thoughts and feelings, like moods, mental state, and biometric data. Turns out, researching mental health apps is not good for your mental health, as it reveals how negligent and craven these companies can be with our most intimate personal information.”

