2 May, 2022 21:28

Mental health and prayer apps have ‘worst’ privacy – survey

Despite their sensitive content, therapy apps have some of the poorest privacy protections, an investigation finds

Despite their warm fuzzy packaging and seemingly altruistic intentions, mental health and prayer apps are “worse than any other product” when it comes to user privacy and security, an analysis by browser firm Mozilla revealed on Monday.

“The vast majority of mental health and prayer apps are exceptionally creepy,” Mozilla’s Jen Caltrider, the primary creator of the firm’s “Privacy Not Included” guide, which evaluated 32 such apps on their respect for users’ personal data, told The Verge on Monday. Caltrider noted that the apps one might think would recognize the sensitive nature of their data instead “track, share, and capitalize on users’ most intimate personal thoughts and feelings, like moods, mental state, and biometric data.”

Apps slurping up biometric data and other seemingly private information are nothing new, but given the assumption of privacy that comes with the real-life relationship between a therapist and patient, or a believer and religious institution, one might hope that app developers would at least attempt to replicate such safe spaces when taking the whole process online.

However, the “Privacy Not Included” guide found that, of the 32 apps reviewed, fully 29 received a privacy warning label, meaning they either stored large amounts of personal data under vague privacy policies or maintained poor security practices, such as allowing weak passwords. One popular app, Talkspace, stores entire chat transcripts between user and therapist. Another, an AI therapy chatbot called Woebot, actually collects information about users from third parties, then shares it for advertising purposes, all while pretending to be their silicon shoulder to cry on.

Mozilla researcher Misha Rykov referred to the apps his team analyzed as “data-sucking machines with a mental health app veneer,” or “a wolf in sheep’s clothing.” With real-life therapy increasingly expensive and the process of finding a therapist who ‘matches’ a patient hit-or-miss, the lure of a friendly voice just a click away is difficult for some to resist. But given the apps’ apparent core purpose of mining and selling users’ deepest, darkest secrets, it might be a better idea to hold one’s tongue until one can meet up with a qualified therapist or at least a trusted friend – ideally in real life.
