CR sent a letter on December 17, 2020, to seven companies that offer app-based mental health counseling: BetterHelp, Moodpath, Sanity & Self, Talkspace, Wysa, Youper and 7 Cups.
Based on an evaluation of privacy practices, CR is urging the companies to incorporate changes that address the following recommendations:
- Clearly explain the procedures used to de-identify data used for research, and do not share identifiable data except at the consumer’s direction. Companies should be clearer about data sharing for research, especially how they define “anonymized data,” and should state explicitly what processes they use to de-identify data. Clarity here helps prevent people from being re-identified. Mental health applications collect sensitive information that, if shared with third parties, can cause damaging, irreversible harm to individuals, including social stigmatization and barriers to future opportunities.
- Provide clear, contextually appropriate explanations of how user-provided data will be used, so users understand the potential consequences before they share. Companies should not overwhelm people with superfluous information or choices. Wherever possible, apps should protect users’ privacy by default, so users do not have to manage it on their own. When there are choices to be made or information users should be aware of, it should be presented clearly and straightforwardly.
- Adhere to platform guidelines that are in place to protect people’s privacy. App developers should ensure that their apps meet the guidelines laid out in Android developer documentation, such as Best Practices for Unique Identifiers, which recommends avoiding hardware-scoped identifiers like the Android ID (SSAID). App developers should also make sure that the third-party libraries (SDKs) they embed within their apps match the developers’ own expectations for data collection, and that those libraries are configured accordingly.
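To illustrate the identifier guidance above: Android’s documentation recommends a randomly generated, app-scoped, resettable identifier over hardware-backed IDs like the SSAID. A minimal sketch of that pattern is below, in plain Java so it runs anywhere; the in-memory `Map` is a stand-in for persistent storage such as Android’s `SharedPreferences`, and the class and key names are illustrative, not from any real app.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.UUID;

public class AppScopedId {
    // Returns the stored identifier, generating a random one on first use.
    // Because it is random rather than derived from the device, deleting
    // the stored value (e.g., on app reinstall) resets the identifier.
    static String getOrCreate(Map<String, String> store) {
        return store.computeIfAbsent("app_instance_id",
                k -> UUID.randomUUID().toString());
    }

    public static void main(String[] args) {
        // Stand-in for persistent key-value storage like SharedPreferences.
        Map<String, String> store = new HashMap<>();
        String first = getOrCreate(store);
        String second = getOrCreate(store);
        System.out.println(first.equals(second)); // stable across calls: true
    }
}
```

Unlike the SSAID, an identifier created this way cannot be correlated across different apps on the same device, which is the privacy property the platform guidelines are after.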
- Transparently disclose the service providers that receive data when people use your apps. Although it is not legally required or common practice in the U.S. to list every service provider or institution receiving data, we recommend that companies proactively disclose this information in their privacy policies.
The full report, 2-page brief and our letter can be downloaded using the links above. We also published a case study that provides additional information about our investigation.
Editor’s note: Earlier versions of this report included a chart summarizing a subset of the findings. While the general findings about CR’s reading of the different apps’ privacy policies remain, the chart has been removed to avoid any reader confusion that these mental health apps were being rated or ranked or that all these apps worked the same way. We have made other minor edits to reflect the removal of the chart.
Contact: Stephanie Nguyen, email@example.com