Suicide hotline ends data sharing with its own for-profit arm after ethics concerns

After coming under fire for collecting and sharing data from its clients' messages, Crisis Text Line, a nonprofit mental health counseling service, has ended its data-sharing agreement with its for-profit arm.

After a Jan. 28 Politico report detailed ethical concerns about sharing anonymized yet sensitive client information for profit, Crisis Text Line announced Jan. 31 that it had ended its agreement with its for-profit arm, Loris AI.

Crisis Text Line is one of the most prominent mental health hotlines, and it uses artificial intelligence to help people manage mental health crises. Since its launch in 2013, it has exchanged 219 million messages with texters, and by collecting data from those conversations, Crisis Text Line now claims to have the world's largest mental health dataset, Politico reported.

"These are people at their worst moments," Jennifer King, PhD, privacy and data policy fellow at Calif.-based Stanford University, told Politico. "Using that data to help other people is one thing, but commercializing it just seems like a real ethical line for a nonprofit to cross."

Before the announcement, the organization shared repackaged, anonymized versions of highly sensitive conversations with Loris, which used them to build and market customer service software. In return, Loris had pledged to share some of its profits with Crisis Text Line. The two companies had the same CEO for more than a year.

"During these past days, we have listened closely to our community’s concerns. We hear you. We understand that you don’t want Crisis Text Line to share any data with Loris, even though the data is handled securely, anonymized and scrubbed of personally identifiable information," the press release read. "As a result, we have ended our data-sharing relationship with Loris."
