
Reddit and Crisis Text Line join forces to provide mental health resources to users in need

Moderators will be able to flag users expressing suicidal or self-harming thoughts, and get them help.

A schoolgirl covers her face with her hands and weeps while a friend comforts her.
Shutterstock

Reddit doesn't exactly enjoy a great reputation for being conducive to good mental health. It's often been on the receiving end of complaints about the deeply insensitive content that flourishes on some parts of the website. Yet, in spite of its flaws, Reddit is also a safe haven for many people who turn to its multiple subreddits to vent about their very personal, and often painful, struggles and frustrations. One of those subreddits is r/depression, which University of Utah researchers found can actually be helpful for people coping with mental disorders. And now the platform is adding professional support for users in need.

The backbone of Reddit is the group of moderators who monitor its individual communities and keep a vigilant eye on the thousands of posts uploaded to the service each day. In some subreddits, that means encountering posts about self-harm or suicide. To better equip these community moderators with mental health resources, Reddit announced on Wednesday that it's partnering with Crisis Text Line. The new self-harm and suicide prevention tool goes live on March 10.

How it works — The idea is to connect Reddit users with resources and confidential, real-time support from trained counselors, the company says. Starting March 10, Reddit moderators can "report the specific post or comment that worried you and select 'Someone is considering suicide or serious self-harm,'" or they can "visit the person's profile and select 'Get them help and support.'" Here's what it looks like:

Screenshots of the new reporting options. Reddit

A good effort — Slowly but surely, tech companies are becoming more proactive about broaching sensitive subjects like their users' mental health, and acknowledging that they can lend a helping hand. Facebook and Instagram, for example, have their own mental health support tools. Sure, we'd like to see more explicit measures from social media companies and their peers to tackle cyberbullying and misinformation online, but we'll take what we can get. Especially when providing any sort of support tends to fall outside the purview of an online company's core focus: making money for its shareholders.

If you're interested in learning more about Reddit's mental health kit, Reddit's group product manager of safety, u/jkohhey, took users' questions on Wednesday morning.