
TikTok adds mental health resources, hides graphic videos by default

By continuing to update its preventative wellness policies, TikTok stays leagues ahead of other social media platforms.


TikTok today updated its Community Guidelines, with most of the refresh focused on users’ wellbeing. The update includes revamped in-app health resources, a new text-to-speech feature, and a new warning screen in front of “graphic” videos.

“At TikTok, safety isn’t a nice-to-have or an afterthought; it’s central to all our work, and our teams strive to be inclusive and thoughtful when developing our policies,” writes Cormac Keenan, TikTok’s Head of Trust and Safety, in the announcement. He adds that the update is based on feedback from the community, academics, civil society organizations, and the company's Content Advisory Council.


The new opt-in viewing screens, TikTok says, will appear in front of videos flagged for graphic or otherwise “distressing” content. While the most violent videos are removed from the platform outright, others, like horror movie clips, are left up but excluded from the "For You" feed. That category of video will now come with the warning screen, too.

Overall, the update is far from the most comprehensive we’ve seen from TikTok, but it’s nonetheless a valuable one that continues expanding the app's harm-reduction techniques.

Social media wellness — The updated health hub, which TikTok has done a solid job of keeping current as the pandemic has progressed, will now include COVID-19 vaccine information, including FAQs with public health experts. TikTok says its resource hub has been viewed more than 2 billion times globally in the last six months.


TikTok is also expanding its efforts to avoid normalizing self-injury and other dangerous behaviors, like those related to eating disorders. The company has been hyper-aware of this need since facing well-deserved criticism over the viral video of a man’s suicide.

Similarly, the new guidelines update TikTok’s “minor safety” policy to reiterate that content promoting “dangerous dares, games, and other acts that may jeopardize the safety of youth” is not allowed on the platform. The company has taken meaningful steps toward making TikTok safer for younger users since advocacy groups pushed for increased youth protections back in May.

Taking the preventative road — The second half of 2020 has been very rough for TikTok in terms of politics; the first half (and some of last year) was defined for the app by its seeming inability to create harm-reduction models of moderation. Remember when moderators were told they should suppress posts by “ugly” users?

TikTok has, thankfully, course-corrected in a big way. Now the company’s general policymaking is driven by prevention, partly to avoid future scandal and partly, it seems, out of genuine concern for users. This preventative mindset stands in direct contrast to other social media companies’ policies, which are often reactive, skirting around a problem until it’s too late to meaningfully mitigate harm.

The service is also introducing a text-to-speech feature that converts typed captions into a voiceover that plays as the text appears on screen in a video. It’s the app’s second accessibility feature in recent months, following the creation of a photosensitive epilepsy tool.