Gaming

Twitch releases first ever Transparency Report — without much transparency

From timeouts to bans, Twitch’s Transparency Report shows how aggressive its moderation is getting. But users still have plenty of unanswered questions.

A streamer broadcasting a strategy game.
Shutterstock

Streaming platform Twitch has published its very first Transparency Report, which details the company’s efforts to improve moderation and filtering, and the steps it takes to get users to adhere to its community guidelines. The company says it will publish transparency reports twice a year in an effort to improve both the user experience and, hopefully, users’ trust in the moderation process.

The announcement comes several months after Twitch came under fire for purging thousands of videos without warning because they contained copyrighted music. The service subsequently apologized for its overzealous enforcement.

"This transparency report is the first of its kind for Twitch: it takes a hard look at how we think about safety; the product choices we made to create a safe space for all our communities, and how our safety staff, community moderators, and technological solutions help enforce the rules we set," the report states.

The key takeaways — Twitch claims that 95 percent of channels on its platform now have moderators, whether human or automated, up from 93 percent last year. The company also says message deletions rose from 3.2 per 1,000 messages to 4 per 1,000 messages.

The company also notes that its content blocking, chat filter, content warnings, review enforcement procedures, processes around flagging and reporting users, and other policies have all improved, but doesn’t go into too much detail about what that means. Timeouts — which allow channel owners to block users from their rooms for 10 minutes — as well as channel bans, have also gone up.

While it’s encouraging to see a major streaming platform take moderation seriously and try to offer proof that it enforces its rules around speech and safety, many Twitch users still have questions about the moderation process, as GameSpot points out. Specifically, Twitch users want to know who exactly is doing the moderating.

Keep 'em moderated — Are moderators employed by Twitch? Or is the job outsourced to third-party companies as we've seen in the case of Facebook? In light of famous streamer Dr. Disrespect getting banned without a clear explanation from Twitch, people have also wondered about who has the final say when it comes to bans.

According to GameSpot, Twitch has said that these moderators “work across multiple locations, and support over 20 languages, in order to provide 24/7/365 capacity to review reports as they come in across the globe” but doesn’t answer the questions above. And therein lies the problem of self-reporting — without outsider input, the real questions often never get asked.