TikTok's latest transparency report shows how many times users viewed verified election-related info.
As part of its ongoing transparency efforts, TikTok has released a fairly extensive dataset about its fight against misinformation and other violations of its terms of service in 2020. All told, the company removed 89,132,938 videos globally during the second half of 2020 for violating its Community Guidelines or Terms of Service.
Overall, TikTok reports that a staggering 92.4 percent of all removed videos were taken down before a user had even reported them — so the platform’s automated moderation is, by all accounts, working very well. A full 83.3 percent of removed videos were taken down before they had received any views at all.
Around 11 million of those videos were removed in the United States. Election misinformation topped the list of infractions, unsurprisingly enough, though plenty of videos were removed for attempting to spread COVID-19 misinformation, too.
TikTok’s dedicated efforts to reduce harm across its platform are paying off in spades. Other social media companies should grab a pen and start taking notes.
Being proactive is working — TikTok’s overall moderation strategy rested on two pillars: detecting violations proactively rather than waiting for user reports, and tightening enforcement of its existing policies. Based on the company’s latest statistics, that methodology is working well.
Perhaps most impressive is the traffic TikTok has managed to drive to its many information hubs, which are essentially easily scannable landing pages packed with verified information. The platform’s COVID-19 hub, for example, has been viewed more than 2.5 billion times since its inception. The 2020 U.S. elections guide was viewed nearly 18 million times.
Adding public service announcements to videos with touchy subjects has also boosted the visibility of verified information. PSAs with links to information from the World Health Organization were viewed more than 38 billion times (!) in the second half of 2020 alone. Those on election-related videos were viewed more than 73 billion (!!!) times.
Everyone else can do better — TikTok had a very difficult 2020, mostly because the outgoing Trump administration kept railing against it as a national security threat while offering little proof to back that claim. If anything, that scrutiny sent TikTok’s moderation efforts into overdrive. The numbers show that, in the end, this increased attention actually helped TikTok significantly slow the spread of misinformation on its platform.
This stands in direct contrast to other social media companies like Facebook that consistently struggled with 2020’s overload of misinformation. Facebook’s reactive election plans, for example, were so inadequate that a majority of its own employees thought the company wasn’t doing enough.
TikTok’s election mitigation plans, on the other hand, ended up being appropriately stringent. And given that verified election information was viewed on TikTok billions of times last year, that planning certainly paid off. TikTok went into 2021 a much safer space than most other social networks.
You can read the full text of TikTok’s H2 2020 Transparency Report here.