TikTok removed hundreds of thousands of videos spreading misinformation about the election and COVID-19

Even the video platform made for dance clips and viral challenges isn’t safe from misinformation.

TikTok removed more than 340,000 videos in the U.S. during the second half of 2020 for spreading "election misinformation, disinformation, or manipulated media," the company says. During that same period, TikTok also removed over 50,000 videos promoting misinformation about COVID-19.

It should be noted that TikTok banned content promoting the QAnon conspiracy theory during the time frame covered by this transparency report.

TikTok revealed these details in its most recent transparency report, which was released on Thursday.

In addition to removing content, TikTok deleted 1,750,000 accounts for "automation" during the U.S. election period. The company maintains that it is unclear whether any of these accounts were being used to boost political misinformation, but it viewed it as "important" to remove the automated accounts at the time.

Furthermore, TikTok shared that more than 400,000 videos contained "unsubstantiated content." The company did not delete these videos from the platform, but it made them ineligible for TikTok's "For You" feed. TikTok says that the "majority of content people see on TikTok comes through their For You feed," which serves each user recommendations its algorithm predicts they will like.

The company says it works with outlets such as PolitiFact, Lead Stories, and SciVerify to fact-check claims made by users on TikTok.

In all, TikTok says it removed more than 89 million videos globally during the second half of 2020 for violating the platform's community guidelines or terms of service. Nearly 12 million of those videos were posted by users in the United States.
