From Qanon to 'Purebloods,' TikTok's growing extremism problem

Social media is great because it allows us to connect with people from all walks of life, but at the same time it can be dangerous. Most social media platforms allow anyone to post videos or spread information, and extremists are using this to their advantage. Corrupting the platform best known for short videos and viral dances, extremists are using TikTok to spread COVID-19 misinformation and conspiracy theories, and even to encourage violence.

A newly released Homeland Security briefing found that insurrectionists used TikTok to spread information about the Jan. 6 Capitol attack, interfere with the National Guard during the riot, and share instructions on how to access the White House via tunnels and sabotage railroad tracks. According to the briefing document, domestic extremists have been using TikTok since October 2019 to “recruit adherents, promote violence, and disseminate tactical guidance for use in various terrorist or criminal activities.”

While TikTok does have a moderation team, that team has done little to nothing to address these concerns. Instead, TikTok has in many cases removed the videos of those fighting these extremist influencers rather than removing the accounts of those spreading the extremism.

In one incident, a conservative influencer with 800,000 followers went viral by encouraging people to “snitch” on strippers to the IRS. While the platform took no action against him, it deleted videos made by sex workers calling him out, Rolling Stone reported.

The briefing also noted that TikTok algorithms can “unintentionally” promote violent extremist content, even for those with little to no following, because of the layout of the platform’s “for you” page. This feature of TikTok recommends content to users based on their viewing history and activity. Once you’ve seen one conservative or extremist video, more will continue to appear on your feed. “A user’s account may have zero followers but could have substantial viewership on some videos, which could aid violent extremist TikTok users in evading TikTok’s content moderation efforts,” the document said.

While the brief is dated April 19, it was obtained Thursday through an open records request under the Freedom of Information Act (FOIA) by Property of the People, a nonprofit organization focused on government transparency, and shared with Politico shortly after.

According to another report, published in August 2021 by the Institute for Strategic Dialogue (ISD), researchers identified at least 491 accounts that shared a combined 1,030 videos promoting hatred, extremism, and terrorism. Keywords associated with extremism allowed researchers to find the content. While a majority of the videos promoted white supremacist content, the most-viewed video, seen 2 million times, promoted anti-Asian hate and spread COVID-19 misinformation.

“Not only are communities, minorities, or groups of people like African Americans being targeted, those who target them are also being praised, shared, and glorified on the platform,” Institute for Strategic Dialogue analyst Ciaran O’Connor told USA Today.

At the time the report was written, the institute found that 81.5% of the extremist videos it identified on the platform were still live. A majority of the videos had been posted in the three months leading up to the start of the project on June 4.

As a result, dozens of advocacy groups signed a letter sent to top TikTok executives calling for the platform to take “substantive action” against hate speech and extremism.

“We’ve already seen countless examples of online hate translating to offline violence, both in the United States and around the globe, often with deadly consequences. It’s because of this dangerous reality that TikTok has a responsibility to act to address gaps in your content moderation policies and boost transparency more widely for the public and researchers. For some populations around the world, this is a matter of life and death,” the groups wrote, according to a copy of the letter shared with The Hill.  

But TikTok is not alone in its failure to stop extremists from promoting hate on its platform. Facebook, Twitter, and YouTube have had similar struggles with groups like the Proud Boys using their platforms to spread hate. However, what makes TikTok more harmful than others, advocates say, is the combination of user-friendly features, including the ability to add original audio and video and the Stitch function, and TikTok’s most common demographic: youth. Through the Stitch feature, users can incorporate other videos into their own content, and many extremists have used it to attack members of minority communities or to incite violence, according to Politico.

As one of the largest social media platforms for children and teens, TikTok has at least 100 million users in the U.S. The DHS report noted the demographic is a concern not only because extremists are targeting the youth, but because of “some Homeland Security stakeholders’ limited awareness of its functionality” due to their age.

According to researchers, this point is valid as American national security agencies have historically struggled to keep up with changes in social media.

“The extremism research field itself is pretty slow on TikTok,” Seamus Hughes, the deputy director of the Program on Extremism at George Washington University, said. “There’s something to be said about the demographics of researchers — they tend to skew older. Very few can hear the first five seconds of a TikTok video and know what song that’s referencing.”

He reiterated the DHS report’s point that even watching one video can change a user’s recommended videos to extremist content. “The TikTok algorithm is so good that before you know it, you’re on a domestic violent extremism spiral,” he said.

While extremism on social media is not new, the report indicates it is growing at a rapid rate. The report listed evidence that both domestic and international extremists have used the app to encourage violence and recruit individuals for their causes.

A disturbing trend on the platform has recently adopted the term “pureblood” to convince the unvaccinated that they are superior to those who have received the COVID-19 vaccine. The same trend has promoted drinking bleach, taking ivermectin, and gargling Betadine, an antiseptic used to treat cuts and scrapes, as COVID-19 treatments.

Yet the app maintains that it is working to remove content that violates its rules. “There is absolutely no place for violent extremism or hate speech on TikTok, and we work aggressively to remove any such content and ban individuals that violate our Community Guidelines,” TikTok spokesperson Jamie Favazza told Politico.