TikTokers Have Stood Against Trumpism — But Extremists Co-Opted The App For 1/6 Violence

TikTok is very clear that its policies stand against violence, hate, and extremism. However, opportunistic extremists were still able to use the app to get their message out in advance of the 1/6 riots.

SARASOTA, FL – JULY 03: Former U.S. President Donald Trump arrives to hold a rally on July 3, 2021 in Sarasota, Florida. Co-sponsored by the Republican Party of Florida, the rally marks Trump’s further support of the MAGA agenda and accomplishments of his administration. (Photo by Eva Marie Uzcategui/Getty Images)

TikTokers have engaged in a lot of effective political activism. In fact, Donald Trump tried to ban the app entirely in the U.S. after users banded together to reserve tickets to his rally in Tulsa, leading him to expect a massive crowd that never materialized.

The app has also served as a podium for teens to reach a wide audience with their own views and find like-minded peers. Even those close to politics have used it this way: Claudia Conway, whose mother, Kellyanne, served as an advisor to Trump and frequently appeared on television to defend him, and whose father, George, headed an anti-Trump Republican group, turned to TikTok to vent her own frustrations about Trumpism and its effects on the country and her family.

At the same time, though, a Department of Homeland Security report found that pro-Trump extremists were also gathering on the app, whose algorithms can be so effective that different views and interests are often described as "sides" — e.g., the "Christian side of TikTok," the "LGBT side of TikTok," or the "crafting side of TikTok." What this means is that someone who shows interest in a certain type of content — whether it's gaming, accessibility, or politics — can expect to be served more content in the same realm. This can make it easy for political activists to be sure their content reaches like-minded audiences.

According to Politico, the DHS document detailed how right-wing extremism had taken off in one corner of the app, raising concerns about the potential radicalization of another generation. However, one of the major concerns wasn't so much the content itself as the need for researchers to be more savvy with newer forms of social media.

“There’s something to be said about the demographics of researchers — they tend to skew older. Very few can hear the first five seconds of a TikTok video and know what song that’s referencing,” explained Seamus Hughes, deputy director of the Program on Extremism at George Washington University.

The Institute for Strategic Dialogue published a report (PDF) last month on the ways that extremists use TikTok (separate reports address other apps, like Twitch) to promote hate and racial bias and “target people on the platform based on numerous protected attributes such as ethnicity, religion, gender, or others.” The report also addresses the methods used to evade existing anti-extremism policies.

These include recreating banned accounts with slightly changed names, restricting comments, and deliberately misspelling hashtags to dodge blocks. The report offers an example:

#GypsyCrusader (a reference to white supremacist Paul Miller) is blocked, but #gypseycrusadur is not and features in 5 videos in our sample. Additionally, #gypsycrudarer features in 18 videos in our sample. On TikTok, the page for this hashtag indicates there are currently 88 videos on the platform using this hashtag and they have been viewed a collective 4.9 million times.

Ultimately, the report gives TikTok a lot of credit, noting that it has “learned from the lessons of older, more established social media platforms in dealing with problematic content” and uses a “comprehensive and wide-ranging” understanding of extremism and hate speech. Still, the ISD has a few suggestions, including blocking certain audio clips from being used, and addressing hashtag issues with broader bans.
