TikTok Can Manually Select Videos and Trends to Go Viral. How Do They Use It?


Instagram, Facebook, Twitter, YouTube, your Amazon shopping list: they’re all powered by suggestion algorithms that figure out the best ways to keep you engaged. They show you content they know will keep you coming back. But no app has captured the attention spans of the world quite like TikTok. Its algorithm is so good that people claim it “knows” them within just a few minutes of use. It shows users what they’ve been looking for, even if they didn’t know they were looking for it: aimlessly scrolling and seeing exactly what they wanted to see.

Or perhaps it’s exactly what ByteDance, TikTok’s Chinese owner, wanted you to see.

As it turns out, TikTok has a way around the algorithm, called “heating.” This allows the company to make certain posts go viral and land on more feeds. Not only is TikTok feeding people placating videos for aimless scrolling, it’s boosting certain ones. But how does TikTok choose? In the past, the company has used its moderation abilities to hide videos about China’s concentration camps for Uyghur Muslims, hide posts from transgender people, and hide posts about Black Lives Matter protests. We also know how quickly a conservative-leaning person on TikTok will drown in radicalizing far-right content shortly after joining. Just what is TikTok trying to boost?

Actually, it’s not hard to tell. After all, Russia engaged in similar influence campaigns on Twitter and Facebook prior to the 2016 election, and that misinformation campaign may have been the deciding factor in Trump’s win. China has captured the attention spans of the world. What will it use them for?

Virality Is Influence

If you’re not terminally online, much of your socialization likely comes from in-person interactions with your friends. But where do the topics of discussion and pop culture references come from? Depending on your age bracket, likely from some viral video or meme on TikTok. Someone said something that caught on, another person made a similar video or branched off the first, and, boom, we’re all saying “cheugy” until it’s cheugy and charisma is shortened to “rizz.” Or maybe everyone is buying the latest cheap threads from a Chinese sweatshop. A single viral video can shape a culture.

So how would you feel if someone was intentionally pulling those strings? What if that “someone” was also considered a threat to national security?

TikTok has a tool called “heating”: the artificial boosting of videos, which accounts for 1–2% of the videos a person sees in a day. That might not seem like a lot of influence, but heating often leads to follows, and to more content from a particular creator ending up on your feed. While TikTok may have boosted only a small number of videos on your For You Page (FYP), the effect is much larger and much harder to measure.

Six former TikTok or ByteDance employees told Forbes about the company’s heating practices. They admit that some boosts have no clear reason; some are just employees boosting their own content or that of their friends. Much of it, however, comes from the company itself. TikTok may boost content from brands or creators it wants to do business with, to make TikTok seem like a successful place for businesses and influencers. TikTok doesn’t tell users that content they’re seeing may have been artificially boosted for promotional effect or advertising, which falls into a legal gray area in regions where ads must be clearly labeled. While TikTok is rolling out a “Why This Video” feature that would tell a user why they’re seeing a particular video, heating will not be one of the reasons given, at least not currently.

From China with Radicalization

TW: This section discusses hate and violence. The targets of this hate include the right’s typical targets: transgender and other LGBTQ+ people, women, and racial and religious minorities in the United States.

Media Matters has done extensive research into TikTok’s radicalization pipeline, examining how a new user can quickly get far-right videos and accounts suggested on their For You Page, the mostly algorithmic page where TikTok shows users videos from people they follow as well as suggested accounts. Experimenting with fresh accounts, they found that engaging with one slightly right-wing or outright far-right idea could quickly snowball into more far-right content. For example, liking transphobic videos quickly led to the accounts’ FYPs being flooded with racism, misogyny, antisemitism, anti-vaccine content, and calls to violence against these groups. Following a QAnon page quickly led to accounts linked to far-right groups like the Oath Keepers, the Three Percenters, and the Patriot Party, some of which have been labeled terrorist organizations by civil rights groups and even governments.

[Screenshot: a TikTok video and its comments. The video, taken from a video game, shows piles of dead bodies on a rainbow crosswalk. The poster claims they are “doing gods [sic] work,” while the top comment reads, “Beautiful now do it in real life.”]

TikTok says these hate groups have been banned, but they are easy to find, even through hate-inspired hashtags. One transphobic video, showing a person playing a video game and shooting people in a Pride-themed area, had over 200,000 views and 25,700 likes. It was not removed. The top comment told the video’s creator to go kill LGBTQ+ people in real life; TikTok did not take that down either before Media Matters reported on it.

The slightest engagement with right-wing content quickly throws users down a rabbit hole of ever more extremist material. This radicalization can take someone who feels insecure and harbors one slightly conservative viewpoint and push them toward espousing other hateful ideas and even suggestions of violence. YouTube has long been notorious for this kind of radicalization, but TikTok’s ability to quickly surface such content on users’ most frequently viewed page speeds the process up to dangerous degrees. Misinformation and hate speech can flood a user in minutes. A vulnerable user could end up hating, and even wanting to kill, people different from them in no time.

Radicalization and Division

Prior to the 2016 election, Russia ran “troll farms” that pushed people toward more radical right-wing beliefs and helped propel Donald Trump to popularity. They intentionally sowed division to destabilize the American political environment, and these efforts were incredibly successful. The right has moved further right than it has been in America in decades, political violence and far-right domestic terrorism are on the rise, and far-right Republicans even staged a violent coup attempt. Russia’s destabilization campaign worked perfectly, driven largely by radicalizing right-leaning conservatives.

Now, through TikTok, China has influence that Russia could only have dreamed of. It owns one of the most popular social networking apps in the world, outside of China and the nations where it’s banned. Knowing that the app quickly radicalizes right-leaning people into violent far-right content, and that videos can be “heated” without explanation, reason, or the users’ knowledge, we can see how China has exercised influence over America, Europe, and every other nation where its app has been allowed to spread. The algorithm is bad enough, but TikTok’s Chinese owner, ByteDance, with its strong ties to the Chinese government, can weave its own narratives through heating. Virality is influence, and a country that wants to see the downfall of American stability is at the reins.

Bans Coming Soon?

In the U.S., TikTok is banned from government phones; the intelligence community and legislators deemed it too much of a threat for government devices. Senate Intelligence Committee member Michael Bennet (D-CO) has called TikTok “an unacceptable threat to the national security of the United States” and has called for removing it from the App Store and Google Play, as has FCC Commissioner Brendan Carr. The calls cross political lines: in a rare case of bipartisanship, Senator Josh Hawley (R-MO) and Congressman Ken Buck (R-CO) have also called for nationwide TikTok bans.

Numerous countries around the world have already banned TikTok, some for censorship reasons, others to block China’s data collection. Even China does not have TikTok; ByteDance makes a separate app, Douyin, for Chinese users.

TikTok has shown that it uses secret practices to increase the spread of certain videos. It has hidden its intentions and its ties to the Chinese government. The app collects far more data than other social networking apps, and users have reported that it gobbles up data even when not in use. China has done little to quell fears of influence, division, and data collection. Meanwhile, millions of people still use TikTok every day, sending data to China for AI training and seeing potentially radicalizing content. As nations debate whether or not the app is dangerous, security researchers and civil rights groups have already said their piece: TikTok is dangerous and not to be trusted.

