Open up short-form video app TikTok and you’re met with a stream of popular videos, tailored to your own interests. But when your friend launches TikTok on their own phone, they’ll see something different. TikTok calls this main stream of content the “For You” feed because of how it’s personalized to the individual user. But how the recommendation system worked behind the scenes was something of an unknown — until now. Today, the company detailed the factors that contribute to the For You feed, as well as how they’re weighted for each individual user.
The company also explained what it's doing to ensure the system isn't creating filter bubbles, where a user is presented with an increasingly homogeneous stream of videos.
Like many recommendation systems, TikTok’s For You feed is powered by user input.
In its case, the app takes into account the videos you like or share, the accounts you follow, the comments you post and the content you create to help determine your interests. In addition, the recommendation system will factor in video information like the captions, sounds and hashtags associated with the content you like.
To a lesser extent, it will also use your device and account settings information like your language preference, country setting and device type. But TikTok says these factors receive lower weight in the recommendation system compared with others, because they’re more focused on making sure the system is optimized for performance.
Other signals contribute to TikTok’s understanding of what a user likes, as well. For example, if a user watches a longer video from beginning to end, it’s considered a strong indicator of interest. This would be given a greater weight than a weaker signal, like if the viewer and poster were from the same country.
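The weighting described above can be pictured as a simple scoring function. This is a hypothetical sketch: TikTok has not published its actual signals, weights or code, so every name and number below is an invented placeholder for illustration.

```python
# Illustrative signal weights; the values and names are assumptions,
# not TikTok's published figures.
SIGNAL_WEIGHTS = {
    "watched_to_completion": 10.0,  # strong indicator of interest
    "liked": 5.0,
    "shared": 5.0,
    "commented": 4.0,
    "same_country_as_poster": 0.5,  # weak signal, weighted low
    "matches_language_setting": 0.5,
}

def interest_score(signals):
    """Sum the weights of the signals a viewer's interaction triggered."""
    return sum(SIGNAL_WEIGHTS[name] for name, fired in signals.items() if fired)

# A full watch-through counts for far more than a shared country.
full_watch = interest_score({"watched_to_completion": True})
same_country = interest_score({"same_country_as_poster": True})
assert full_watch > same_country
```

The point of the sketch is only the relative ordering: a strong behavioral signal such as watching a long video to the end contributes far more to the score than a weak contextual one like a shared country.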
TikTok also confirms that a video is likely to receive more views if it's posted by an account that has more followers, simply because that account has a larger base of viewers. But it adds that neither the follower count nor the account's history of high-performing videos is considered a direct factor in its recommendation system.
That seems to indicate that TikTok’s top users with massive reach — accounts like charli d’amelio, addison rae, Zach King, Loren Gray, Riyaz, Spencer X, BabyAriel and others — aren’t guaranteed to hit the For You page of every user, even if that user follows them.
As you continue to use TikTok, the system takes into account your changing tastes and interests, even noting when you decide to follow new accounts or explore hashtags, sounds, effects and trending topics on its Discover tab. All these will tailor your TikTok experience further.
Users can also signal to TikTok their more explicit likes and dislikes with a long press, where they can either add a video to their favorites or mark it “Not interested.”
Of course, like any app powered by user input and signals, TikTok has to find a way to get over the cold start problem. Upon the app's first launch, it doesn't know what sort of content an individual likes. To address this, it asks new users to select categories of interest — like pets or travel — to help tailor the initial recommendations. If users don't opt to select categories, TikTok shows a general feed of popular videos until it has more input. Once it gains its first set of likes, comments and replays, TikTok will initiate an early round of recommendations.
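The cold-start fallback described above amounts to a simple branch: seed the feed from the categories a new user picked, or fall back to generally popular videos. The function and data shapes here are hypothetical, assumed only for illustration.

```python
# Minimal sketch of the cold-start fallback; function and feed names
# are assumptions, not TikTok's actual API.
def initial_feed(selected_categories, popular_videos, videos_by_category):
    """Seed a new user's feed from chosen categories, else popular videos."""
    if selected_categories:
        feed = []
        for category in selected_categories:
            feed.extend(videos_by_category.get(category, []))
        return feed
    # No categories chosen: fall back to a general feed of popular videos.
    return popular_videos

videos = {"pets": ["cat_video"], "travel": ["beach_video"]}
popular = ["trending_video"]

seeded = initial_feed(["pets"], popular, videos)       # category-driven
fallback = initial_feed([], popular, videos)           # popular fallback
```

Either way, this feed only bootstraps the system; once real likes, comments and replays arrive, they displace these generic starting signals.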
TikTok says it also understands that catering too much to someone’s personal taste can lead to the development of a limited experience, known as a “filter bubble.” That can lead to an “increasingly homogeneous stream of videos,” it says, and is a concern it takes seriously.
A couple of years ago, a VICE report indicated that TikTok had not yet overcome this challenge. A reporter trained TikTok to show white supremacist content by following a certain set of creators, by searching for related hashtags and by liking only videos that matched this "interest." TikTok's failure was more than just one of moderation — it was also an indication that a dedicated user could craft a version of the app filled with hateful content, if that was their goal.
Today, TikTok says it’s working on ways to keep a user’s For You page diverse and fresh. That means removing repetitive content, duplicated content, content you’ve seen before and spam. But it also means making sure you don’t see two videos in a row by the same creator or with the same sound. For safety reasons, the app also won’t recommend videos that some may find shocking — like a medical procedure or the consumption of regulated goods (even if legal).
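The "no two in a row" rule lends itself to a small reordering sketch. Again, this is an assumed implementation for illustration only; TikTok has not described how it enforces the rule, and the video representation below is invented.

```python
# Hypothetical sketch of the diversity rule: adjacent videos should not
# share a creator or a sound. Field names are illustrative assumptions.
def diversify(candidates):
    """Reorder so adjacent videos never share a creator or a sound."""
    feed = []
    remaining = list(candidates)
    while remaining:
        prev = feed[-1] if feed else None
        # Pick the first candidate that differs from the previous video.
        pick = next(
            (v for v in remaining
             if prev is None
             or (v["creator"] != prev["creator"] and v["sound"] != prev["sound"])),
            remaining[0],  # fall back if every candidate clashes
        )
        remaining.remove(pick)
        feed.append(pick)
    return feed

videos = [
    {"id": 1, "creator": "a", "sound": "s1"},
    {"id": 2, "creator": "a", "sound": "s2"},
    {"id": 3, "creator": "b", "sound": "s3"},
]
ordered = diversify(videos)  # the two "a" videos get separated
```

A greedy pass like this is the simplest way to express the constraint; a production system would presumably fold it into ranking rather than reorder after the fact.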
In addition, TikTok will at times add videos to your For You feed that don't appear relevant to your expressed interests or haven't amassed a huge number of likes. This is part of its attempts to add diversity — by giving users a chance to stumble across new content categories and new creators, and to allow them to "experience new perspectives and ideas," the company says.
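This exploration step can be sketched as occasionally slotting a video from outside the user's expressed interests into an otherwise personalized feed. The 1-in-8 rate and the random pool draw below are invented placeholders; TikTok does not publish how often, or how, it does this.

```python
import random

# Hedged sketch of exploration injection; the rate and selection logic
# are assumptions for illustration, not TikTok's published behavior.
def inject_exploration(personalized, exploration_pool, every_n=8, seed=None):
    """Insert one exploration video after every `every_n` personalized ones."""
    rng = random.Random(seed)
    feed = []
    pool = list(exploration_pool)
    for i, video in enumerate(personalized, start=1):
        feed.append(video)
        if pool and i % every_n == 0:
            # Draw a video the user's history wouldn't have surfaced.
            feed.append(pool.pop(rng.randrange(len(pool))))
    return feed

feed = inject_exploration(list(range(16)), ["surprise_a", "surprise_b"],
                          every_n=8, seed=0)
```

As the blog post quoted below suggests, showing these occasional wildcards also tells the system what's popular with a wider range of audiences, not just this one user.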
This is a problem that Facebook, Instagram and YouTube have not addressed well. Their algorithms often keep you in an echo chamber, highlighting more of the same sort of content you've previously liked, or even pushing you toward more extreme viewpoints over time.
TikTok says it knows this is a downside to personalization.
“By offering different videos from time to time, the system is also able to get a better sense of what’s popular among a wider range of audiences to help provide other TikTok users a great experience, too,” the company says in a blog post. “Our goal is to find balance between suggesting content that’s relevant to you while also helping you find content and creators that encourage you to explore experiences you might not otherwise see.”
The disclosure on how the algorithm works comes at a time when major U.S. tech companies are facing antitrust investigations in the U.S. and EU, and TikTok specifically has been under U.S. congressional review for its ties to China. In more recent weeks, TikTok and other Chinese apps have been under fire in India, as well, due to issues around a border dispute that has led some Indian officials to ask for the apps to be blocked.
TikTok has been steadily working to change its perception in the U.S. by forming a Content Advisory Council and opening an LA facility, the TikTok Transparency Center, where outside experts can view TikTok’s source code and see TikTok’s moderation practices in action first-hand.