TikTok users often express their awe or dismay at the seemingly uncanny accuracy of the app’s recommendation algorithm. The Wall Street Journal published a video today that dives into how TikTok customizes your feed.
WSJ researchers conducted an experiment in which they created bot accounts with assigned interests. The bots “watched” videos on TikTok, pausing on or replaying clips with images or hashtags relevant to those interests. The WSJ team reviewed the results with Guillaume Chaslot, an algorithm expert who previously worked at YouTube.
The findings are consistent with TikTok’s explanation of how the recommendations work. TikTok has previously said that the For You feed is personalized based on the type of videos you interact with, how you interact with them, details about the videos themselves, and account settings such as language and location.
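To make those signals concrete, here is a minimal, purely illustrative sketch of interest-based ranking. This is not TikTok’s actual algorithm; the `Interaction`, `UserProfile`, and hashtag-affinity scheme are all hypothetical, but they show how watch time on tagged videos could accumulate into a profile that ranks future candidates:

```python
# Hypothetical sketch — NOT TikTok's real system, just an illustration
# of how interaction signals (watch time, hashtags) could drive a feed.
from dataclasses import dataclass, field


@dataclass
class Interaction:
    video_hashtags: set
    watch_fraction: float  # share of the video watched (>1.0 means replays)


@dataclass
class UserProfile:
    # Learned affinity per hashtag; grows with accumulated watch time.
    affinity: dict = field(default_factory=dict)

    def update(self, event: Interaction) -> None:
        # Watch time is treated as interest, whether the viewer was
        # genuinely engaged or merely hesitating on a strange clip.
        for tag in event.video_hashtags:
            self.affinity[tag] = self.affinity.get(tag, 0.0) + event.watch_fraction

    def score(self, video_hashtags: set) -> float:
        # Candidate videos sharing high-affinity hashtags rank higher.
        return sum(self.affinity.get(tag, 0.0) for tag in video_hashtags)


profile = UserProfile()
profile.update(Interaction({"#dance"}, 0.2))     # scrolled past quickly
profile.update(Interaction({"#sadvibes"}, 1.5))  # paused and replayed
# The replayed topic now outranks the skipped one:
assert profile.score({"#sadvibes"}) > profile.score({"#dance"})
```

Note that in this toy model nothing distinguishes a replay driven by curiosity from one driven by genuine enjoyment; both feed the same affinity counter.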
If you linger on a weird video that caught you off guard, the algorithm has no way to tell that hesitation apart from content you genuinely like and want to see more of. That’s how some people end up with a batch of For You recommendations that don’t seem to reflect their interests.
While humans have more varied tastes than bots, the experiment shows how quickly a user can be steered into far-flung corners of the app, including potentially harmful content. According to the WSJ, TikTok identified the interests of some bots in just 40 minutes. One of the bots fell down a rabbit hole of depressive videos, while another ended up with election conspiracy videos. Although, as Will Oremus pointed out on Twitter, algorithmic rabbit holes can also lead people to positive content.
The video has a lot of details and visualizations, so it’s a good way to wrap your head around the “magic” of how TikTok works. Watch the video above or at the WSJ website – be warned that it contains clips from TikToks referring to depression, suicide, and eating disorders.