Nick Frisch
War in Ukraine has multinational social platforms scrambling to rewrite their playbooks for content moderation. Facebook has loosened hate-speech bans on Ukrainian nationalist militias who flirted with fascism and anti-Semitism; new policies allow Facebook users to incite violence against Russian occupiers. YouTube has blocked official channels of Russian state media. Google has faced pressure to tone down Russian propaganda running in paid advertising. Twitter is crimping the reach of official Russian government accounts and scrubbing images of prisoners of war that contravene the Geneva Conventions.
Then there is TikTok, the colossally popular short video platform, whose addictive algorithm serves up an endless stream of shimmying and lip-syncing Gen-Zers. TikTok’s parent company, Beijing-based ByteDance, developed its AI-driven content delivery through the news aggregation app Toutiao before pivoting to video. TikTok is China’s first global media platform to compete with Silicon Valley’s social behemoths; it was the most-downloaded app of 2021. Before the war in Ukraine, TikTok struggled through public relations headaches. News reports called out TikTok’s sanitizing of content displeasing to Beijing, including accounts of repression in Xinjiang and Hong Kong. The algorithm also has a nasty tendency to drive impressionable young users towards looping video sequences about self-harm. Add to that TikTok’s aggressive and intrusive data collection.
A Trump Administration effort to force the company’s sale was framed as a national security necessity. Trump, attempting to mandate sharing of TikTok’s secret-sauce algorithm with an American corporate partner, sparked a war of words with Beijing. People’s Daily, the Communist Party’s official mouthpiece, threw its weight behind ByteDance, a nominally private company, describing its proprietary algorithm as a “core interest,” the same label applied to sovereignty issues like Tibet and Taiwan. Trump’s proposed shotgun merger between TikTok and an American firm was iced by the Biden Administration pending further review. Meanwhile, the company has repositioned itself as a global firm, with content moderation decisions taken in Los Angeles and Dublin, and TikTok’s international headquarters located in Singapore. International outrage over censored content, such as a teen make-up tutorial-turned-plea for Uighur rights, has forced the company to loosen the censorship defaults originally imported from Douyin, TikTok’s domestic counterpart within China, whose content is heavily censored and whose users are surveilled.
The war in Ukraine has tested TikTok’s content moderation, posing challenges that, owing to peculiarities of TikTok’s platform, can be thornier than those handled by Facebook or Google. While those two companies screen text and static images for misinformation as a supplement to scanning video and audio, TikTok’s sole focus on video means that clips can arrive stripped of text or metadata, forcing moderators to rely on computationally intensive scanning of audio and video. Users are flooding the platform with misleadingly edited or dubbed videos, favoring sensationalist material to gain attention (and, sometimes, cryptocurrency donations). An analysis by NewsGuard found that TikTok was steering users towards false information within a few minutes of scrolling, including both pro-Russian and pro-Ukrainian memes. Despite these flaws, disengagement from TikTok isn’t a realistic option: the app is too popular with an otherwise hard-to-reach youth demographic, valued both by advertisers and by legacy media organizations eager to reach new users. On TikTok, misinformation videos about Ukraine circulate alongside videos from reputable and verified legacy news outlets, such as the Washington Post.
TikTok is not alone in scrambling, and sometimes failing, to deploy an effective content policy during a sudden and catastrophic war; Facebook, Google, and YouTube have struggled too. As the initial shock of the invasion dissipates and the war grinds on, social media giants will refine and consolidate their wartime moderation policies. TikTok, whose Beijing-based CEO can’t be easily subpoenaed by Congress, will face questions its peer platforms do not. Western tech firms, headquartered and operating in liberal democracies sympathetic to Ukraine, have picked sides in the conflict: besides Facebook and Google, Apple and Dell have stopped sales in Russia and Belarus. Their Western corporate counterparts in aviation, retail, energy, and other sectors have either pulled out or are eyeing the exits. TikTok, to avoid running afoul of Moscow’s new information law, has disabled some features within Russia. The company is holding daily meetings in Dublin to work out how to cover the war in a way acceptable to governments and users in the Western countries whose populations are essential to TikTok’s global market share.
Still, whatever its efforts, as a Chinese-owned company seeking to maximize global market share and profits, TikTok may face uncomfortable questions in the months and years ahead. Beijing has clearly signaled its disapproval of the West’s attempts to economically blockade Russia. TikTok’s corporate parent, ByteDance, ultimately answers to the whims of the Chinese Communist Party. Episodes from its recent history suggest a complex path forward. In 2017, ByteDance advertised its original news aggregator app as curated by algorithms alone, free of human editors. Authorities in Beijing were horrified to discover the AI’s tendency towards toxic looping of “unhealthy” content, including risqué sexual and political material. ByteDance’s then-CEO was forced to make a public apology and shutter an irreverent sister app. The company assigned human moderators to tame the AI and ensure that politically undesirable content would no longer be pushed to millions. For the past several years, ByteDance has been a loyal screw in the Party machine, its landing pages regularly reserving space for the Party’s most important bulletins. ByteDance’s news content reflects the official line on sensitive issues, even as the algorithm feeds users an addictive blend of their favorite pop stars, adventurous recipes, or home repair tips. The Chinese domestic version of TikTok, a nearly identical platform called Douyin, applies the same curation to videos within China’s Great Firewall. On these platforms, Washington and NATO are cast as villains who forced Moscow’s hand. In recent days, ByteDance’s aggregated news about the massacre of civilians near Kyiv has emphasized Beijing’s official pro-Moscow slant, seeding doubts about whether the massacre was faked, staged, or committed by Ukrainian forces or Western agents.
ByteDance is hardly alone in propagating this view. In fact, it is only the latest in a long line of legacy media that have learned to sing Beijing’s tune, both at home and abroad. China’s heavily censored domestic media takes its cues from official spokesmen, state TV, and official newspapers like People’s Daily. Beijing is emphasizing American perfidy, denouncing sanctions, and sympathizing with Russia’s grievances, amid blandly worded appeals for de-escalation and dialogue. The Ukraine coverage circulating in China’s domestic media ecosystem is echoed abroad on Beijing’s overseas platforms, in its international television channels and newspapers. While official Russian channels have been purged from Western tech platforms, Chinese channels, duly marked as state media, remain. (This occurs on a low-tech level, too: paper copies of China Daily, Beijing’s major English-language publication abroad, are still sold on the streets of Manhattan, alongside local papers like the New York Times.) These vectors for China’s official line may not make much of a dent in the robust media markets of America or Western Europe. Further afield, however, China’s decision to support and amplify the Kremlin’s fevered narratives, such as claims about Ukrainian biolabs, may have deeper consequences. In Africa and Latin America, China’s efforts to train local journalists have run in tandem with efforts to provide news-wire bulletins from Xinhua state media, in local languages, to pad out local papers. Such efforts to shape legacy media outlets, far away from Washington or Brussels, have been underway for years.
But TikTok, with 100 million users in Latin America and a growing user base in Africa, changes the game. Privileging growth while trying to avoid controversy will only take ByteDance so far. Beijing has recently cracked down on tech giants, displacing ByteDance’s previous CEO and taking a company board seat. The Communist Party may soon awaken to fresh opportunities for Beijing’s priorities to subtly shape discourse far away from media centers in London or New York. TikTok’s choices to amplify or downplay certain types of content around the world will be influenced by human moderators whose ultimate corporate bosses sit in Beijing. Even if American and European data and content moderation compliance laws are enforced on TikTok subsidiaries in those jurisdictions, that leaves the rest of the world’s TikTok users on a platform with no local visibility into how content is curated. The consequences may trickle into geopolitics. When Facebook was used to whip up Islamophobic hysteria against Myanmar’s minorities, the platform faced accountability through the Western media and political system. If TikTok played a role in inciting a genocide in Africa, would we ever know how it happened? How would TikTok behave, and to whom would its CEO ultimately listen?
Nick Frisch is a Resident Fellow of Yale's Information Society Project. You can reach him by e-mail at nick.frisch@yale.edu.