Social media algorithms shape what we see online, and TikTok’s recommendation system is among the most influential. An experiment by Guardian Australia shows how quickly a breaking news event can steer a brand-new account into a funnel of conservative Christian and anti-immigration content, illustrating how strongly these systems determine what users are exposed to.
Setting Up the Experiment
In April, Guardian Australia set up a new TikTok account on a fresh smartphone linked to a newly created email address. The account, registered as a generic 24-year-old male named John Doe, was never used to like or comment on anything; the goal was to observe how TikTok’s algorithm would curate content in the absence of explicit user interactions.
At first, the account received typical content, such as local Melbourne videos and iPhone tips. However, this changed dramatically following a major event.
The Wakeley Church Stabbing and Its Impact
On April 15, a stabbing attack occurred at the Assyrian Christ the Good Shepherd Church in Wakeley, Sydney, targeting Bishop Mar Mari Emmanuel. This event triggered a significant shift in the TikTok content served to the experimental account.
Rather than footage of the attack itself, TikTok began serving videos of Emmanuel’s conservative Christian sermons. The algorithm picked up on the account’s exposure to this content, and similar sermon videos soon surged through the feed.
From Sermons to Extreme Content
Three months into the experiment, the TikTok feed was dominated by conservative Christian content, alongside videos supporting figures such as Pauline Hanson and Donald Trump and material promoting anti-immigration and anti-LGBTQ views. The account even encountered extreme content suggesting violence against drag queens.
This pattern was consistent with findings from similar experiments on Instagram and Facebook, but TikTok’s algorithm proved more responsive to engagement signals, particularly how long each video was watched. The platform kept pushing related content unless the user explicitly steered away from it.
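To make this dynamic concrete, here is a minimal sketch in Python of how a watch-time-weighted recommender can turn passive viewing into a narrowing feed. Everything in it, the topics, the weights, the scoring rule, is an illustrative assumption; TikTok’s actual ranking system is not public.

```python
from dataclasses import dataclass

@dataclass
class Video:
    topic: str

# Hypothetical per-topic interest scores, illustrative only.
interest: dict[str, float] = {}

def record_watch(video: Video, watched_fraction: float) -> None:
    """Update inferred interest in a topic from how much of a video was watched."""
    prior = interest.get(video.topic, 0.0)
    # Exponential moving average: recent watch behaviour carries the most weight.
    interest[video.topic] = 0.8 * prior + 0.2 * watched_fraction

def score(video: Video) -> float:
    """Rank candidates by inferred interest; unseen topics get a small exploration score."""
    return interest.get(video.topic, 0.1)

# Watching a handful of sermon clips to completion, while skipping most
# of an iPhone-tips video, is enough to reorder the whole feed.
for _ in range(5):
    record_watch(Video("sermon"), watched_fraction=1.0)
record_watch(Video("iphone_tips"), watched_fraction=0.2)

candidates = [Video("sermon"), Video("iphone_tips"), Video("melbourne_local")]
feed = sorted(candidates, key=score, reverse=True)
print([v.topic for v in feed])  # ['sermon', 'melbourne_local', 'iphone_tips']
```

Note that no likes or comments are involved: in this toy model, watch time alone pushes one topic to the top, which mirrors what the experimental account experienced.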
TikTok’s Explanation and Expert Opinions
A TikTok spokesperson commented, “The more someone searches or engages with any type of content on TikTok, the more they will see. However, you can refresh your feed or select ‘not interested’ on specific videos at any time.”
Dr. Jing Zeng, an Assistant Professor of Computational Communication Science at the University of Zurich, explained that TikTok’s “For You” algorithm can be unpredictable. Early engagement with content can significantly shape the feed. “If the first pro-Trump video catches your attention, the algorithm might show you more similar content,” she noted.
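Zeng’s point about early engagement can be illustrated with a toy feedback-loop simulation, again an assumed model rather than TikTok’s real algorithm: a feed that reinforces whichever topic gets watched, where the only difference between two simulated accounts is a single fully watched political clip at the start.

```python
import random

def simulate(early_political_watch: bool, steps: int = 300, seed: int = 0) -> float:
    """Toy rich-get-richer feed: a topic that gets shown and watched is reinforced."""
    rng = random.Random(seed)
    interest = {"political": 1.0, "other": 1.0}  # uniform prior over two topics
    if early_political_watch:
        interest["political"] += 0.5  # one fully watched clip, nothing more
    shown_political = 0
    for _ in range(steps):
        p = interest["political"] / sum(interest.values())
        if rng.random() < p:              # feed shows a political video
            shown_political += 1
            interest["political"] += 0.5  # watching it reinforces the topic
        else:
            interest["other"] += 0.5
    return shown_political / steps

# Average over many simulated accounts to smooth out randomness.
runs = 500
neutral = sum(simulate(False, seed=s) for s in range(runs)) / runs
boosted = sum(simulate(True, seed=s) for s in range(runs)) / runs
print(f"no early political watch:  {neutral:.0%} of feed political")
print(f"one early political watch: {boosted:.0%} of feed political")
```

In this model the averages settle near 50% versus 60%: a single early watch permanently shifts the expected makeup of the feed, which is exactly the kind of path dependence Zeng describes.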
Comparing TikTok and Meta Platforms
Jordan McSwiney, a senior research fellow at the University of Canberra’s Centre for Deliberative Democracy and Global Governance, compared TikTok’s recommendation system with those of Meta’s platforms, Facebook and Instagram. TikTok’s algorithm is designed to keep users engaged by continuously recommending new videos, a strategy Meta is beginning to adopt in its Reels feature.
McSwiney highlighted that these platforms are driven primarily by profit. “They’re not operating with any kind of social license. Their focus is solely on keeping users clicking and scrolling to maximize advertising revenue,” he said. “Their goal isn’t to foster meaningful debate or a healthy public sphere.”
Need for Transparency and Accountability
McSwiney advocates for greater transparency from tech companies about their algorithms, which are often seen as “black boxes” with limited visibility for researchers. He believes that platforms should not be excused for the societal impact of their content. “We shouldn’t let these multibillion-dollar companies off the hook. They have a social responsibility to avoid promoting harmful content,” he argued.