Elon Musk, CEO of X (formerly Twitter), announced this week that the platform will weaken its blocking feature. Under the change, blocked users will still be able to view public posts from accounts that have blocked them, though they will be prevented from engaging directly through replies or messages. The announcement has alarmed users who rely on blocking to manage harassment, offensive content, and other unwanted interactions. For many, blocking isn’t just a tool of convenience; it’s essential for mental health and safety in an increasingly toxic online environment.
The Importance of Blocking on X
For many users, blocking is what makes the platform usable. It allows individuals to shield themselves from harassment, trolls, and offensive content that can make their experience unbearable. As a longtime user of social networks, going back to the early days of Usenet and pre-Internet services like CompuServe and GEnie, I’ve watched platforms evolve. None of them, however, has sunk to the lows X has reached, especially since Musk’s takeover.
The block feature on X serves as a protective measure, allowing users to avoid harmful or distressing content and interactions. Whether it’s trolls, stalkers, or people who simply won’t respect boundaries, the ability to block has become essential for users trying to carve out a manageable experience on the platform. Musk’s latest decision could undermine this, making X even more difficult to navigate, particularly for women, marginalized groups, and others who are frequent targets of online abuse.
Musk argues that weakening the block function will increase transparency, suggesting that users should be able to see all public posts, even if they’ve been blocked by their authors. According to Musk, this aligns with his vision of a more open and “transparent” platform. Many critics counter that this so-called transparency benefits no one except trolls and bad actors, who will be able to keep viewing the posts of people they’ve harassed or abused.
The idea that transparency equates to removing users’ ability to control what they see and who interacts with them is problematic. True transparency in a social media context should empower users with the ability to curate their experience, not expose them to more harmful interactions.
Why X Needs Stronger Blocking, Not Weaker
The real problem with weakening the blocking feature is that it ignores the needs of vulnerable users. For many, blocking isn’t about avoiding differing opinions—it’s about protecting themselves from targeted abuse, cyberstalking, or harassment. Many women, in particular, use the block function to stop unwanted advances or threats. Without a strong blocking feature, those users are left more exposed.
As rapper Zuby pointed out, “There are some REALLY bad actors on social media.” For individuals who are frequently targeted, the ability to block abusers is crucial. If those users can continue to see your posts despite being blocked, it opens the door to further harm.
The data speaks volumes. Daily active users of X have fallen in the U.S. since Musk’s takeover, with a significant 18% decline as of February 2024. Moreover, a YouGov survey from August found that 42% of X’s daily users have a negative view of the platform. Removing or weakening features like blocking could accelerate this exodus.
Beyond the blocking controversy, Musk’s recent actions have led to growing concerns about the way the platform is manipulated. According to Kate Conger and Ryan Mac in their book *Character Limit: How Elon Musk Destroyed Twitter*, Musk has pushed for algorithmic changes to ensure that his tweets gain more visibility than others’. This all came to a head after the 2023 Super Bowl when Musk reportedly became upset that President Biden’s tweet about the game outperformed his own. The result? X’s algorithm was tweaked, and many users began seeing Musk’s tweets at the top of their feeds—even if they didn’t follow him.
As some users have noticed, you can’t block Musk or certain “super-users” on X, despite Musk’s insistence on the importance of transparency. For regular users, this only deepens frustration, as their feeds are flooded with content they never asked to see.
How to Protect Yourself on X
If X goes through with weakening its block feature, users will need to consider other strategies to protect themselves. Mute functions and private accounts offer some options, but they’re limited compared to blocking. For many users, leaving the platform altogether is becoming a more appealing option.
If you’re tired of the changes on X, there are alternatives. Bluesky, for example, recently opened its doors to the public and functions much like the original Twitter. With a maximum post length of 300 characters, Bluesky offers a familiar experience with added features like a “What’s Hot” feed for trending posts. Unlike X’s, Bluesky’s block feature is robust: once you block someone, they can’t see or interact with your content at all.
Similarly, Mastodon is an open-source platform that’s gaining popularity. Its decentralized structure allows users to choose from various communities within the “Fediverse.” While interacting across its many servers isn’t as straightforward as it was on Twitter, Mastodon offers comprehensive blocking options that let users maintain their privacy and safety.
Lastly, there’s Meta’s Threads, which integrates with Instagram. Although its interface hews less closely to Twitter’s, it offers simple blocking features that extend across both platforms: blocking someone on Threads also blocks them on Instagram, making it a useful tool for managing multiple platforms at once.
If Musk continues with his plan to defang the blocking feature on X, the platform risks alienating an even larger portion of its user base. At a time when X’s daily active users are already in decline, decisions like this will only accelerate the trend of users abandoning the platform for alternatives like Bluesky, Mastodon, and Threads. More than ever, X needs stronger blocking capabilities, not weaker ones, to protect its users and provide a safe, enjoyable experience. Without these features, the platform will become increasingly unwelcoming, driving away users who once found solace in the ability to control their online environment.