Last Thursday, Thierry Breton, the EU Commissioner for the Internal Market, sent a strongly worded letter to TikTok CEO Shou Zi Chew. In it, Breton raised serious concerns about TikTok's role in the spread of disinformation and illegal content related to the Israel-Hamas conflict, and urged the company to respond swiftly and effectively to curb such material. The issue is particularly pressing given how many young users rely on the platform as a primary source of news.
That same week, Breton sent similar letters to other prominent tech leaders: one to Elon Musk, the owner of X, and another to Mark Zuckerberg, CEO of Meta, raising parallel concerns about content moderation on their platforms.
EU’s Digital Services Act Holds Tech Industry Accountable for Content Moderation
Breton's outreach to these tech leaders reflects growing scrutiny of the industry's role in shaping public discourse, particularly around major geopolitical events like the Israel-Hamas conflict. His call for accountability and responsible content moderation feeds into a broader debate about the power and responsibility of tech companies to safeguard the integrity of information and news on their platforms.
In the letter, Breton wrote, “First, given that your platform is extensively used by children and teenagers, you have a particular obligation to protect them from violent content depicting hostage taking and other graphic videos which are reportedly widely circulating on your platform, without appropriate safeguards.”
Under the European Union's newly enforced Digital Services Act, TikTok is required to actively monitor and remove illegal content, such as terrorist material and hate speech, and to explain how it plans to do so.
Failure to comply with these regulations can result in fines of up to 6% of a company's global annual revenue.
Breton urged TikTok to step up its efforts to remove harmful content and to work closely with law enforcement agencies, and said he expects a response from the company within 24 hours.
Addressing Concerns on TikTok and X
Breton stressed that TikTok, as a platform, has a special duty to protect young users from violent content, terrorist propaganda, dangerous challenges, and other potentially harmful content.
A TikTok spokesperson acknowledged receipt of Breton's letter and said the company is committed to providing a comprehensive response. TikTok also shared resources with CNBC detailing how it is upholding its commitments under the Digital Services Act.
Breton has also called for transparency and accountability over crisis measures on X, the platform owned by Elon Musk, focusing on the spread of violent and terrorist content as well as manipulated images. In response, X CEO Linda Yaccarino outlined the steps the company has taken to identify and promptly remove Hamas-affiliated accounts, particularly following the recent Hamas attack on Israel.
Enhanced Monitoring and Swift Response Efforts
In a similar vein, Commissioner Breton has urged Mark Zuckerberg, CEO of Meta, to prioritize the effective removal of misinformation from Meta’s various platforms. This emphasis on misinformation is particularly relevant during critical periods like the Israel-Hamas conflict and upcoming elections.
Meta, the parent company of Instagram, Facebook, and Threads, told CNBC through a spokesperson that it is working diligently to safeguard its platforms. The company described a specialized operations center established to monitor and respond quickly to the evolving situation following the recent Hamas attacks on Israel, and noted that it has brought in experts fluent in Hebrew and Arabic to strengthen its monitoring and response capabilities.
The spokesperson said, “After the terrorist attacks by Hamas on Israel on Saturday, we quickly established a special operations center staffed with experts, including fluent Hebrew and Arabic speakers, to closely monitor and respond to this rapidly evolving situation.”