TikTok could be fined millions of pounds for violating children’s privacy after a ruling by the EU’s data protection regulator. The European Data Protection Board announced that it had reached a final decision on the Chinese-owned video-sharing platform’s handling of children’s data.
After objections were raised to an earlier draft ruling from the regulator in Ireland, where the company’s European headquarters are located, the board said it had “concluded a resolution to the dispute.” The fine is expected to be issued within the next four weeks.
The EU ruling follows an investigation opened in 2021 by the Data Protection Commission in Ireland into TikTok’s compliance with the EU’s General Data Protection Regulation (GDPR) and its handling of data belonging to users aged 13 to 17.
On Friday, TikTok announced new features for its European users, designed to comply with EU rules due to take effect on 25 August.
TikTok’s Ongoing Adaptation to the EU’s Digital Services Act
The EU’s Digital Services Act (DSA) imposes substantial changes on how online platforms operate. It requires large online platforms such as TikTok, Google and Facebook to monitor and curb illegal content on their services, prohibits certain advertising practices that could mislead or harm users, and obliges the platforms to share relevant data with regulators.
TikTok recently volunteered for a “stress test” of its DSA readiness at its Dublin headquarters. The EU’s internal market commissioner, Thierry Breton, welcomed the step but said the company still had work to do to achieve full compliance.
TikTok has rolled out a range of features intended to meet the DSA’s requirements, with a focus on greater transparency, stronger user controls and more responsible content sharing. The company says it aims to meet and exceed the act’s stipulations for its European users.
As 25 August approaches, attention turns to how TikTok and other major online platforms will adapt their services to the new regulatory requirements.
TikTok’s New Compliance Measures and Past Penalties
Breton told CNN: “TikTok is dedicating significant resources to compliance. Now it’s time to accelerate to be fully compliant.” On Friday the company said it had introduced new measures to comply with the DSA: EU users can more easily report illegal content they come across, they can turn off personalized video recommendations, and users aged 13 to 17 will no longer be shown targeted advertisements.
The company stated, “We will continue to not only meet our regulatory obligations, but also strive to set new standards through innovative solutions.”
Earlier this year, TikTok was fined £12.7 million by the UK’s data watchdog for unlawfully processing the data of around 1.4 million children under 13 who were using the platform without parental consent.
The UK’s information commissioner said TikTok had done little to verify users’ ages or remove underage users, despite internal warnings that children under 13 were using the app in breach of its terms and conditions.
A 2022 survey by the UK communications regulator Ofcom found that more than 60% of eight- to 17-year-olds who use social media had set up TikTok accounts in their own name.