Meta, the parent company of Facebook and Instagram, is facing a lawsuit from the attorneys general of 33 U.S. states. The lawsuit alleges that Meta intentionally designed and implemented features on its social media platforms that “purposefully addict children and teens.” The legal action accuses Meta of profiting from the distress of young users by incorporating manipulative features, contributing to a national youth mental health crisis.
The lawsuit, initially filed in October and recently disclosed in a “less-redacted” version by the state of California, contains troubling statistics. In 2021 alone, Meta reportedly received over 402,000 reports of users under the age of 13 on Instagram, yet purportedly acted on fewer than 164,000 of them. The lawsuit claims that Meta actively avoided addressing complaints about underage users, citing internal communications that discussed “coaching” parents to allow their children to remain on the platform.
Exploitative Business Model
The core of the legal challenge revolves around Meta’s alleged business model, which is accused of being centered on maximizing the time young users spend on its platforms. The lawsuit contends that Meta designed and deployed “psychologically manipulative” features to exploit children and teenagers while simultaneously promoting these features as non-manipulative. The company is also accused of publishing misleading reports that downplayed negative experiences among its user base.
Concealing Negative Research
Furthermore, the lawsuit asserts that Meta “continued to conceal and downplay” research findings indicating various negative outcomes associated with social media use, including internal studies that revealed the company’s awareness of serious harms affecting young users. It suggests that Meta was aware of the negative consequences but chose to downplay them for public relations purposes.
Violation of Children’s Online Privacy Protection Act (COPPA)
The legal action also alleges widespread violations of the Children’s Online Privacy Protection Act (COPPA). The lawsuit claims that Meta marketed its platforms to children under 13, knew they were using its services, yet failed to obtain parental consent before collecting and monetizing their personal data. For comparison, Epic Games agreed to pay a $275 million COPPA penalty to the FTC in 2022 over allegations similar to those now leveled against Meta.
Meta’s Response and Parental Supervision Tools
In response to mounting criticism, Meta announced new parental supervision tools in June, initially available in Messenger and later expanded to Facebook, Instagram, and Horizon Worlds in November. Meta claims these tools allow parents to monitor their teen’s activity without reading their messages. However, the lawsuit questions the tools’ effectiveness and criticizes Meta’s design choices, including the “Take a Break” tool introduced on Instagram in 2021, which allegedly falls short of preventing excessive platform use by young users.
Legal Consequences and Seeking Accountability
While the matter has yet to be tried in court, the comprehensive nature of the lawsuit raises serious concerns for Meta. The attorneys general of the involved states are seeking a permanent injunction against Meta’s practices on all its social media platforms. Additionally, they are pursuing per-state civil penalties and legal costs, the cumulative impact of which could be substantial. The lawsuit reflects a growing trend of holding tech companies accountable for their impact on users, particularly vulnerable groups such as children and teenagers. As the legal battle unfolds, Meta faces a challenging road ahead in defending its practices and mitigating potential consequences.