Privacy advocate Max Schrems has thrown a significant roadblock in the path of Meta’s European AI ambitions. His organization, noyb (None Of Your Business), sent a cease and desist letter to the tech giant Wednesday, challenging Meta’s plans to train its AI using data from European users.
Noyb Threatens Meta with Legal Action Over AI Data Training
The letter delivers an ultimatum: either make AI data training explicitly opt-in rather than opt-out, or face potential legal action that could include injunctions or even a massive class-action lawsuit.
“We are very surprised that Meta would take this risk just to avoid asking users for their consent,” Schrems said in a statement. “Even just managing this litigation will be a huge task for Meta.”
This isn’t noyb’s first confrontation with Meta over AI training data. Just last month, Meta announced plans to resume AI training in the EU following a pause agreed upon in June 2024. That pause came after noyb filed 11 complaints with the Irish Data Protection Commission.
At the heart of the dispute is Meta’s reliance on the “legitimate interest” exception within the EU’s General Data Protection Regulation (GDPR). This provision allows companies to process personal data without explicit consent under certain circumstances. Noyb argues this exception doesn’t apply to AI training.
Schrems points to a previous victory where Meta abandoned the “legitimate interest” justification for targeted advertising following years of legal battles.
“The European Court of Justice has already held that Meta cannot claim a ‘legitimate interest’ in targeting users with advertising,” Schrems noted. “How should it have a ‘legitimate interest’ to suck up all data for AI training?”
Noyb Challenges Meta’s Data Demands for AI
The privacy group dismisses Meta’s claim that it needs comprehensive access to user data to make its AI culturally aware. Noyb argues that even if just 10 percent of Meta’s 400 million monthly EU users consented to data collection, it “would already clearly be sufficient to learn EU languages and alike.”
Going further, noyb called it “absurd” for Meta to argue it needs every user’s posts and comments from the past two decades for AI training.
The organization took a jab at Meta’s AI capabilities, noting that “most other AI providers like OpenAI or French Mistral have zero access to social media data and still outcompete Meta’s AI systems.”
The financial stakes are enormous. Should noyb pursue a class-action case seeking non-material damages of €500 per EU user, Meta could face liability exceeding €200 billion ($224 billion).
Meta’s AI Data Processing Faces Legal Challenge in Europe
Meta firmly rejects these arguments. A company spokesperson told The Register that their approach complies with European Data Protection Board guidance and characterized noyb’s actions as “part of an attempt by a vocal minority of activist groups to delay AI innovation in the EU.”
The company maintains that the EDPB’s December opinion actually validates its use of the “legitimate interest” basis. Meta also claimed it isn’t the only AI company using this approach for data processing in the EU, asserting its methods are more transparent than competitors’.
Given these contradictory positions, court action seems likely. Noyb is “currently evaluating our options to file injunctions,” Schrems said, adding that other European entities are also considering legal action against Meta’s data collection practices.
This standoff highlights the ongoing tension between technological innovation and data privacy in Europe, where regulators enforce some of the world’s strictest data protection rules. For Meta, the challenge lies in balancing its AI development goals with compliance obligations that may demand more explicit user consent than the company wants to obtain.
As this legal battle unfolds, European Facebook and Instagram users might want to check their privacy settings – while Meta calls its approach transparent, the current default appears to include their data in AI training unless they actively opt out.