noyb sends Meta ‘cease and desist’ letter over AI training. European class action as potential next step

May 13, 2025

Meta AI not compliant with GDPR. To use personal data, companies must rely on one of six legal bases under Article 6(1) GDPR. One of them is opt-in consent, which means that users can give a “freely given, specific, informed and unambiguous” “Yes” to the processing of their data – or say no, or simply stay silent. However, companies can also claim a so-called ‘legitimate interest’ to process personal data. A thief, for example, won’t consent to a CCTV camera filming him – but a bank may still have a ‘legitimate interest’ in operating CCTV cameras. Instead of allowing users to choose between “Yes” and “No”, Meta claims that it has such a ‘legitimate interest’ to simply take their data for AI training. This leaves users with only the right to object (opt-out) under Article 21 GDPR. But Meta is even limiting this (statutory) right by saying that it only applies if people opt out before the training has started.

Meta will likely also be unable to comply with other GDPR rights (such as the right to be forgotten, the right to have incorrect data rectified, or the right of users to access their data in an AI system). Furthermore, Meta provides AI models (such as Llama) as open-source software for anyone to download and use. This means that Meta can hardly recall or update a model once it is published.

Max Schrems: “The European Court of Justice has already held that Meta cannot claim a ‘legitimate interest’ in targeting users with advertising. How should it have a ‘legitimate interest’ to suck up all data for AI training? While the ‘legitimate interest’ assessment is always a multi-factor test, all factors seem to point in the wrong direction for Meta. Meta simply says that its interest in making money is more important than the rights of its users.”

Simple solution: Ask users for consent! While Meta tries to paint the picture that compliance with the GDPR would make AI training in the EU impossible (see their absurd press release for Germany), the legal solution under the GDPR is simple: Meta would just have to ask users for opt-in consent (instead of an opt-out objection) to use their personal data for AI training. Given Meta’s massive user numbers, having 10% or more agree to such training would already clearly be sufficient to learn EU languages and the like. If Meta were clear about the conditions for training (e.g. anonymisation and the like), many users would likely provide their data. It is, however, totally absurd to argue that Meta needs the personal data of everyone who has used Facebook or Instagram in the past 20 years to train AI. Most other AI providers (like OpenAI or France’s Mistral) have zero access to social media data and still outcompete Meta’s AI systems.

Max Schrems: “This fight is essentially about whether to ask people for consent or simply take their data without it. Meta starts a huge fight just to have an opt-out system instead of an opt-in system. Instead, they rely on an alleged ‘legitimate interest’ to just take the data and run with it. This is neither legal nor necessary. Meta’s absurd claim that stealing everyone’s personal data is necessary for AI training is laughable. Other AI providers do not use social network data – and generate even better models than Meta.”

Injunctive relief & potential class actions. As a Qualified Entity under the new EU Collective Redress Directive, noyb can bring an injunction to stop an unlawful practice by a company like Meta. Such cases can be brought in various jurisdictions, not only at Meta’s headquarters in Ireland. If an injunction is granted by the competent court, Meta would not only have to stop the processing, but also delete any illegally trained AI system. If EU data is “mixed in” with non-EU data, the entire AI model would have to be deleted. Any injunction would also stop the clock on how long people can bring damages claims (the “statute of limitations”). This means that with every day that Meta continues to use European data for AI training, the potential damages claims by users grow. The GDPR allows for non-material damages that are usually in the hundreds or thousands of euros per user.

In addition to a mere injunction, any Qualified Entity can also bring a redress action (similar to a US class action) to recoup such damages for large user groups. noyb or any other Qualified Entity (see the EU’s current list) would have years to file such an action if Meta does not change course. If the non-material damage were only €500 per user, it would amount to around €200 billion for the roughly 400 million monthly active Meta users in Europe.

Max Schrems: “We are currently evaluating our options to file injunctions, but there is also the option for a subsequent class action for non-material damages. If you think about the more than 400 million European Meta users who could all demand damages of just €500 or so, you can do the math. We are very surprised that Meta would take this risk just to avoid asking users for their consent.”

Injunctions in other EU jurisdictions. While the reaction to Meta’s flagrant breach of the GDPR is developing quickly, noyb understands that various groups in the EU are reviewing options for litigation. The German consumer organisations (led by the Verbraucherzentrale in North Rhine-Westphalia, “VZ NRW”) have already made public their intention to take action. It is also to be expected that many individuals will take action against Meta over the use of their data for AI training.

Max Schrems: “We would expect that the use of social media data for AI training would trigger a lot of litigation throughout the EU. Even just managing this litigation will be a huge task for Meta.”

DPAs do not seem to approve. The national Data Protection Authorities (DPAs) should – theoretically – take enforcement action against non-compliance with the GDPR. However, in a climate where the EU is pushing hard for less regulation and more “innovation” at any cost, we now observe that DPAs in some EU countries largely function as a messenger for Meta: many DPAs have merely “informed” people that they should urgently opt out of Meta’s AI training. This puts the responsibility on users instead of Meta. Meta also publicly claims to have “engaged” with EU regulators on using social media data for AI training. However, as far as noyb is informed, DPAs have largely stayed silent on the legality of AI training without consent. It therefore seems that Meta simply moved ahead anyway – taking another huge legal risk in the EU and trampling over users’ rights.

Max Schrems: “As far as we have heard, Meta has ‘engaged’ with the authorities, but this hasn’t led to any ‘green light’. It seems that Meta is simply moving ahead and ignoring EU Data Protection Authorities. The authorities in turn seem to just stay silent, telling users to protect themselves. We are witnessing how data protection authorities lose more and more relevance and NGOs have to take action before the courts.”