Digital Omnibus: EU Commission wants to wreck core GDPR principles

19 November 2025

This is a first reaction to a developing story. For any changes and/or updates to our statement, please re-check this page.

Despite heavy criticism from civil society and large parts of the EU Parliament, the EU Commission has now published its proposal for the “Digital Omnibus”. Contrary to the Commission's official press release, these changes are not “maintaining the highest level of personal data protection”, but massively lower protections for Europeans. While offering basically no real benefit to average European small and medium businesses, the proposed changes are a gift to US big tech, as they open up many new loopholes for their legal departments to exploit. Schrems: “This is the biggest attack on Europeans’ digital rights in years. When the Commission states that it ‘maintains the highest standards’, that is clearly incorrect. It proposes to undermine these standards.”

[Illustration: a wrecking ball destroying a house labelled “GDPR”]

Largest cut to privacy rights in years. Industry lobby groups have successfully used European fears of global economic pressure to call for massive cuts to Europeans’ digital rights. The abrupt changes proposed today may undermine more than 40 years of a clear European stance against commercial surveillance by private actors, as also enshrined in Article 8 of the European Charter of Fundamental Rights and widely supported by the European public.

Max Schrems: "The Digital Omnibus would mainly benefit big tech, while failing to provide any tangible benefits to average EU companies. This proposed reform is a sign of panic around shaping Europe’s digital future, not a sign of leadership. What we really need is a strategic, well-designed long-term plan to move Europe ahead.”

Little Political Backing for Changes. The European Commission has pulled this GDPR reform out of a hat despite the fact that most EU Member States had explicitly asked not to reopen the GDPR. In addition, leaked texts last week generated strong opposition from the centre and left groups of the European Parliament (S&D, Renew and the Greens), who explicitly called upon the Commission to stop these massive cuts to the GDPR. Furthermore, 127 civil society organisations (including noyb) heavily criticised the Commission’s unexpected foray and the leaked text.

Nevertheless, today, under the lead of Commission President Ursula von der Leyen, Vice-President Henna Virkkunen and Justice Commissioner Michael McGrath, the Commission decided to go ahead with major cuts to the GDPR. There is talk about massive political pressure within the Commission to cut laws – without proper process or analysis.

Max Schrems: "There is limited political backing for these cuts by the public, by Member States and by the European Parliament. It seems that the European Commission is simply trying to steamroll everyone else via a 'fast track' procedure, which looks more like a panic reaction than well-considered, evidence-based lawmaking."

Background: German or US influence? One of the driving forces behind the reform for which “paper proof” exists is Germany. Some voices also point to recent reporting by Politico that Virkkunen's message to US businesses in direct meetings was that the EU will become more “business-friendly”. There are also reports of increasing pressure from the Trump administration for the EU to cut back on protections in order to avoid tariffs.

While it is unclear where the pressure exactly comes from, it is clear that the European Commission has surprisingly and secretly stepped far beyond the initial plan for the "Digital Omnibus", which should not have included changes to the GDPR.

EU Commission "moves fast and breaks things". Instead of sticking to the original plan of a “Digital Fitness Check” in 2026 to cut administrative burdens, the European Commission seems to have followed Silicon Valley's motto to "move fast and break things" to push through reforms of core rules in a “fast track” procedure. The lack of any impact assessment or evidence gathering throws overboard long-established principles of minimal standards for EU lawmaking and follows a “Trumpian” style of erratic change. The consequence is very bad drafting and law that is not fit for purpose.

Max Schrems: “These changes happened without proper procedures and are not based on evidence but rather on fear and industry claims. 'Move fast and break things' is not a motto that works to push through legislation affecting not only the lives of 450 million people, but ultimately also the proper functioning of our societies and democracies."

"AI tunnel vision". The proposed reform of the GDPR seems to primarily aim at removing any obstacles that could limit the use of personal data, such as social media data, for AI. However, many of these changes would have massive consequences for society in areas other than AI, such as online advertising.

Max Schrems: "Artificial intelligence may be one of the most impactful and dangerous technologies for our democracy and society. Nevertheless, the narrative of an 'AI race' has led politicians to throw out of the window the very protections that should have kept all our data from going into a big opaque algorithm.”

No benefits for European SMEs – but opening the flood gates for the “big guys”. Despite frequent promises to mainly alleviate the burden on small European businesses, the proposed changes are anything but a simplification. Most articles become more complex, unclear and illogical. Rather than working to reduce the need for paperwork (which is the main problem for European businesses), the Commission is introducing legal loopholes that only large corporations and big law firms will be able to exploit.

Max Schrems: “While the Commission constantly argues that this reform would be good for small companies, there is very little for them in these changes. The changes to well-established law will only increase market concentration, spark more legal uncertainty, generate new lawsuits and require more expensive legal advice. The only real beneficiaries here are big tech and law firms.”

Death by 1000 Cuts. According to the Commission’s public consultation in October, the data protection related aspects of the proposal should have mainly focused on addressing “cookie banner” fatigue. However, today, the Commission tabled in-depth cuts to the GDPR. Many of these cuts seem to violate, or at least conflict with, the right to data protection in Article 8 of the EU Charter of Fundamental Rights.


Here is a first overview of the main problems:

(1) A new GDPR loophole via "pseudonyms" or "IDs". The Commission proposes to significantly narrow the definition of "personal data" – which would result in the GDPR not applying to many companies in various sectors. For example, sectors that currently operate via "pseudonyms" or random ID numbers, such as data brokers or the advertising industry, would no longer be (fully) covered. This would be done by adding a "subjective approach" to the text of the GDPR.

Instead of having an objective definition of personal data (e.g. data that is linked to a directly or indirectly identifiable person), a subjective definition would mean that if a specific company claims that it cannot (yet) or does not (currently) aim to identify a person, the GDPR ceases to apply. Such a case-by-case decision is inherently more complex and anything but a “simplification”. It also means that data may be “personal” or not depending on a company’s internal thinking, or on the circumstances at a given point in time. This can also make cooperation between companies more complex, as some would fall under the GDPR and others would not.

Further, such a “subjective” definition makes it impossible for users or authorities to know whether the GDPR applies in a given case. In practice, this can make the GDPR hardly enforceable, due to endless debates and disagreements about a company's true intentions and plans.

Max Schrems: “It is like a gun law that only applies to guns when the owner confirms he is able to handle a gun, and intends to shoot someone. It is obvious how absurd such subjective definitions are.”
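Why a pseudonymous ID still singles out one person can be shown in a few lines of code. The following is a minimal, hypothetical Python sketch (the salt value and email addresses are invented for illustration): an ID derived from an email address contains no name, yet it stays stable across records, and anyone holding the salt plus a list of candidate identifiers can re-identify the individual behind it.

```python
import hashlib

def pseudonymize(email: str, salt: str) -> str:
    """Derive a stable pseudonymous ID from a direct identifier."""
    return hashlib.sha256((salt + email).encode()).hexdigest()[:16]

# Hypothetical values, for illustration only.
salt = "internal-secret"
id_a = pseudonymize("alice@example.com", salt)
id_b = pseudonymize("alice@example.com", salt)

# The same input always yields the same ID, so all records tagged with
# this ID remain linkable to one individual, even without a name.
assert id_a == id_b

# Anyone holding the salt and a list of candidate emails can reverse
# the pseudonymization by brute force:
candidates = ["bob@example.com", "alice@example.com"]
match = [e for e in candidates if pseudonymize(e, salt) == id_a]
print(match)  # -> ['alice@example.com']
```

This is why the current, objective definition treats such IDs as personal data: identifiability depends on what linkage is reasonably possible, not on what a company says it intends to do.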

(2) Pulling personal data from your device? So far, Article 5(3) of the ePrivacy Directive has protected users against remote access to data stored on "terminal equipment", such as PCs or smartphones. This is based on the right to protection of communications under Article 7 of the Charter of Fundamental Rights of the EU and has ensured that companies cannot "remotely search" devices.

The Commission now adds "white-listed" processing operations for access to terminal equipment, which would include "aggregated statistics" and "security purposes". While the general direction of the changes is understandable, the wording is extremely permissive and would also allow excessive "searches" of user devices for even minor security purposes.

(3) AI Training by Meta or Google with EU Personal Data? When Meta or LinkedIn started using social media data for AI training, it was widely unpopular. In a recent study, for example, only 7% of Germans said that they want Meta to use their personal data to train AI. Nevertheless, the Commission now wants to allow the use of highly personal data (like the content of 15+ years of a social media profile) for AI training by big tech.

Max Schrems: "There is absolutely no public support for Meta or Google to include Europeans' personal data in their algorithms. For years we were told that people should not worry, because our personal data would be used to ‘connect’ us, or at worst be used for targeting some advertisements. Now all your data is shoved into the algorithms of Meta, Google or Amazon. This makes it easier for AI systems to know even the most intimate details - and consequently manipulate people. This primarily benefits the trillion-dollar US industry that builds base models from our personal details."

The European Commission foresees that users can opt out, but companies and users usually don't know whose data is in a training dataset. Even if they did, users would have to opt out thousands of times per year, whenever another company trains an algorithm on their data.

Max Schrems: “The opt-out approach does not work in practice. Companies don’t have the contact details of users, and users don’t know who is training on their data. This opt-out approach is the Commission’s attempt to place a fig leaf over this obviously unlawful processing activity.”

The Commission wants to privilege not only the training of AI systems, but also the “operation” of such systems. This would amount to a “wildcard” under which otherwise illegal processing would become legal, simply because it is done using AI.

Max Schrems: “Usually, riskier technologies have to meet a higher standard. The Commission proposal now opens the floodgates once AI is used – while traditional data processing would still fall under the current laws. That’s insane.”

(4) User Rights Cut to Almost Zero - upon German Demand? Based on a national debate about GDPR access rights being used to prove e.g. non-payment in employment disputes, the German government demanded a massive limitation of these rights – framing such use as “abuse”, even though the GDPR already has an “abuse” clause. The Commission has followed that German demand and proposes to limit the use of the data subject access right to "data protection purposes" only.

Conversely, this means that if an employee uses an access request in a labour dispute over unpaid hours – for example, to obtain a record of the hours they have worked – the employer could reject it as "abusive". The same would be true for journalists or researchers. On a broad reading, this could go even further: if a person asks for access to their data in order to subsequently delete false credit-ranking data and get a cheaper loan at the bank, such rights may not be exercised purely for a "data protection purpose" but out of economic interest.

This limitation is a clear violation of CJEU case law and Article 8(2) of the Charter. The right to informational self-determination is explicitly meant to level the information gap between users and the companies that hold the information, as more and more information is hidden on company servers (e.g. copies of time sheets). The CJEU has held multiple times that these rights can be exercised for any purpose - including litigation or to generate evidence.

Max Schrems: “This change is a clear violation of the Charter and the CJEU case law. It will be used by controllers throughout Europe to further undermine users' rights. The reality is that we don't have widespread abuse of GDPR rights by citizens, but that we have widespread non-compliance by companies. To cut back user rights even further shows how detached the Commission is from the daily experience of users.”

noyb's work is only possible thanks to our 5,287 Supporting Members. Would you like to support us too?
