
Consumer concerns over EU DMA proposals for Google

EU DMA proposals could force Google to share sensitive search data with third parties in ways that may weaken privacy, cybersecurity, and consumer choice. As debate grows, the real question is whether competition can be increased without putting European users at risk.


When it comes to the Digital Markets Act (DMA), Europeans are most used to hearing either that it is an overall extremely bad approach (usually coming from across the Atlantic) or that it is very advanced, only insufficiently enforced, lacking in human resources and, of course, “not on the table” for EU-US negotiations (the European approach).

Discussions about the nuances - such as the enforcement measures and their positive and negative impacts on European markets and consumers - normally receive little attention or engagement.

Yet, the discussion around the recent proposals for obligations set out by the European Commission on Google Search to hand over sensitive data to “any third party” is starting to gain traction. At the heart of the issue is the mismatch between two things - the seemingly noble goal of preventing gatekeepers from withholding their data from other companies (including European ones and Google’s direct competitors from any country in the world) - and the practical inability to enforce such a goal securely, without undermining European consumers’ safety.

What’s at stake

This conversation follows the public consultation opened in January 2026 on the Commission’s proposals for how Google Search data should best comply with Article 6(11) of the Digital Markets Act. The consultation is open for opinion submissions until May 1, 2026. Article 6(11) essentially requires a gatekeeper to provide third parties with access to data such as ranking, queries, and more.

The proposed measures (“DMA.100209 - Preliminary Measures, April 16, 2026”) would require Google to share search data with “any third party”, including:

- query data: what users type into Google Search, changes made to searches, and related metadata;
- click data: how users interact with results pages, including clicks, timing, order, duration, and more;
- content data: all links, images, and other content shown on search results pages;
- ranking data: where links appear in results, including their position, prominence, and visibility on a user’s device.

Best case scenario vs. most probable and pessimistic scenario for consumers 

The optimistic scenario the European Commission likely assumed when drafting Article 6(11) of the DMA and proposing measures for Google Search is that Google Search would share data with third parties, those parties would then develop or improve their own products, alternative search engines and similar services would gain traction, consumer prices would fall, and consumer choice would broaden. Finally, the world would acknowledge that the European ex-ante digital market regulation approach works and would follow in Brussels’ footsteps.

The more probable and pessimistic scenario is that the scope of data Google Search is obliged to hand over to third parties is far too broad, the proposed safeguards against abuse of personal data are far too lax, and the result is a major privacy, security, and consumer-satisfaction problem for European users - the very same users who may have trusted Google as their provider for decades, but would not necessarily trust other, unspecified platforms that may take advantage of these obligations. Additionally, this would mainly benefit non-European, direct competitors of Google Search, while doing little for the “European alternatives”.

Cybersecurity expert Lukasz Olejnik puts it simply, calling the proposal “one of the largest mandated transfers of sensitive user data in Europe in [a] decade” and noting that users “do not expect [their data] to be shared, especially with random entities and in bulk.”

What’s wrong with the current approach

Cybersecurity experts who criticise this approach focus on what they understand best - specifically, the safety measures that the European Commission proposes to ensure that personal and sensitive data is not handed to bad actors. They argue that the current safety measures are insufficient and that even anonymised datasets may be re-identified when combined with other data.
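The re-identification risk the experts point to can be illustrated with a toy linkage attack. The records, column names, and people below are entirely invented for this sketch; the point is only that a pseudonymised query log can be joined against auxiliary data on quasi-identifiers (here, city and device) to recover identities when a combination is unique.

```python
# Hypothetical "anonymised" query log: pseudonyms instead of names,
# but quasi-identifiers (city, device) are left in the records.
anonymised_queries = [
    {"pseudonym": "u_91", "city": "Ghent", "device": "Pixel 8",
     "query": "rare disease clinic"},
    {"pseudonym": "u_17", "city": "Lyon", "device": "iPhone 15",
     "query": "divorce lawyer near me"},
]

# Hypothetical auxiliary dataset an attacker might already hold
# (e.g. scraped profiles or a previously leaked database).
auxiliary = [
    {"name": "A. Janssens", "city": "Ghent", "device": "Pixel 8"},
    {"name": "B. Morel", "city": "Lyon", "device": "iPhone 15"},
]

def reidentify(queries, aux):
    """Link pseudonyms to names when the quasi-identifier
    combination (city, device) matches exactly one aux record."""
    matches = {}
    for q in queries:
        candidates = [a["name"] for a in aux
                      if a["city"] == q["city"] and a["device"] == q["device"]]
        if len(candidates) == 1:  # unique combination -> re-identified
            matches[q["pseudonym"]] = candidates[0]
    return matches

print(reidentify(anonymised_queries, auxiliary))
# In this toy data every (city, device) pair is unique, so both
# pseudonyms - and the sensitive queries attached to them - are
# linked back to named individuals.
```

Real attacks are statistical rather than exact joins, but the mechanism is the same: the more columns a mandated dataset contains, the more likely each row is unique and therefore linkable.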

Yet the proposal is questionable from a policy perspective too. Normally, when it comes to data protection, the EU places heavy emphasis on the European Data Protection Board’s guidelines and recommendations - this was one of the arguments used against broader personal data definitions in the Digital Omnibus debate. But when it comes to the interplay between the Digital Markets Act and the General Data Protection Regulation, the Commission seems to be in a rush to finalise obligations for Google before the EDPB’s anonymisation guidelines are published (they are expected this quarter).

Finally, the argument on consumer choice is also heavily disregarded. European consumers who trust a specific platform to handle their data may not necessarily trust unspecified third-party companies to have access to that same data. The DMA’s new obligations would effectively eliminate that choice, while also creating new security challenges that would be hard to manage.