Introduction
The European Commission and the European Data Protection Board (EDPB) have recently released Joint Guidelines on the interplay between the General Data Protection Regulation (GDPR) and the Digital Markets Act (DMA) for public consultation. Although referred to as “joint” guidelines, the document is in fact heavily driven by the DMA’s structure and obligations. This blog reframes the guidelines through a GDPR‑focused perspective, highlighting how the DMA heightens gatekeepers’ data protection obligations.
My observations unfold in three parts. Part 1 outlines Gatekeepers’ limited discretion in choosing GDPR’s lawful grounds and the enhanced consent mechanism in the GDPR-DMA interplay. Part 2 then touches on vague boundaries of DMA’s limitations, illustrated by a surprising exception introduced by the joint guidelines. Part 3 situates these regulatory developments within the broader context of the Commission’s proposed digital omnibus package, suggesting that EU data protection law is moving toward a more nuanced, market‑power‑sensitive approach.
Overall, the interplay between GDPR and DMA reveals not only technical coordination but also signals a strategic shift of EU data protection law. The obligations imposed on gatekeepers increasingly differ from those applicable to ordinary controllers, raising important questions about proportionality and the future architecture of EU data protection law.
Part 1 Gatekeepers’ limited discretion in choosing GDPR’s lawful grounds
EU data protection law features a legitimising regime: a controller who intends to process personal data must demonstrate a lawful ground for doing so. Article 6(1) of the GDPR exhaustively sets out six lawful grounds: valid consent, contractual necessity, legal obligation, vital interests, public interest, and legitimate interests. It has long been said that there is no hierarchy among the six lawful grounds (see most recently, e.g., the AG Opinion in C‑394/23, para 28; EDPB Guidelines 1/2024 and Opinion 28/2024).
This position must now be approached with caution in light of the DMA. The DMA applies to core platform services (CPSs) provided by gatekeepers – specific undertakings designated by the European Commission pursuant to Article 3 of the DMA. The DMA lays down special rules on personal data protection for gatekeepers to address their data-driven market advantages. Notably, among other personal-data-related measures, gatekeepers’ ability to choose a lawful ground under the GDPR for certain processing activities is limited.
1.1 Article 5(2) of DMA’s Choice of Consent
Codifying previous regulatory efforts, Article 5(2) of the DMA imposes a general prohibition on gatekeepers regarding a set of personal data processing activities: processing certain personal data for advertising, combining and cross-using personal data from different sources, and signing in end-users for combination purposes.
The subparagraphs of Article 5(2) open up the possibility of exemptions, however. Gatekeepers may justify these processing activities within the scope of Article 5(2) of DMA, provided that:
1) end users have been presented with a ‘specific choice’ and have consented to these otherwise-prohibited processing activities, or
2) one of the remaining three lawful grounds under the GDPR – legal obligation, vital interests, or public interest – is appropriate.
Recital 36 of the DMA excludes the possibility of relying on Article 6(1), points (b) and (f) of the GDPR (contractual necessity and legitimate interests) to carry out the prohibited processing activities, which has been confirmed by the joint guidelines (para 18).
Additionally, the joint guidelines clarify that reliance on Article 6(1), points (d) or (e) of the GDPR (vital interests or public interest) would be possible only in very limited circumstances (para 80), given the economic and commercial nature of gatekeepers’ processing activities. Likewise, Article 6(1), point (c) of the GDPR (compliance with a legal obligation), which has been scrutinised in line with the proportionality principle (e.g., Case C-184/20, paras 71-116), cannot be invoked to justify processing activities that go beyond what is required by law.
Consequently, **valid consent becomes the practical centrepiece** if gatekeepers decide to carry out the forms of processing regulated by Article 5(2) of the DMA.
Further, the gatekeepers’ limited choice is compounded by the stricter conditions for obtaining valid consent under Article 5(2) of the DMA. To make this case, we start with the concept of equivalent alternatives and its link to consent. Under the GDPR, providing an equivalent alternative – a version that does not involve unnecessary personal data processing – is a highly recommended practice to demonstrate that genuine choices have been presented to data subjects (EDPB’s Guidelines 05/2020, para 37); therefore, data subjects’ consent is likely to be freely given, and valid if other cumulative conditions are also met. The linkage between equivalent alternatives, genuine choice for data subjects, and freely given consent is therefore traceable under GDPR. Nevertheless, providing equivalent alternatives is neither necessary nor sufficient to fulfil the GDPR’s consent requirements for controllers.
By contrast, for gatekeepers, the joint guidelines clarify that **providing a less personalised but equivalent alternative is a requirement** to demonstrate that the end user has been presented with a ‘specific choice’ – a precondition for obtaining valid consent under Article 5(2) of the DMA (Joint Guidelines, paras 15, 23-28).
It follows that Article 5(2) of the DMA not only funnels gatekeepers toward relying on valid consent under the GDPR for processing activities that fall within its scope but also expresses its own logic: no equivalent alternative, no valid consent.
1.2 The Regulators’ Choice of Legal Obligations
Article 6 of the DMA is another substantial provision that imposes special obligations on designated gatekeepers. Among other things, Article 6(10) of the DMA confers the right to data access on business users, and gatekeepers bear the obligation to grant access to such data. When data access involves personal data, the parties concerned (the gatekeeper, business users, and authorised third parties) are bound by the GDPR and must therefore rely on appropriate lawful grounds to enable such personal data sharing.
Although Article 6(10) of DMA explicitly requires ‘the end users opt in to such sharing by giving their consent,’ the joint guidelines make a crucial distinction: only business users’ access to personal data is conditional on obtaining consent (para 163), while gatekeepers are directed to rely on Article 6(1), point (c) of the GDPR (legal obligations) to execute the data sharing (para 159).
The joint guidelines frame this distinction as a way of rendering the consent mechanism more effective. By designating ‘legal obligations’ as the basis on which gatekeepers share personal data, the regulators highlight gatekeepers’ obligations under Article 13(5) of the DMA (paras 161-163). This anti-circumvention provision takes into account the market-power imbalance between gatekeepers and business users – both separate controllers within the meaning of the GDPR – by requiring the former to take the necessary steps to enable the latter to obtain consent or otherwise comply with data protection law.
Therefore, the regulators spotlight gatekeepers’ reinforced obligations under the DMA to promote systemic compliance with data protection rules that turn on the validity of consent. Because the GDPR itself tends to assess the lawfulness of each controller’s processing activities independently (e.g., Google Spain, para 86; Fashion ID, para 96), its consent mechanism cannot easily accommodate such institutional arrangements on its own. Conceivably, Article 6(1), point (c) of the GDPR is selected to bridge gatekeepers’ special responsibility to the data protection framework.
Part 2 Vague Boundaries of DMA’s Limitations
While Article 5(2) of DMA, as mentioned, precludes gatekeepers from relying on contractual necessity and legitimate interests to justify prohibited processing activities that fall within its scope, the joint guidelines introduce a surprising exception under Article 5(2)(c) of DMA.
The joint guidelines first clarify that Article 5(2)(c) of the DMA does not prohibit the cross-use of personal data between a CPS and the gatekeeper’s other services that are ‘provided together with or in support of’ the CPS (para 67). Building on this, the joint guidelines boldly recognise gatekeepers’ online advertising as a service ‘supporting’ their CPS (para 68), and furthermore suggest that gatekeepers may rely on their legitimate interests to cross-use personal data from their CPSs in their advertising service (para 75).
Section 2.1 questions the joint guidelines’ categorisation of online advertising as a supporting service, and Section 2.2 evaluates how this reasoning may pave the way for gatekeepers to make analogous arguments for their AI training.
2.1 The ‘Supporting Service’ Exception: Conceptual Tensions in the GDPR-DMA Interplay
The recognition of online advertising as a supporting service under Article 5(2), point (c) of DMA, in my opinion, generates tensions between the two regulatory frameworks.
First, under EU data protection law, advertising has consistently been treated as a standalone processing purpose, not a supporting one. Recital 47 of the GDPR enumerates direct marketing as a distinct processing purpose; the CJEU has considered online personalised advertising to be a form of direct marketing and has examined its lawfulness separately from other processing purposes (e.g., C‑252/21, paras 115-118). Moreover, the sending of communications for direct marketing purposes is a distinct subject matter comprehensively regulated under Article 13 of the e-Privacy Directive, a lex specialis of the GDPR (C‑654/23, paras 64-69). It is difficult to fit the notion of online advertising as a service supporting CPSs into this established data protection narrative.
Second, the understanding of online advertising as a supporting service also seems to be at odds with Article 2(2), point (j) of the DMA, which defines online advertising services as an independent category of CPS. The joint guidelines attempt to reconcile this by distinguishing between advertising forming part of the online advertising CPS and advertising displayed on another CPS (see footnote 71). Yet this conceptual distinction is thin, and considerable effort is needed to make sense of it.
This ill-founded exception exemplifies the vague boundaries of the DMA’s limitations, demonstrating how specific interpretative avenues threaten to undermine its strictness.
2.2 AI as A Supporting Service?
The joint guidelines, which recognise gatekeepers’ legitimate interests in the cross-use of on-platform personal data in their advertising services, may also open the door to an analogous argument regarding the processing of on-platform personal data for AI training purposes. In practice, some gatekeepers, e.g., Meta, have already decided to use personal data from their CPSs to train AI models, relying on their legitimate interests to do so.
In this context, the EDPB has reaffirmed that the GDPR does not impose a hierarchy among the six lawful grounds, and that legitimate interest may be appropriate for training and deploying AI models involving personal data, provided that the processing is strictly necessary and proportionate (Opinion 28/2024, second and third questions).
Once again, the no-hierarchy narrative should be approached with caution when it comes to gatekeepers’ AI-related processing activities. The threshold question under Article 5(2) of the DMA is whether the AI-empowered service is provided as a separate service or as a supporting function of a CPS. The former requires valid consent from end users, while the latter may arguably rely on legitimate interests as a basis. These determinations are highly contextual and necessitate granular legal and factual analysis; it seems to me that framing all training activities as ‘service improvement’ would be legally insufficient.
Admittedly, gatekeepers currently face fewer explicit restrictions on processing personal data for training AI models, by contrast with their processing for advertising purposes, which has been exhaustively scrutinised under various frameworks, e.g., GDPR, e-Privacy, DMA, and the Digital Service Act (DSA). However, fewer prohibitions do not mean greater freedom to process more personal data. The GDPR’s core principles, along with the AI Act’s risk-based requirements, continue to apply. Looking forward, parallel lawful grounds, particularly consent and legitimate interests, are likely to coexist to structure responsible AI use.
Part 3 The Broader Picture: Differentiated data protection obligations
My preceding observations on the interplay between GDPR and DMA highlight the additional data protection obligations imposed on gatekeepers under DMA, which are arguably more stringent than those of ordinary controllers.
In parallel with the stricter data protection obligations for gatekeepers, the Commission is proposing a digital omnibus package to simplify its digital legislation, particularly in data protection law. The stated objective is to reduce administrative burdens and compliance costs for companies, particularly small and medium-sized enterprises (SMEs).
Considered together, the two ostensibly opposite developments—the tightening of rules for gatekeepers and the easing of obligations for SMEs—reveal a broader picture: the EU data protection framework is being reformed to establish differentiated data protection obligations tied to market power.
A reasonable differentiation of data protection obligations would not necessarily contradict the fundamental nature of the right to the protection of personal data. While Article 8 of the EU Charter entails a high level of data protection, this right is not absolute and is subject to limitation in line with the proportionality principle under Article 52(1) of the EU Charter. Such differentiation may be justified by a delicate proportionality test weighing the protection of fundamental rights against economic freedoms, both of which are valued in the Union legal order.
Yet this direction is not without pitfalls. The proposal for the Digital Omnibus Regulation reveals more substantial changes to core GDPR elements than initially anticipated, raising concerns about potential deregulation. Without clear calibration, simplification risks undermining the very foundations of EU data protection law.
In sum, the interplay between the GDPR and the DMA reveals more than a technical coordination exercise; it may signal the early contours of a more stratified data protection framework. The EU stands at a pivotal moment to update its data protection framework amid a rapidly transforming digital market. The most debated issue in the coming years will likely be whether this reform strengthens or undermines the EU’s data protection regime.
Aolan Li is a PhD candidate in Law at Queen Mary University of London specialising in EU data privacy law.