The European Commission has issued its first formal sanctions under the EU's Digital Markets Act, TikTok faces preliminary findings under the Digital Services Act, and national regulators are processing thousands of complaints. Here is what happened — and what it means.
These decisions are not isolated enforcement actions. They are the first returns on a regulatory framework that took years to build and was always designed to produce exactly this kind of outcome: binding decisions, significant financial penalties, and precedents that will shape how digital platforms operate in the EU for the foreseeable future.
The April 2025 Apple and Meta decisions mark a genuine inflection point. The DMA had been in force since March 2024, and some observers questioned whether the Commission would move quickly enough to give the regulation credibility before political resistance softened its enforcement posture. The decisions answer that question, at least for now.
The Apple case turns on the anti-steering obligation in Article 5(4) DMA. Gatekeepers operating app distribution platforms may not restrict developers from directing users to pricing alternatives outside the platform. Apple's rules had the practical effect of preventing developers from informing users that the same digital purchase was available more cheaply elsewhere. The Commission treated this as a structural distortion — one that systematically advantages the gatekeeper's own commercial ecosystem.
The Meta decision engages Article 5(2) DMA, which prohibits gatekeepers from combining personal data across their core platform services without adequate legal basis. Meta's "pay or consent" framework was found to fail this standard. The reasoning is significant: the Commission held that a binary commercial choice does not constitute freely given consent.
The TikTok advertising transparency finding, though still preliminary, is instructive for a different reason. It concerns a specific technical obligation — the requirement under Article 39 DSA to maintain a publicly accessible advertising repository with information on ad sponsors, targeting parameters, and display periods. The Commission's preliminary view is that TikTok's repository lacks sufficient detail for researchers and civil society to detect coordinated disinformation and fraud — precisely the public oversight function Article 39 was designed to enable.
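For engineering teams, the Article 39 obligation ultimately translates into a data model. The sketch below illustrates what a single repository entry might capture; the field and class names are hypothetical, chosen for this illustration rather than taken from any implementing standard, though the categories of information (ad content, sponsor, payer, display period, main targeting parameters, aggregate reach) track what Article 39 requires.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class AdRepositoryEntry:
    """Illustrative record for a public ad repository in the spirit of Art. 39 DSA.

    Field names are hypothetical; the regulation requires, among other things,
    the ad content, the sponsor, the payer if different, the display period,
    the main targeting parameters, and aggregate reach figures.
    """
    ad_id: str
    ad_content: str                      # the advertisement as presented to users
    sponsor: str                         # person on whose behalf the ad was shown
    payer: str                           # who paid for the ad, if different
    display_start: date                  # first day the ad was presented
    display_end: date                    # last day the ad was presented
    targeting_parameters: dict[str, str] = field(default_factory=dict)
    reach_total: int = 0                 # aggregate number of recipients

    def display_period_days(self) -> int:
        """Length of the display period, inclusive of both endpoints."""
        return (self.display_end - self.display_start).days + 1
```

The Commission's preliminary objection to TikTok's repository is precisely that entries of this kind lack the detail researchers need, so completeness of fields matters as much as their existence.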
The DSA establishes a tiered compliance framework for all providers of intermediary services with a connection to the EU market. The tier a company falls into determines the density of its obligations — but every tier carries substantive duties.
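The tiering logic can be sketched as a simple decision cascade. The function below is a deliberate simplification, assuming the 45-million-user threshold that Article 33 DSA sets for very large online platforms and search engines; the service-type labels are illustrative, and the actual statutory tests are more nuanced than any short function can be.

```python
def dsa_tier(service_type: str, eu_monthly_active_users: int) -> str:
    """Rough DSA tier classification. Illustrative sketch, not legal advice.

    service_type uses hypothetical labels for this sketch: "mere_conduit",
    "caching", "hosting", "online_platform", or "search_engine".
    The 45 million user threshold for VLOP/VLOSE designation is set in
    Art. 33 DSA; designation itself is a Commission decision, not automatic.
    """
    VLOP_THRESHOLD = 45_000_000
    if service_type in ("online_platform", "search_engine"):
        if eu_monthly_active_users >= VLOP_THRESHOLD:
            # Heaviest tier: risk assessments, independent audits, ad repository
            return "VLOP/VLOSE"
        # Platform tier: notice-and-action, transparency reporting, Art. 25 duties
        return "online platform"
    if service_type == "hosting":
        # Hosting tier: notice mechanisms, statements of reasons
        return "hosting service"
    # Baseline tier: duties that apply to all intermediary services
    return "intermediary service"
```

The point the cascade makes visually is the one in the text: falling below the VLOP threshold changes the density of obligations, not their existence.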
One structural element of the DSA now generating tangible enforcement activity is the Trusted Flagger system under Article 22. Formally recognised entities — organisations with demonstrated expertise and independence — submit reports of allegedly unlawful content that platforms must process with priority. Germany's Bundesnetzagentur has now recognised four such entities. A platform that has not assessed whether its content moderation infrastructure distinguishes between ordinary notices and Trusted Flagger submissions is already out of step with its obligations.
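What "priority" means in a moderation pipeline is an engineering question as much as a legal one. One minimal way to implement it is a priority queue in which Trusted Flagger notices always surface before ordinary notices, with first-in-first-out order preserved within each class. The structure and names below are illustrative assumptions, not a design prescribed by Article 22.

```python
import heapq
import itertools

class ModerationQueue:
    """Notice queue that surfaces Trusted Flagger submissions first.

    Illustrative sketch of Art. 22 DSA priority handling: within each
    priority class, notices are reviewed in the order they arrived.
    """
    TRUSTED_FLAGGER = 0   # lower number = higher priority
    ORDINARY = 1

    def __init__(self) -> None:
        self._heap: list[tuple[int, int, str]] = []
        self._counter = itertools.count()  # tie-breaker preserves FIFO order

    def submit(self, notice_id: str, from_trusted_flagger: bool) -> None:
        priority = self.TRUSTED_FLAGGER if from_trusted_flagger else self.ORDINARY
        heapq.heappush(self._heap, (priority, next(self._counter), notice_id))

    def next_notice(self) -> str:
        """Pop the next notice to review; Trusted Flagger notices come first."""
        _, _, notice_id = heapq.heappop(self._heap)
        return notice_id
```

A real deployment would also need the decision deadlines, statements of reasons, and audit trails the DSA attaches to notice handling; the queue only illustrates the ordering obligation.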
The liability framework in Articles 4 through 6 DSA is also working its way into German courts. The OLG Bamberg became the first German appellate court to engage directly with the Article 25 dark patterns prohibition in a ticket platform case. The OLG Frankfurt addressed platform liability for deepfake content. The DSA is no longer only a regulatory instrument — it is being integrated into civil litigation.
The DMA operates through a designation mechanism. The Commission identifies specific companies as "gatekeepers" for specific core platform services — covering social networks, app stores, operating systems, search engines, advertising services, and intermediation services. Designation brings a non-negotiable set of behavioral obligations and prohibitions. Some are per se rules: the anti-steering prohibition, the data combination prohibition, the self-preferencing ban. Others require specification through Commission proceedings before their precise technical scope becomes clear.
The Commission has shown that it applies gatekeeper thresholds with genuine interpretive flexibility: confirming ByteDance's designation despite multi-homing arguments; declining to designate X after finding insufficient ecosystem lock-in; and revoking Facebook Marketplace's designation when Meta demonstrated the threshold was no longer met — the first revocation in DMA history.
Any analysis of current DSA and DMA enforcement that ignores the transatlantic political context would be incomplete. The current US administration has characterised European digital regulation as economically discriminatory — noting, accurately, that almost all designated gatekeepers are US-headquartered companies. Tariff negotiations have been accompanied by public statements and reportedly private pressure regarding specific enforcement proceedings. Reports emerged that at least one Commission investigation involving a US platform was paused in the context of trade discussions.
The April 2025 decisions against Apple and Meta nonetheless went ahead with substantial fines. The Commission has consistently maintained that the DMA's application follows from market position, not national origin. That position is legally correct — but the political environment adds genuine uncertainty that companies should factor into their planning, without treating it as a reason to defer engagement with legal obligations. Political negotiations do not pause enforcement timelines.
Beyond financial penalties, the Commission can impose interim measures, require structural remedies, and restrict market access in cases of persistent non-compliance. In the DMA context, systematic infringement may result in a prohibition on acquisitions in the relevant sector. Apple has already announced it will contest the €500 million fine — initiating what is likely to be prolonged litigation whose outcomes will shape DMA interpretation for all gatekeeper-designated companies.
The enforcement activity to date has concentrated on the largest and most visible platforms. This has encouraged a false sense of distance among smaller and mid-sized technology companies operating in the EU market. That distance is largely illusory.
The DSA's obligations apply to any provider of intermediary services to EU users, regardless of company size or establishment. The Article 13 requirement to designate a legal representative within the EU applies to all non-EU providers — not only VLOPs. National DSCs are operational, receiving and processing complaints, and building enforcement capacity rapidly. And courts — as the OLG Bamberg case illustrates — are applying DSA standards in civil proceedings independently of any regulatory action.
A structured compliance assessment for companies at an early stage of EU digital law engagement typically covers:

- whether the company's services qualify as intermediary services under the DSA, and which tier of obligations applies;
- whether an EU legal representative under Article 13 DSA is required and has been designated;
- whether content moderation processes distinguish Trusted Flagger submissions from ordinary notices and handle them with priority;
- whether transparency obligations, including advertising disclosures where applicable, are being met; and
- whether any service approaches the DMA's gatekeeper thresholds.
The enforcement record of spring 2025 confirms what the regulatory text always indicated: the DSA and DMA are designed to produce binding outcomes, not advisory guidance. Companies that engage with their obligations now, before a complaint or proceeding forces the question, are in a materially better position than those that wait.
Kanzlei Theo Funk — EU digital law counsel for US technology companies
This firm advises US-based technology companies on DSA and DMA compliance: scope and tier assessments, Article 13 legal representative services under the DSA, regulatory correspondence support, and ongoing compliance counsel as the enforcement landscape develops. If your company has EU users and has not yet assessed its position under the DSA, now is the right time.
Get in touch →