Written by Polona Car.
Enforcement of the Digital Services Act at national level is still very limited owing to delayed implementation. The European Commission has therefore begun infringement procedures against several Member States. At European level, the Commission has started formal proceedings against five very large platforms and has found, on a preliminary basis, that the platform X does not comply with the act. The other investigations are still ongoing.
Digital Services Act: A short introduction
The Digital Services Act (DSA) is designed to prevent illegal and harmful activities online, while protecting users’ fundamental rights and safety and nurturing trust in the digital environment. The DSA does this through a ‘layered responsibilities’ approach, which means that online intermediary services’ obligations correspond to their role, size and impact. Very large online platforms (VLOPs) and very large online search engines (VLOSEs) that pose particular risks must abide by the strictest rules. These are platforms that have, on average, at least 45 million monthly active users in the EU. For non-compliance, the DSA imposes significant fines of up to 6 % of the intermediary’s annual worldwide turnover. Users who have suffered damage or losses due to online intermediaries’ infringements may also seek compensation.
Enforcement of the DSA is shared between national authorities and the Commission. National authorities supervise and enforce smaller platforms’ compliance, while the Commission’s primary responsibility is supervising VLOPs and VLOSEs. The DSA also established the European Board for Digital Services (EBDS), an independent advisory group composed of national digital services coordinators (DSCs) and chaired by the Commission. The EBDS plays a crucial role in applying the DSA and became operational in February 2024. In addition, the European Centre for Algorithmic Transparency, launched in April 2023, provides technical assistance in enforcing the DSA. The Commission has also set up the DSA whistleblower tool, to help it monitor compliance by VLOPs and VLOSEs.
Enforcement at national level
National enforcement of the DSA is still very limited, owing to the delayed designation and/or empowerment of several DSCs. DSCs are independent authorities responsible for supervising and enforcing intermediary services’ compliance with the DSA in their Member State. They have the power to request access to data, conduct inspections, and issue fines to intermediary service providers within their jurisdiction. Each Member State was required to designate a national DSC by February 2024, when all providers of online intermediary services became subject to the general due diligence obligations imposed by the DSA.
Some DSCs have already taken enforcement action. The Irish DSC, for example, has opened a review of several platforms’ compliance with the DSA provisions on contact points for recipients of the service and on the notice and action mechanism for reporting illegal content. The Dutch national authority conducted a survey that found that many companies do not comply with the DSA; it has committed to opening formal investigations as soon as it is fully empowered to do so.
In response to the late implementation, the Commission sent letters of formal notice to six Member States in April 2024, and to a further six in July 2024. While some Member States have since taken the steps necessary to enable enforcement of the DSA, others have only partially complied. The Commission has already sent reasoned opinions to Czechia, Cyprus and Portugal for not empowering their designated DSCs. It could issue further reasoned opinions: in Belgium and Poland, the formal designation of a DSC is still pending, while in a few other countries the designated DSCs have not yet been empowered.
Enforcement at EU level
At EU level, the Commission is responsible for supervising and enforcing the additional obligations imposed on VLOPs and VLOSEs. These include enhanced due diligence obligations: addressing systemic risks, such as the dissemination of illegal content; conducting annual risk assessments; and giving authorities and vetted researchers access to data and algorithms. In April 2023, the Commission adopted the first designation decisions under the DSA, designating 17 VLOPs and 2 VLOSEs. Designated platforms must comply with the DSA within 4 months of their designation and are obliged to deliver risk assessment reports. The number of designated services has since grown to 23 VLOPs and 2 VLOSEs. Temu, designated as a VLOP in May 2024 after the European Consumer Organisation (BEUC) submitted a complaint against it, is the most recent addition.
Requests for information
The Commission can send formal requests for information about the measures that VLOPs and VLOSEs have put in place to comply with the DSA. These investigatory acts do not prejudge any further steps by the Commission, but they can result in fines for incorrect, misleading or incomplete responses. Most requests seek additional information on access to data – i.e. on the measures VLOPs and VLOSEs have taken to comply with the obligation to give eligible researchers access to data. The Commission is also enquiring about risk mitigation measures to counter illegal and harmful content and risks linked to generative AI, recommender systems, and the protection of minors. Some requests concern dark patterns, advertising, notice and action mechanisms, and content moderation. After receiving such a request for information, which was prompted by a complaint from civil society, LinkedIn announced steps to comply with the DSA provisions on targeted advertising based on sensitive data.
Formal proceedings
The Commission has opened formal proceedings against five platforms so far. It can start such proceedings if it is not satisfied with the replies to its requests for information. To date, it has issued preliminary findings in part of the investigation against the platform X, while all the other investigations are at an earlier stage. These proceedings could serve as an example for national authorities when they begin enforcement at national level.
In December 2023, the Commission opened formal proceedings against X, investigating its compliance with several DSA provisions. In July 2024, the Commission adopted preliminary findings identifying a potential breach of the DSA rules on dark patterns, as the platform’s system of verified accounts may deceive users. The Commission’s preliminary view was also that X breaches the DSA rules on advertising transparency: it does not provide a searchable and reliable advertisement repository, making it difficult to monitor and investigate the potential risks of online ads. Finally, the Commission identified, on a preliminary basis, a breach of the rules on researchers’ access to data, as X prohibits eligible researchers from independently accessing its public data, and its process for granting eligible researchers access to its application programming interface appears to dissuade them. X may now exercise its right of defence and reply to the preliminary findings in writing; in parallel, the Commission will consult the EBDS. The Commission can only adopt a non-compliance decision if the preliminary findings are confirmed.
In February 2024, the Commission opened formal proceedings against TikTok, investigating its compliance with the provisions on risk management of addictive design and harmful content, protection of minors, advertising transparency, and access to data. Further proceedings against TikTok followed in April 2024, to investigate whether the launch of the TikTok Lite rewards programme – which allows users to earn points by performing certain tasks on the platform – breached the DSA because it took place without a prior due diligence risk assessment. These proceedings were closed in August 2024, after TikTok committed to withdrawing the feature from its applications offered in the EU.
Formal proceedings against AliExpress started in March 2024, investigating compliance with the provisions on risk management of illegal content, notice and action, internal complaint handling, traders’ traceability, advertising transparency, recommender systems, and access to data. In April and May 2024, the Commission opened proceedings against Meta (Instagram and Facebook). These investigations are assessing Meta’s compliance as regards transparency of content moderation; a notice and action mechanism to flag illegal content; dark patterns and the protection of minors; risk management in relation to the integrity of elections, the dissemination of harmful content and addictive design; and researchers’ access to data. In October 2024, the Commission started proceedings against Temu, to investigate what the platform is doing to limit the sale of non-compliant products, as well as its compliance with the DSA provisions on addictive design, recommender systems, and researchers’ access to data.
Read this ‘At a glance’ note on ‘Enforcing the Digital Services Act: State of play’ in the Think Tank pages of the European Parliament.