| Digital Services Act | |
|---|---|
| Name | Digital Services Act |
| Enacted by | European Parliament and Council of the European Union |
| Citation | Regulation (EU) 2022/2065 |
| Signed | 19 October 2022 |
| Effective | 17 February 2024 (fully applicable) |
| Subject | Regulation of digital services and online platforms |
| Status | In force |
The Digital Services Act (DSA) is a European Union regulation establishing harmonised rules for online intermediary services and platforms, aiming to address illegal content, systemic risks, and transparency obligations across the European Union. It updates the liability framework of the eCommerce Directive and interacts with the General Data Protection Regulation, shaping obligations for multinational companies such as Meta Platforms, Alphabet Inc., Amazon, and TikTok. The instrument emerged from debates involving the European Commission, the European Parliament, and member state regulatory authorities.
The legislative process grew out of post-Lisbon Treaty ambitions to modernise single-market rules for digital services, responding to controversies such as the Cambridge Analytica scandal and disputes over YouTube content moderation. The European Commission published a proposal in December 2020 that entered trilogue negotiations with the Council of the European Union and the European Parliament, shaped by stakeholder input from BusinessEurope, DigitalEurope, civil society actors such as European Digital Rights (EDRi), and regulatory bodies including the European Data Protection Board. Key milestones included committee votes in the European Parliament's Committee on the Internal Market and Consumer Protection and debates in national parliaments such as the Bundestag and the French Assemblée nationale. The final text built on precedents in the eCommerce Directive and reflected issues litigated before the Court of Justice of the European Union.
The regulation distinguishes categories of services, such as hosting services used by Twitter (now X) and marketplaces operated by eBay. It defines criteria for very large online platforms (VLOPs) and very large online search engines (VLOSEs), designated when a service reaches roughly 45 million average monthly active users in the EU, a threshold relevant to companies including Microsoft Corporation and ByteDance. Obligations include risk assessments, transparency reporting, content moderation audits, and the appointment of points of contact; these requirements affect entities across the sector's trade associations. The act interfaces with sectoral regimes such as the Audiovisual Media Services Directive and with consumer protections championed by organisations such as the European Consumer Organisation (BEUC).
Intermediaries, including cloud providers like Amazon Web Services, payment processors such as Visa Inc., and social platforms like Snap Inc., are subject to differentiated duties. Smaller intermediaries retain the benefit of safe-harbour provisions traceable to eCommerce Directive jurisprudence before the Court of Justice of the European Union. By contrast, VLOPs must mitigate systemic risks associated with misinformation campaigns observed during events like the 2016 United States presidential election and public-health crises such as the COVID-19 pandemic. The act mandates cooperation with designated supervisory authorities, coordinated by the European Commission and national competent authorities such as Germany's Bundesnetzagentur and Italy's communications regulator AGCOM.
Provisions require platforms to implement notice-and-action mechanisms for the removal of illegal content, echoing debates reflected in The New York Times coverage of removal requests and in litigation before the European Court of Human Rights. Obligations encompass transparent terms, user redress, and independent audits akin to oversight models proposed by groups like the Center for Democracy & Technology. Due diligence duties cover risk mitigation for disinformation, child sexual abuse material, and goods trafficking, relating to investigations by agencies such as Europol and civil-society campaigns led by Save the Children. The regulation emphasises non-discriminatory moderation and safeguards for fundamental rights in line with standards under the Charter of Fundamental Rights of the European Union.
Enforcement is carried out by national supervisory authorities empowered to issue orders, fines, and interim measures; the European Commission retains coordination and designation powers for VLOPs and VLOSEs. Penalties can reach up to 6% of a provider's global annual turnover, paralleling sanction models in General Data Protection Regulation enforcement by authorities such as Ireland's Data Protection Commission. Compliance mechanisms include required risk assessment reports, external audits by certified assessors, and an obligation to appoint compliance officers, with disputes resolved in national courts and potentially referred to the Court of Justice of the European Union for preliminary rulings. Cross-border cooperation is facilitated through the European Board for Digital Services, a network of regulators comparable to bodies such as the European Banking Authority.
The act has reshaped corporate compliance programs at firms like Apple Inc. and prompted strategic adjustments by platforms such as Reddit. Supporters, including the European Consumer Organisation (BEUC) and certain member states, argue it enhances user safety and market fairness, while critics from the Computer & Communications Industry Association and scholars at the Oxford Internet Institute warn of overreach and risks to innovation. Legal challenges have been mounted concerning scope, proportionality, and interaction with free-speech protections litigated before the European Court of Human Rights and the Court of Justice of the European Union. Trade partners including the United States and India have engaged in dialogues about extraterritorial effects, and scholarship from Harvard Law School and the London School of Economics continues to assess the act's long-term impact on platform governance and competition.
Category:European Union law Category:Internet law Category:Regulation of online platforms