| Digital Services Act | |
|---|---|
| Legislature | European Parliament |
| Territorial extent | European Union |
| Introduced by | European Commission |
| Related legislation | Digital Markets Act, e-Commerce Directive |
| Status | In force |
The Digital Services Act is a landmark regulation in European Union law that modernizes the legal framework for digital services, aiming to create a safer online environment and establish clear responsibilities for platforms. Formally proposed by the European Commission and adopted by the European Parliament and the Council of the European Union, it works in tandem with the Digital Markets Act to comprehensively regulate the digital single market. The legislation seeks to protect fundamental rights online, foster innovation, and ensure fair competition within the internal market.
The primary objective is to create a transparent and accountable online ecosystem, updating rules established decades ago by the e-Commerce Directive. A core aim is to better protect European Union citizens and their fundamental rights, including freedom of expression, while combating illegal content and disinformation spread across digital platforms. It establishes a harmonized set of rules across the internal market to reduce fragmentation and legal uncertainty for businesses operating in multiple member states. The regulation also aims to empower users and civil society, providing new mechanisms for redress and oversight of very large online platforms that act as systemic gatekeepers of public discourse.
Central provisions impose tiered obligations based on the size and role of intermediaries, with the strictest rules targeting very large online platforms and very large online search engines with more than 45 million average monthly active users in the European Union. These entities must conduct annual risk assessments regarding systemic risks such as the dissemination of illegal content or negative effects on fundamental rights. They are required to implement mitigation measures, which could include changes to their recommendation algorithms or content moderation practices, and undergo independent audits to ensure compliance. All online platforms must provide users with clear mechanisms to flag illegal content and challenge content moderation decisions, and must publish transparency reports on their moderation actions.
The regulation applies broadly to a wide range of digital service providers that operate within the European Union, regardless of their place of establishment. This includes intermediary services like internet access providers, cloud computing services, and content delivery networks. Its rules specifically cover hosting services, such as cloud storage and online marketplaces, and most notably, online platforms that bring together sellers and consumers, like social media networks and app stores. The strictest obligations are reserved for a subset designated as very large online platforms and very large online search engines, a list that includes companies like Meta, Google, and Amazon.
Enforcement follows a novel, tiered governance model in which the European Commission directly supervises and enforces the rules for very large online platforms and very large online search engines. For other entities, primary enforcement authority rests with individual member states, which must designate Digital Services Coordinators as competent national regulators. These national bodies, together with the European Commission, form the new European Board for Digital Services to ensure consistent application across the internal market. Penalties for non-compliance are severe, with fines of up to 6% of a company's global annual turnover, and the European Commission retains the power to impose periodic penalty payments and even request temporary suspensions of service for repeated infringements.
The legislation is expected to have a profound global impact, acting as a de facto standard for digital regulation and influencing policy debates in jurisdictions like the United States and the United Kingdom. It has been praised by consumer protection groups and lawmakers for empowering users and holding powerful technology companies accountable for their societal impact. However, it has faced criticism from some industry representatives and free speech advocates, including organizations like the Electronic Frontier Foundation, who argue that its compliance burdens may stifle smaller innovators and that its content moderation rules could lead to the over-removal of legal speech. Its practical effects on issues like disinformation, hate speech, and market competition will be closely watched as enforcement by the European Commission and national authorities like the Bundesnetzagentur begins.
Category:European Union law Category:Internet governance Category:2022 in law