DSA Compliance: Key Obligations for Hosting Providers, Platforms, and Marketplaces Operating in the EU (Part 1)

Key Takeaways

The Digital Services Act (DSA) has been fully applicable since February 17, 2024. Hosting providers, platforms, and marketplaces must comply with strengthened obligations regarding liability, transparency, and content management.

The Digital Services Act (DSA), which has been fully applicable since February 17, 2024, establishes a comprehensive set of obligations related to liability, transparency, and risk management for providers of intermediary services, including hosting providers, online platforms, and marketplaces operating within the EU. Designed to combat the dissemination of illegal content, regulate algorithmic systems, and safeguard users’ fundamental rights, the DSA introduces a tiered regulatory framework based on the nature and size of the service provider.

In this first article, we summarize the DSA’s objectives, the categories of digital services it covers, and the obligations that apply to each category of intermediary service provider. A second article will address the territorial scope and the penalties for non-compliance with the DSA.


1. Objectives of the DSA

Adopted on October 19, 2022, the European Digital Services Act became applicable in several phases between April 25, 2023, and February 17, 2024. It is now fully applicable to all relevant services. (1)

The DSA aims to regulate the provision of intermediary services by establishing harmonized rules across the European Union.

It builds upon the foundations of the E-Commerce Directive of June 8, 2000, updating its core principles to reflect the evolution of the digital landscape and user behavior, in particular the rise of social media, content-sharing platforms, and online marketplaces.

The DSA pursues several key objectives to foster a safer digital environment: combating illegal content and systemic risks, safeguarding fundamental rights, and enhancing the accountability of digital service providers.

    1.1 Combating Illegal Content and Systemic Risks

The primary objective of the DSA is to address the proliferation of illegal content (such as hate speech, counterfeit goods, and disinformation), as well as the systemic risks generated by very large platforms, including information manipulation, the spread of harmful content, and threats to the protection of minors.

To that end, the DSA introduces a graduated set of due diligence obligations based on the nature and size of the service providers involved. The strictest requirements apply to very large online platforms (VLOPs) and very large online search engines (VLOSEs).

    1.2 Safeguarding Fundamental Rights

The second objective of the DSA is to protect fundamental rights, particularly freedom of expression, privacy, and data protection, while ensuring digital safety. The regulation imposes several key obligations on platforms, including:

     - Transparency of content moderation rules;
     - Accessible appeals and redress mechanisms for users;
     - A ban on manipulative interface designs (dark patterns);
     - Enhanced consumer and child protection safeguards.

    1.3 Strengthening Accountability and Transparency of Providers of Intermediary Services

The third objective of the DSA is to clarify the liability exemption rules applicable to providers of intermediary services, without undermining the limited liability regime established under the E-Commerce Directive and transposed into French law by the Law on Confidence in the Digital Economy of June 21, 2004 (LCEN).

Accordingly, providers of intermediary services are not subject to a general obligation to monitor the data they transmit or store, nor are they required to actively seek out illegal activities. However, they must take measures to restrict access to illegal content or to provide information to the authorities upon receipt of an official order.

Their liability may be triggered in cases where they publish or promote content, or if they fail to promptly remove or disable access to illegal content after being informed of its unlawful nature.

Finally, the regulation introduces a structured regulatory governance framework based on three new mechanisms:

     - Digital Services Coordinators designated in each Member State. In France, ARCOM was appointed as the Digital Services Coordinator by the Law on the Security and Regulation of the Digital Space (loi “SREN”) of May 21, 2024;
     - Centralized supervision of very large platforms by the European Commission; and
     - A new European Board for Digital Services to facilitate cooperation among national authorities.


2. Online Services Covered by the DSA


The DSA applies to the provision of intermediary digital services. Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs) are subject to a specific enhanced supervision regime, while micro and small enterprises benefit from targeted exemptions.

    2.1 Providers of Intermediary Services

The DSA applies to the provision of intermediary services, as defined in Article 3(g) of the regulation, whenever the recipients of these services are established or reside within the European Union. The determining factor is therefore the target market, not the location of the service provider.

The categories of intermediary services covered are as follows:

    a. Mere conduit services refer to services that consist of the transmission, over a communications network, of information provided by a recipient of the service (i.e. a user), or of the provision of access to such a network. This category includes internet service providers (such as Orange, Bouygues, SFR, and Free), as well as domain name registrars.

    b. Caching services are services that involve the automatic, intermediate, and temporary storage of information for the sole purpose of making its onward transmission to other recipients more efficient (e.g., services provided by AWS, Akamai, and Cloudflare).

    c. Hosting services refer to services that store information provided by a recipient of the service, at their request. This category includes:
     - Web hosting providers (e.g., OVH, GoDaddy, Gandi, Infomaniak) and cloud computing services (e.g., AWS, Azure, iCloud);
     - Online platforms, including marketplaces (e.g., Amazon, LeBonCoin, eBay, Vinted), video-sharing services (e.g., YouTube, Dailymotion, TikTok), social media (e.g., Facebook, Instagram, LinkedIn), and accommodation and travel platforms (e.g., Airbnb, Booking.com); and
     - Search engines.

Among the providers of intermediary services, platforms and search engines with an average monthly number of active users in the EU equal to or greater than 45 million are designated as Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs). The list of designated VLOPs and VLOSEs is published and updated every six months by the European Commission. (2)

    2.2 Micro and Small Enterprises are Partially Exempt from the DSA

Providers of intermediary services that qualify as micro or small enterprises, according to the criteria set forth in the European Commission Recommendation of May 6, 2003, (3) are exempt from complying with certain obligations under the DSA, including transparency reporting requirements, internal complaint-handling systems, and specific platform-related duties (e.g., content flagging and child protection measures).

These enterprises remain subject to the provisions common to all providers of intermediary services (see below).

This partial exemption from DSA compliance, which applies as long as the service provider meets the size criteria defined by the Commission, is intended to avoid placing disproportionate financial and administrative burdens on smaller economic actors and to support innovation.


3. Obligations Applicable to Different Categories of Intermediary Services

The DSA introduces a tiered system of obligations based on the nature of the service provided and the size of the provider.

A common baseline of obligations applies to all providers of intermediary services. Additional, more specific obligations are imposed on hosting providers, online platforms, B2C marketplaces, Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs).

    3.1 Common Obligations for All Providers of Intermediary Services

All providers of intermediary services are subject to the following requirements (Art. 11 to 15):

     - Designation of contact points, to enable the authorities and the users to communicate directly with the service provider (Art. 11 and 12);
     - Appointment of a legal representative for providers not established in the European Union (Art. 13);
     - Transparent terms and conditions (T&Cs) that are easily accessible and written in clear, simple, intelligible, and unambiguous language. The T&Cs must detail, among other things, content moderation policies, algorithmic decision-making processes, and complaint-handling mechanisms (Art. 14);
     - Publication of an annual transparency report outlining content moderation activities and the number of orders and complaints received (Art. 15).

    3.2 Specific Obligations for Hosting Providers (Including Online Platforms)


Hosting providers are subject to the following additional obligations (Art. 16 to 18):

     - Implementation of a notice-and-action mechanism (also known as “notice and takedown”), enabling any natural or legal person to report content they consider illegal (Art. 16);
     - Obligation to provide justification for content restriction or removal decisions (Art. 17). Users affected by content moderation decisions must be notified by the hosting provider, and this notification must include information on available remedies (internal complaint mechanisms, out-of-court dispute resolution, judicial review);
     - Duty to report to judicial authorities in cases of suspected criminal offenses that pose a serious threat to the life or safety of one or more individuals (Art. 18).

    3.3 Specific Obligations for Online Platforms

Online platforms are subject to enhanced obligations, including (Art. 20 to 28):

     - Internal complaint-handling system: Users must have access to an internal complaint mechanism for at least six months after a decision is taken, allowing them to challenge decisions made by the provider in response to a notice or related to illegal content (Art. 20);
     - Access to out-of-court dispute resolution: Users must be able to submit disputes to a certified independent dispute resolution body (Art. 21);
     - Priority handling of notices from “trusted flaggers”: Trusted flaggers are entities with proven expertise in identifying illegal content, designated by the Digital Services Coordinator (Art. 22);
     - Procedure for suspending accounts of users who frequently provide manifestly illegal content, or who frequently submit notices or complaints that are manifestly unfounded (Art. 23);
     - Expanded transparency reporting: These reports must include the number of disputes referred to out-of-court bodies, the outcomes of such disputes, and the number of suspended user or flagger accounts due to abuse (Art. 24);
     - Ban on dark patterns and obligation to design fair and user-friendly interfaces (Art. 25); (4)
     - Advertising transparency: Clear identification of advertising content and advertisers, as well as disclosure of targeting criteria used (Art. 26);
     - Transparent recommendation systems, with an option for users to modify or influence ranking parameters (Art. 27);
     - Enhanced protection of minors: This includes a ban on targeted advertising where the provider has “reasonable certainty” that the user is a minor (Art. 28).

    3.4 Obligations for B2C Marketplaces

Business-to-consumer online marketplaces are subject to the following specific obligations (Art. 30 to 32):

     - Traceability of professional sellers: Marketplaces must identify professional sellers, collect self-certification, and verify supporting documents before allowing them to begin commercial activities (Art. 30);
     - “By-design” seller interface: The interface must be designed to facilitate sellers’ compliance with pre-contractual information duties, product conformity obligations, etc. (Art. 31);
     - Improved user information: Users must be clearly informed when a product or service available on the platform is found to be illegal (Art. 32).

    3.5 Enhanced Obligations for Very Large Online Platforms and Very Large Online Search Engines

Given their scale and the systemic risks they pose, these services are subject to the most stringent obligations under the DSA. They are required to:

     - Conduct an annual assessment of systemic risks. This includes analyzing the dissemination of illegal content via their platforms, the impact on the exercise of fundamental rights (such as human dignity, privacy, data protection, freedom of expression), and any risks related to disinformation, as well as the protection of public safety, public health, and minors (Art. 34). The assessment must also specify the mitigation measures adopted by the provider, such as adapting algorithmic systems, improving content moderation, or enhancing internal governance rules (Art. 35);
     - Establish a crisis response mechanism, allowing for appropriate action and participation in crisis management in the event of incidents such as terrorist attacks or war (Art. 36);
     - Undergo annual independent compliance audits (Art. 37);
     - Ensure increased transparency regarding recommendation systems and online advertising (Art. 38 and 39);
     - Appoint one or more compliance officers, based on independence and expertise criteria similar to those applicable to Data Protection Officers (DPOs) under the GDPR. These compliance officers are responsible for overseeing the service’s compliance with the DSA (Art. 41).

These obligations are subject to enhanced oversight by the European Commission, which may carry out inspections, issue formal orders, or impose sanctions in the event of non-compliance.


    By establishing a common set of requirements, the DSA aims to ensure a high level of safety for internet users while promoting fair competition among digital services targeting the European market, whether or not they are established in the EU. Some of these obligations already applied in France under the E-Commerce Directive and the Law on Confidence in the Digital Economy (LCEN) of June 21, 2004, and have now been supplemented or strengthened by the DSA.

* * * * * * * * * *


(1) Regulation (EU) 2022/2065 of the European Parliament and of the Council of October 19, 2022 on a Single Market for Digital Services and amending Directive 2000/31/EC (Digital Services Act – DSA).

(2) The list of VLOPs includes: AliExpress, Amazon Store, Apple App Store, Booking.com, several services operated by Google (Google Play, Google Maps, Google Shopping, YouTube), Meta services (Facebook, Instagram), LinkedIn, Pinterest, Pornhub, Shein, Snapchat, Stripchat, Temu, TikTok, Wikipedia, X, Xvideos, and Zalando.
The VLOSEs are Google Search and Bing.

(3) The Commission Recommendation of May 6, 2003 on the definition of micro, small, and medium-sized enterprises (2003/361/EC) sets the thresholds as follows: a small enterprise has fewer than 50 employees and an annual turnover or balance sheet total not exceeding €10 million, and a microenterprise has fewer than 10 employees and an annual turnover or balance sheet total not exceeding €2 million.

(4) A “dark pattern” refers to a user interface design intended to mislead or manipulate users, such as a question phrased with a double negative, or a cookie rejection button that is smaller or less visible than the acceptance button.


Bénédicte DELEPORTE
Avocat

Deleporte Wentz Avocat
www.dwavocat.com

March 2024 (updated in June 2024)