
The Digital Services Act (DSA) – Improving the notification system for illegal content and the limits of the procedure under Article 6-3 of the LCEN
Adopted on October 19, 2022, and in force since November 16, 2022, the Digital Services Act (DSA) rests on the principle that what is illegal offline must also be illegal online, and imposes new obligations on hosting service providers, particularly large online platforms, regarding the prevention of illegal content.
Unlike the 2000 Directive on electronic commerce (Directive 2000/31/EC), the DSA is an EU regulation and is therefore directly applicable in the Member States. It is thus incorporated into the French legal system without any need for transposition into national law, and French courts may rule by applying the DSA directly.
The DSA, which has applied to all online platforms and all internet intermediary service providers since February 17, 2024, imposes on hosting service providers, including online platforms (Article 3(i) of the DSA), an obligation to put in place a new mechanism for reporting illegal content and processing such reports (Article 16), as well as an obligation to give reasons for any restriction on access or publication imposed on illegal content (Article 17). In doing so, it aims to resolve the difficulties raised by the notification regime of the LCEN, the French law that transposed Directive 2000/31/EC on electronic commerce.
Given the significant role that online platforms play in the dissemination of illegal content, the DSA also requires them to put in place a new framework for moderating published content (Articles 20 to 23 of the DSA). It further imposes specific systemic risk management rules on very large online platforms and very large online search engines with an average of more than 45 million monthly active users in the EU, designated as such by the European Commission since April 25, 2023 (Articles 33 to 43).
I. Improvement of the notification system required to establish the liability of internet service providers and platform operators regarding illegal content they transmit, store, or disseminate
Directive 2000/31/EC on electronic commerce established the principle that internet service providers and hosting providers cannot be held liable, either civilly or criminally, for illegal content they transmit or store, provided they were not aware of such illegal acts or content (Articles 12 to 14 of Directive 2000/31). Individuals who have suffered harm as a result of illegal acts or content on platforms such as YouTube, Facebook, or Twitter must therefore notify these providers of such illegal acts in order to prove that the providers were aware of them but failed to take the necessary measures to stop them. Absent such notification, a claim for damages or a criminal complaint against the provider cannot succeed.
Prior to the entry into force of the DSA, any person who was a victim of unlawful acts or content on the Internet was required to report such acts or content to service providers in accordance with the provisions of Article 6 I 5 of the Law on Confidence in the Digital Economy (LCEN), which transposed Directive 2000/31/EC on electronic commerce into national law.
The elements that must be included in the notification of unlawful acts or content sent to the internet service provider, as provided for in Article 6 I 5 of the LCEN, are as follows (comparison before and after the amendment introduced by the Avia Act of June 24, 2020):
| Element of the notification | Before June 26, 2020 | After June 26, 2020 |
|---|---|---|
| Date | Date of the notification | Date of the notification |
| Identity of the notifier | Natural person: last name, first names, profession, domicile, nationality, date and place of birth. Legal entity: legal form, corporate name, registered office, and the body that legally represents it | Natural person: last name, first name, email address. Legal entity: legal form, corporate name, email address. Administrative authority: name and email address |
| Recipient of the notification | The recipient’s name and address or, if a legal entity, its name and registered office | No longer required |
| Disputed facts | A description of the facts in dispute and their precise location | A description of the disputed content, its exact location and, if applicable, the electronic address(es) at which it is made accessible |
| Demonstration of a legitimate reason | The reasons why the content must be removed, including a reference to the relevant legal provisions and factual justifications | The legal grounds on which the disputed content should be removed or made inaccessible |
| Document to be attached | A copy of the correspondence sent to the author or publisher of the disputed information or activities requesting their suspension, removal, or modification, or justification that the author or publisher could not be contacted | A copy of the correspondence sent to the author or publisher of the disputed information or activities requesting their suspension, removal, or modification, or a justification explaining why the author or publisher could not be contacted |
In practice, the victim (or their attorney) was required to draft a letter exhaustively listing the mandatory information required by this provision, and send it to the relevant hosting provider via certified mail with return receipt requested, in order to obtain proof of the date of notification of the unlawful content.
If any of the mandatory details required by this provision were missing, the notification was deemed null and void; consequently, even if the provider did not remove the illegal content, it was not possible to bring a civil claim for damages or file a criminal complaint against it.
French courts have interpreted this provision strictly. In two rulings handed down on February 17, 2011, the Court of Cassation established the principle that, if a notification did not include all the required elements, it was not possible to prove that the provider was aware of the unlawful nature of the act; consequently, even if the provider did not remove the unlawful or infringing content, or delayed its removal, it could not be held civilly liable (AMEN ruling, appeal no. 09-15.857; DAILYMOTION ruling, appeal no. 09-67.896).
Nintendo Ruling (Court of Cassation, Commercial Chamber, February 26, 2025, No. 23-15966)
This case law has often been relied on by hosting service providers as a defense in proceedings brought by victims of illegal content on the Internet. In the case of Nintendo Co., Ltd., The Pokémon Company, Creatures Inc., Game Freak Inc. v. DStorage, which lasted seven years from the initial summons to the Court of Cassation’s ruling of February 26, 2025, the attorney for Nintendo and The Pokémon Company notified DStorage in January 2018, by certified letter with return receipt, of the existence of links enabling the download of unauthorized copies of video games. Summoned before the Paris Judicial Court in May 2018, DStorage argued that the notifications sent by the attorney for Nintendo and Pokémon did not meet the requirements set forth in Article 6 I 5 of the LCEN, such as the identification of the notifying party or the justification that the author or publisher could not be contacted. The Paris Judicial Court (judgment of May 25, 2021, No. 18/07397) and the Paris Court of Appeal (decision of April 12, 2023, No. 21/10585) rejected these arguments and held that the two notifications sent by the attorney for the Nintendo companies met the formal requirements prescribed by Article 6 I 5 of the LCEN. DStorage’s appeal was dismissed by the Court of Cassation in its decision of February 26, 2025.
In this case, it should be noted that the Nintendo companies were awarded only a fraction of the damages sought: their claim for €1,368,500 was partially granted by the first-instance court in the amount of €935,500, an amount subsequently reduced by the Court of Appeal to €442,750, i.e. roughly 32% of the damages they had anticipated when filing the lawsuit against DStorage.
The New Mechanism for Reporting Illegal Content
To address this difficulty in reporting illegal content, the DSA now requires hosting service providers to put in place a notice mechanism that complies with certain technical and organizational requirements and enables illegal content to be reported electronically. While the LCEN, derived from the e-commerce Directive, required the notifier to provide the information listed in Article 6 I 5, in practice by registered letter, the DSA requires hosting service providers to operate an electronic notification system that ensures submitted notices contain this information (Article 16 of the DSA).
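By way of illustration, the sketch below shows, in Python, how a hosting provider's notice-and-action system might check that an electronically submitted notice contains the elements listed in Article 16(2) of the DSA (a substantiated explanation of the alleged illegality, the exact electronic location of the content such as its URL, the notifier's name and email address, and a statement of good faith). This is a minimal sketch under those assumptions: the field names and the `validate_notice` function are purely illustrative and are not prescribed by the DSA.

```python
# Illustrative sketch only: field names and checks are assumptions,
# not a validation scheme imposed verbatim by Article 16 of the DSA.
from dataclasses import dataclass
from typing import List


@dataclass
class IllegalContentNotice:
    explanation: str            # substantiated explanation of why the content is allegedly illegal
    exact_urls: List[str]       # exact electronic location(s) of the content (e.g. URLs)
    notifier_name: str          # name of the notifying individual or entity
    notifier_email: str         # email address of the notifier
    good_faith_statement: bool  # confirmation that the notice is submitted in good faith


def validate_notice(notice: IllegalContentNotice) -> List[str]:
    """Return a list of missing or defective elements; an empty list means the
    notice appears complete for the purposes of this illustrative check."""
    problems = []
    if not notice.explanation.strip():
        problems.append("missing explanation of the alleged illegality")
    if not notice.exact_urls or not all(u.startswith("http") for u in notice.exact_urls):
        problems.append("missing or malformed URL(s) locating the content")
    if not notice.notifier_name.strip():
        problems.append("missing notifier name")
    if "@" not in notice.notifier_email:
        problems.append("missing or malformed notifier email address")
    if not notice.good_faith_statement:
        problems.append("missing good-faith statement")
    return problems


if __name__ == "__main__":
    notice = IllegalContentNotice(
        explanation="Unauthorised copy of a copyrighted video game offered for download.",
        exact_urls=["https://example.com/files/abc123"],
        notifier_name="Jane Doe",
        notifier_email="jane.doe@example.com",
        good_faith_statement=True,
    )
    print(validate_notice(notice) or "Notice contains the expected elements.")
```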
Once notified of illegal content, the hosting service provider must take its decision in a timely, diligent, non-arbitrary and objective manner (Article 16 of the DSA). If it fails to do so, the victim of the illegal content may bring an action against it under general tort law, by demonstrating that the provider was duly notified of the illegal content but failed to act within a reasonable timeframe (CJEU, June 22, 2021, C-682/18 and C-683/18).
In this regard, the DSA provides for out-of-court dispute resolution procedures through bodies certified by the Digital Services Coordinator (ARCOM in France), a list of which is published on the European Commission’s website in accordance with Article 21 of the DSA (see the list).
Please note that referring a matter to these bodies does not interrupt the five-year statute of limitations for holding the hosting service provider liable.
II. The procedure for obtaining the removal or deletion of illegal content
The former Article 6 I 8 of the LCEN, which provided for summary proceedings or a petition to request “any measures appropriate to prevent or cease harm caused by the content of an online public communication service,” was amended by Law No. 2021-1109 of August 24, 2021 reinforcing respect for the principles of the Republic, which introduced, for the same type of request, an expedited procedure on the merits before the President of the judicial court. Henceforth, any application seeking the removal or deletion of unlawful content must be brought under this procedure, failing which it may be declared inadmissible by the judge hearing summary proceedings (Court of Appeal, February 17, 2023, No. 22/09609, Trustpilot A/S v. SARL Rose Passion). Since the DSA, this provision has appeared in Article 6-3 of the LCEN.
The inadequacy of the procedure provided for in Article 6-3 of the LCEN to address systemic risks associated with an online platform (the Shein case)
In the Shein case, in November 2025, the French government petitioned the President of the Paris Judicial Court, ruling under the expedited procedure on the merits, to order the total blocking of the Shein website for the sale of illicit products. Its request was dismissed by a judgment of December 19, 2025 (press release of the Ministry of the Economy), in which the court found the measure sought by the French government to be manifestly disproportionate and an unjustified infringement of the freedom of enterprise. The court granted the State’s alternative request and ordered ISSL, the operator of the Shein website, not to resume the online sale of pornographic products without implementing age verification measures other than a simple declaration of majority, subject to a provisional penalty of 10,000 euros per confirmed violation, for a period of 12 months.
Before the Court of Appeal, the French government no longer sought a total block of the site but only a three-month suspension of the marketplace under ARCOM supervision. Its request was again dismissed in a ruling dated March 19, 2026, in which the Court upheld the first-instance judgment “in its entirety” (press release of the Paris Court of Appeal), given the absence of any current harm or certain future harm after ISSL had removed the disputed products from its platform.
III. The systemic risk management regime – the investigation against Shein and the penalty incurred
The DSA provides for a specific systemic risk management regime applicable to very large online platforms (“VLOPs”) and very large online search engines (“VLOSEs”), which must diligently identify, analyze, and assess any systemic risk within the Union arising from the design or operation of their services and related systems, including algorithmic systems, or from the use of their services (Article 34 of the DSA). In the event of non-compliance, the Commission may impose on the provider of a very large online platform or very large online search engine fines of up to 6% of its total worldwide annual turnover (Article 74 of the DSA).
In February 2026, the European Commission formally opened an investigation into Shein concerning, in particular, the allegedly addictive design of its service, the lack of transparency of its recommender systems, and the sale of illegal products, including child pornography.