Gabriele Engels, Co-Vice-Chair of the MARQUES Cyberspace Team, provides an update on the EU Digital Services Act.
With the Committee of Permanent Representatives’ (COREPER) formal approval in September 2022, the Digital Services Act (DSA) (COM/2020/825 final) took another giant step toward becoming law. The DSA seeks to supplement the rules of the E-Commerce Directive (Directive 2000/31/EC), which at over 20 years of age is no longer equipped to tackle the challenges of the contemporary digital landscape.
The DSA aims to ensure a safer online environment, predominantly by introducing new obligations for intermediaries to expeditiously remove harmful and illegal content, goods and services. The range of targeted content is extensive, with everything from hate speech and incitement to terrorism to copyright infringement falling within the scope of the Regulation. The range of targeted parties is just as broad, covering all online intermediaries that direct their services at EU consumers.
The importance the EU places on these new rules is evidenced by the consequences of non-compliance. Intermediaries face fines of up to 6% of a company’s global annual turnover, thereby exceeding even the maximum amount imposed under the GDPR.
Liability and obligations for all service providers
Contrary to what may have been expected, the Regulation refrains from establishing its own, brand-new liability regime, instead adopting the regimes of the E-Commerce Directive largely unaltered. Internet service providers incur liability for third-party content only where they have become aware of its illegality yet failed to remove it from their service.
The DSA uses this regime as a basis to establish procedures for reporting and removing illegal content, in this regard also codifying existing CJEU case law. This includes the obligation for all service providers to establish easily accessible and user-friendly notice-and-takedown procedures. Where a user notice includes certain details required by the Act, it can give the provider actual knowledge of illegal content, thereby triggering the provider’s duty to remove such content as soon as possible.
Users whose content is blocked or removed following such a notice must be given an adequate explanation for this decision. The online intermediary must also notify both the notifying party and the affected user of possible avenues of redress. Where the hosting service provider is an online platform, this redress must include an internal complaint-handling system.
Additional obligations for online platforms
Internal complaint systems are not the only special obligation placed on online platforms. For instance, with the exception of micro and small enterprises, online platforms must adjust their notice-and-takedown procedures to accommodate mechanisms for cooperating with frequent and dependable “trusted flaggers”. Content flagged by these entities must be dealt with more quickly and with priority. Where users have repeatedly been found to spread illicit content, their accounts are to be suspended for a reasonable period.
Another focal point of the DSA is transparency and protection in online advertising. Targeted advertising based on sensitive personal data (such as ethnicity, political views or sexual orientation) is prohibited outright, as is targeted advertising aimed at children. Additionally, online platforms must disclose how content is recommended to users (e.g. ranking mechanisms).
Because of the unparalleled role which “very large online platforms” and “very large online search engines” play online, these service providers must comply with additional obligations to mitigate the risks they pose regarding the dissemination of illegal content. These duties include mandatory risk assessments, the appointment of a compliance officer and the setting up of a repository of the advertisements displayed on their service within the last year. A platform is considered “very large” when it has more than 45 million monthly active users within the EU.
All that remains is for the EU Member States’ ministers to endorse the final text in the Council. The DSA will then enter into force 20 days after its publication. As a Regulation, most of its rules will become directly applicable 15 months later. Depending on the speed with which the Council casts its vote, the DSA could become applicable as early as the first quarter of 2024.
As a horizontal framework, it will complement sector-specific legislation such as the Directive on Copyright in the Digital Single Market and the Audiovisual Media Services Directive (AVMSD).
The MARQUES Cyberspace Team has published an update to its paper “Overview on the jurisdiction on liability of Internet Service Providers (ISP)”, including a comprehensive review of French, Dutch, German, English, Danish and Swedish case law on provider liability and an examination of the relevant CJEU rulings.
The Cyberspace Team will follow the discussion around the DSA and the major issues arising from the Regulation. The Team would also like to take this opportunity to invite any interested MARQUES members to contact the Team if they would like to contribute.
The MARQUES Copyright Team has compiled a Copyright Tracker where MARQUES members can review the latest implementation status of the Digital Single Market Copyright Directive in each Member State.