
Conference notes – “DSA enforcement starts now – what changes can platform users expect?”
On February 27, 2024, Julian Jaursch, project director at the think tank SNV (Stiftung Neue Verantwortung), moderated a background talk entitled “DSA enforcement starts now – what changes can platform users expect?”

The event’s keynote speaker was Laureline Lemoine, a senior associate working at the intersection of technology and data rights at AWO. The primary aim of the event was to provide insights shortly after the DSA (Digital Services Act) came into full effect, offering platform users clarity about what the new regulation means for their daily usage.
First and foremost, Laureline Lemoine pointed out that numerous provisions of the DSA will have little noticeable impact in the short term. Indeed, it might take some time to fully implement the procedures and compliance mechanisms related to due diligence obligations for online platforms, notably in terms of accountability and transparency. For instance, VLOPs (Very Large Online Platforms) and VLOSEs (Very Large Online Search Engines) are obliged to perform annual risk assessments (article 34 of the DSA). Once the risks are assessed, VLOPs and VLOSEs must then put in place measures to mitigate them (article 35 of the DSA). While these provisions may improve the safety of online platforms and allow regulatory authorities to hold providers accountable, it could take years before this risk assessment mechanism translates into actual policy revisions on the platforms. As an example, the Commission is currently reviewing the first risk assessments submitted by VLOPs and VLOSEs. Next, platforms will have to take the Commission’s feedback into consideration and come together to devise a shared definition of “risk” in order to establish a general framework through which meaningful changes can be achieved.
Nevertheless, aside from this expected long-term impact, Laureline Lemoine also mentioned measures that will have immediate repercussions. In this context, she highlighted the role of the DSA as a “transparency machine”. Regarding content moderation practices, for instance, the DSA states that when platforms take down a publication or ban an account, they must provide an explanation to the users affected by their content moderation decisions (article 17 of the DSA). These explanations, or “statements of reasons”, must then be added to a publicly available database.
Additionally, Laureline Lemoine labelled article 40 of the DSA a “sleeping giant”. This provision grants vetted researchers access to the data of VLOPs and VLOSEs. According to Laureline Lemoine, article 40 of the DSA is a very powerful tool in terms of transparency and accountability: researchers will be able to check whether platforms are complying with their obligations under the DSA and to uncover new systemic risks. The delegated act on this topic is therefore eagerly awaited. In addition, CSOs (Civil Society Organizations) and non-academic research centers are also allowed to access publicly available data from VLOPs and VLOSEs (article 40 § 12 of the DSA). The goal of this provision is to rip open the algorithmic black boxes of online platforms and to better understand what is happening “behind the curtain” by granting data access to those who can make sense of it.
Furthermore, the speaker brought up other provisions with immediate effects, namely the reporting of illegal content (article 16 of the DSA) and the complaint mechanism (article 53 of the DSA). Platform users now have two avenues of recourse when it comes to reporting. On the one hand, users can report illegal content through a reporting tool set up by the platform. On the other hand, users can complain about any infringement of the DSA by a platform to their local DSC (Digital Services Coordinator), the national regulatory authority in charge of DSA enforcement.
In this context, Laureline Lemoine was keen to reassure the audience about the line between content moderation and infringement of free speech. She emphasized that the DSA does not define “illegal content” and that it is up to each Member State to determine which types of content are prohibited. She also specified that the DSA protects fundamental rights (article 1 of the DSA), including freedom of speech. Consequently, a platform could be sanctioned under the DSA if its policies led to a systemic violation of freedom of speech.
After the presentation, Julian Jaursch moderated the Q&A session, during which the speaker answered numerous questions from the audience. Several of them focused on article 40 of the DSA, especially on the possibility for platforms to deny researchers’ data access requests by invoking the protection of “trade secrets”. Laureline Lemoine acknowledged the risk but expressed the hope that the Commission could step in as arbiter if a platform repeatedly invokes “trade secrets” as a reason to deny data access requests.

Other audience members were concerned about the spread of online disinformation. One person raised the issue of moderating disinformation, noting that most of it falls within the bounds of legality. Laureline Lemoine replied that disinformation is tackled through systemic risk assessments and mitigation measures. More specifically, disinformation is addressed at a systemic level, meaning that the goal is not to take down every piece of disinformation shared online. Instead, platforms will have to try to reduce the reach and virality of disinformation, for instance by modifying their recommender systems. Someone else then asked whether the Commission’s draft guidelines on the impact of online disinformation on electoral processes tackle the right elements and will ensure effective protection of the 2024 European elections. The speaker confirmed that the Commission appears willing to enforce the guidelines before the European elections and that some platforms have already started releasing specific “trust and safety” policies to preserve election integrity. Other topics were also touched upon, such as the online protection of minors (article 28 of the DSA) and the requirements for small, non-commercial platforms, which benefit from a specific regime but might still struggle to comply due to their lack of resources. Finally, the ecological dimension of the DSA, or lack thereof, was evoked as well.
Overall, this event covered the key aspects of the DSA while raising relevant questions about its enforcement, both in the short and the long term.