
Conference notes: Clearing the Hurdles: Toward a New Era in Social Media Regulation
In March 2025, the Brussels Privacy Hub (BPH) hosted a webinar entitled “Clearing the Hurdles: Toward a New Era in Social Media Regulation”, moderated by Sophie Stalla-Bourdillon, BPH Co-director.
The debate opened with an overview of the extensive European regulatory framework applicable to platforms, including the GDPR, the DSA, and the DMA. Sophie Stalla-Bourdillon also pointed out that the harmonization of European texts has not necessarily met the objective of simplifying norms. The interaction between these different instruments and their overall effectiveness in regulating platforms was a central theme of the webinar.
While most panelists agreed on the negative impacts of social media, whether on users’ health or on the spread of disinformation, they diverged on the ways forward to mitigate those risks.
Arianna Sala, a project officer at the European Centre for Algorithmic Transparency, presented her research on the impact of social media use on teenagers’ mental health and well-being. Two of her recommendations focused on the role of platforms: one was to shift responsibility from users to platforms, and the other was to design ethical systems free of persuasive design techniques and dark patterns, especially for platforms accessible to children.
Fernando Hortal Foronda, a digital policy officer at BEUC, shared this view. He focused specifically on addictive design, pointing out how the Digital Services Act (Regulation n° 2022/2065) fails to address this issue: the regulation mentions addictive behavior only in recitals 81 and 83, which raises doubts about its effectiveness in curbing the addictive design of recommendation algorithms. Instead, he argued that addictive design should fall under consumer protection law to ensure a higher level of protection for users. In that sense, the future Digital Fairness Act (DFA) could be a promising tool, as it takes into account the asymmetry of power between platforms and consumers.
Alexander Hohlfeld also analyzed the loopholes in some EU regulations, focusing on the DSA’s risk assessments. He emphasized the lack of clear standards and definitions in the risk assessment reports, which makes them difficult to compare, and pointed out that the first round of risk assessments focused primarily on risks relating to content itself rather than on platform design.
Filippo Bagni, a member of the DSA Enforcement Team at the European Commission, argued that the DSA’s enforcement needs time to become fully operational and effective. That time is also crucial to building cooperation and trust between platforms and public authorities; in that sense, he considered the first year of the DSA very fruitful. Fernando Hortal Foronda countered that this slow enforcement, however necessary, only reinforces the impression that the European legislator put a great deal of effort into the regulation for little effect.
Another concern shared by several panelists was the issue of trust between actors. While European institutions focus on building trust between authorities and platforms, some panelists argued that trust between authorities and users also needs to be built. Guido Scorza, a member of the board of the Garante per la protezione dei dati personali, the Italian Data Protection Authority, argued that users do not seem to realize the danger that social media pose to them, which creates an imbalance for the authorities tasked with protecting them, who struggle to restrain platforms without users’ full support and understanding. According to him, this imbalance is reinforced by the purely vertical supervision of platforms. He suggested a more horizontal approach: educating users and making them more aware of how these systems are designed and how they function. Such an approach could also strengthen users’ role in enforcement.
However, the need to establish trust should not be a reason to place excessive responsibility on users, nor to assume that trust alone is sufficient. The democratic crisis Europe is currently facing calls for immediate action, which could be grounded in the many legislative instruments already in place. Johnny Ryan, Director of Enforce at the Irish Council for Civil Liberties, thus stated that the DSA was not mature enough to address this crisis, especially as the regulation relies heavily on self-regulation and transparency. He considered the discussion around the DSA inadequate to the urgency of the situation and highlighted a lack of commitment and determination from the European Commission. He pointed out that the Commission could use existing tools to constrain platforms instead of relying on their goodwill and self-regulation. He specifically mentioned the Audiovisual Media Services Directive (Directive n° 2018/1808), which could empower the Irish authority to sanction platforms: X, for example, would not qualify as a video-sharing platform under Article 1(1)(aa) of the directive but as an ordinary audiovisual media service, subject to the corresponding standards, and could therefore fall under the jurisdiction of the Irish circuit courts. According to Johnny Ryan, the underlying problem is the inaction or leniency of the Irish authorities, which are competent for such matters. In his view, national data protection or consumer protection authorities should alert the European Commission or DG Justice to the Irish authority’s lack of enforcement, with a view to eventual sanctions. Guido Scorza’s closing statement was rather reassuring, as he was convinced that cooperation between national data protection authorities had taken a step forward.
To conclude, this conference offered a valuable perspective on the remaining hurdles in social media regulation and fostered a collective reflection on possible ways forward from the perspectives of different stakeholders.