
Conference notes - Protecting the 2024 Elections: From Alarm to Action
How can the DSA protect democracy by meaningfully addressing the virality of disinformation and hate speech?
On March 6th 2024, the Greens/European Free Alliance organized the “Protecting the 2024 Elections: From Alarm to Action” conference. The gathering provided an occasion to discuss the influence of the recommender systems used by online platforms on political trends and electoral campaigns. Throughout the event, speakers unanimously agreed that recommender systems massively amplify the spread of misinformation and hate speech on platforms. Precise policy recommendations were therefore formulated to address the issues identified.
Frances Haugen, former Facebook employee and whistleblower, gave the opening keynote. She took this opportunity to emphasize how much 2024 is an inflection point. Over the course of the year, half of the world’s population will be called to the polls, raising a crucial question: can our democracies defeat online disinformation? She added that 2024 is also an inflection point for online platforms: will they continue to thrive on recommender systems optimized for private profit rather than the common good? In order to tip the scales in the right direction, users need to understand at least two things. First, recommender systems exploit their vulnerabilities. Second, recommender systems are being weaponized by malicious actors to inflame passions and foster polarization. While Frances Haugen acknowledged that the DSA (Digital Services Act) is an important piece of legislation in this context, she also warned that regulatory authorities must enforce it swiftly to prevent the weakening of already fragile democratic regimes. She closed her keynote with this impactful statement: “the best time to fix the issue of online disinformation was years ago, the second-best time is right now.”
Panel 1: Recommender systems, disinformation, elections and the DSA
The first panel, “Recommender systems, disinformation, elections and the DSA”, provided an overview of the impact of recommender systems. Marc Faddoul, director and co-founder of AI Forensics, established that recommender systems are at the center of content distribution on online platforms; on TikTok, for instance, 90% of the content is distributed through recommender systems. Furthermore, polarizing and extreme content is often pushed forward by recommender systems and becomes viral because it generates a lot of engagement: on X, disinformation elicits 35 times more engagement than other content, according to Tanya O’Carroll, an independent advisor. This situation is all the more worrying as the rise of generative AI enables malicious actors to easily create very convincing deepfakes. Nevertheless, with the DSA, the EU is better equipped to combat online disinformation. Under article 38 of the DSA, for instance, VLOPs (Very Large Online Platforms) and VLOSEs (Very Large Online Search Engines) must offer, for each of their recommender systems, at least one option that is not based on profiling. Marc Faddoul and Tanya O’Carroll suggested taking this measure further by requiring VLOPs and VLOSEs to make the non-profiling setting the default and let users switch it off. Finally, both argued that third parties should be given access to the algorithms of recommender systems in order to modify their features and parameters. This requirement would lead to more independent management of recommender systems, since third parties are not profit-driven, and ultimately to the amplification of higher-quality content (the Tournesol app being one example). However, Renate Nikolay, Deputy Director-General at DG Connect of the European Commission, objected that this obligation would be very complex to impose on private companies, since their recommender systems are protected under “trade secret” frameworks, and that it could contravene the principle of free movement of services.
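To make the default-off proposal concrete, the sketch below is a minimal illustration with hypothetical data and function names (not an implementation discussed at the event): the personalized ranking only applies when the user has explicitly opted in, and the fallback ordering involves no behavioral profiling.

```python
# Minimal sketch of the "non-profiling by default" proposal.
# All names and data are hypothetical illustrations.
from datetime import datetime, timezone

def rank_feed(posts: list[dict], user_opted_in: bool) -> list[dict]:
    """Return the feed ordering; profiling applies only on explicit opt-in."""
    if user_opted_in:
        # Profiling-based ranking (e.g., by predicted engagement) would go
        # here; under the proposal it is strictly opt-in.
        return sorted(posts, key=lambda p: p["predicted_engagement"], reverse=True)
    # Default: no profiling, simple reverse-chronological order.
    return sorted(posts, key=lambda p: p["published_at"], reverse=True)

posts = [
    {"title": "A", "published_at": datetime(2024, 3, 1, tzinfo=timezone.utc),
     "predicted_engagement": 0.9},
    {"title": "B", "published_at": datetime(2024, 3, 5, tzinfo=timezone.utc),
     "predicted_engagement": 0.2},
]

print([p["title"] for p in rank_feed(posts, user_opted_in=False)])  # ['B', 'A']
print([p["title"] for p in rank_feed(posts, user_opted_in=True)])   # ['A', 'B']
```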
Panel 2: CSOs meet enforcement authorities roundtable - The future of recommender systems
Afterwards, a second panel, “The future of recommender systems”, consisted of an exchange of perspectives between a broad range of CSO (civil society organization) representatives and enforcement authorities. The roundtable was divided into two segments: CSO representatives first shared their views on the future of recommender systems, followed by the enforcement authorities.
The representatives from civil society stressed that the algorithmic black boxes of online platforms, including their recommender systems, need to be ripped open. These algorithms are carefully designed by online service providers to maximize engagement, but they end up sustaining polarization and misinformation. CSO representatives therefore emphasized that regulators must compel online platforms to drastically modify their recommender systems, despite the expected financial losses. To this end, numerous policy recommendations were made.
- CSO representatives considered that bridging-based ranking, which rewards content endorsed across otherwise-disagreeing groups, would be a desirable alternative to engagement-based recommender systems (see the first sketch after this list).
- Additionally, they claimed that creating a “do not show me this type of content again” button that works effectively would lead to the dissemination of higher quality content.
- In addition, the vast majority of CSO representatives were in favor of adopting friction mechanisms to disrupt the spread of harmful content, such as capping the number of times a post can be shared or strategically demoting specific content and accounts (see the second sketch after this list). These measures could counterbalance the adverse effects of recommender systems.
- Most importantly, CSO representatives emphasized that people should not be profiled by default and should be able to freely curate their own feeds on platforms. They considered user empowerment the most efficient way to tackle disinformation and hate speech.
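The first sketch below illustrates, in deliberately simplified form, the bridging-based ranking idea mentioned above (all data, names, and numbers are hypothetical): instead of ranking posts by total engagement, posts are scored by how evenly they are endorsed across two otherwise-opposed user groups, so one-sided virality loses out to cross-group approval.

```python
# Hypothetical illustration of bridging-based vs. engagement-based ranking.
from dataclasses import dataclass

@dataclass
class Post:
    title: str
    likes_group_a: int  # endorsements from one user cluster
    likes_group_b: int  # endorsements from the opposing cluster

def engagement_score(post: Post) -> int:
    """Conventional ranking signal: total engagement, blind to who engages."""
    return post.likes_group_a + post.likes_group_b

def bridging_score(post: Post) -> float:
    """Reward cross-group approval: the geometric mean is high only when
    *both* groups endorse the post, so one-sided virality scores poorly."""
    return (post.likes_group_a * post.likes_group_b) ** 0.5

posts = [
    Post("Outrage bait cheered by one side", 900, 10),
    Post("Balanced local reporting", 300, 280),
]

print(sorted(posts, key=engagement_score, reverse=True)[0].title)
# -> the one-sided post wins under engagement-based ranking
print(sorted(posts, key=bridging_score, reverse=True)[0].title)
# -> the cross-group post wins under bridging-based ranking
```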
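The second sketch illustrates one of the friction mechanisms mentioned above, a cap on re-shares (the cap value and the API are hypothetical; the event did not specify a number, though messaging apps have deployed similar forwarding limits): once a post hits the cap, one-click re-sharing is refused, slowing viral cascades regardless of the content’s nature.

```python
# Hypothetical sketch of a share-cap friction mechanism.
MAX_RESHARES = 5  # illustrative cap, not a value proposed at the event

class SharePolicy:
    def __init__(self, max_reshares: int = MAX_RESHARES):
        self.max_reshares = max_reshares
        self.reshare_counts: dict[str, int] = {}

    def try_reshare(self, post_id: str) -> bool:
        """Allow a one-click reshare only while the post is under the cap."""
        count = self.reshare_counts.get(post_id, 0)
        if count >= self.max_reshares:
            return False  # further spread requires deliberate user effort
        self.reshare_counts[post_id] = count + 1
        return True

policy = SharePolicy()
results = [policy.try_reshare("post-42") for _ in range(7)]
print(results)  # [True, True, True, True, True, False, False]
```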
In the second segment, regulators at the national and European levels also recognized that recommender systems are a central issue in the regulation of online services. They indicated that the DSA initiates a paradigm shift: since its entry into application, providers of online services have been accountable to the European Commission and to the national DSCs (Digital Services Coordinators). More precisely, the pitfalls of recommender systems should be addressed through the annual risk assessments (article 34 of the DSA) that are mandatory for VLOPs and VLOSEs. Following each assessment, platforms must take specific measures to mitigate the identified risks (article 35 of the DSA).
Nevertheless, the Commission will not be able to directly enforce the policy recommendations made by CSO representatives, such as implementing bridging-based ranking in recommender systems or letting users curate their own feeds. Its role is limited to assessing whether the mitigation measures put in place by VLOPs and VLOSEs are sufficient to address the identified systemic risks. If the European Commission is not convinced by the steps taken by a service provider, it can initiate formal proceedings. Such proceedings have already been opened against TikTok over its “dark patterns” and “addictive design.” Aside from systemic risk assessments, the representative of the Irish DSC reported that they were working on a draft Online Safety Code which includes supplementary measures on safety-by-design and recommender system safety. Finally, all regulator representatives underlined that contributions from civil society are of primary interest and highly valued.
To round off the event, 2021 Nobel Peace Prize laureate Maria Ressa delivered the closing speech. She pointed out that the practices of social media companies are endangering democratic regimes: these companies make money by propagating disinformation and by extensively personalizing users’ experiences, leading to the emergence of filter bubbles. These bubbles give users the illusion that most people agree with them, thereby radicalizing their beliefs. Providers of online services must therefore be held accountable and the integrity of information must be protected: in a functioning democracy, lies cannot be allowed to spread faster than facts.
