Author: Gonçalo Afonso
Media: https://www.endnowfoundation.org/social-media-and-its-impact-on-elections-php/
Anti-democratic movements are not new to the internet; they have existed there since the late 20th century, albeit confined to obscure and sparsely inhabited corners. It was only during Barack Obama's presidency in the U.S. that extremist movements gained popularity on social media, resorting to conspiracy theories and reaching a worldwide audience for the first time. Algorithms designed to maximize engagement, combined with the heightened interaction that extremist publications provoke through the emotional reactions they trigger, gave these movements a new prominence, contributing to Donald Trump's first election in 2016. Not even the fact-checking mechanisms in place during Trump's first presidency were able to stop the spread of direct attacks on the 2020 electoral process in the US.
The European Union has not been oblivious to the growth of anti-democratic movements on social media, which, given their international nature, have also affected all EU Member States. In an attempt to prevent a repetition of the events that occurred in the USA, the European legislator resorted to the Digital Services Act (DSA). By regulating the activities of very large online platforms (VLOPs), which include all major social networks (e.g., Facebook, TikTok, X, Instagram), the DSA requires, in its Article 34, that they assess the systemic risks present in their systems, including, under paragraph 1(c) of the same article, "any actual or foreseeable negative effects on civic discourse and electoral processes, and public security".
The wording of this paragraph demonstrates a clear concern with the effects that discourse on social networks can have on the democratic rule of law. However, given the ambiguity of the rule, moving beyond this general concern to a workable interpretation is a challenge for both VLOPs and supervisory authorities. At no point does the DSA define what should be understood as negative effects on civic discourse and electoral processes, revealing the difficulty the European legislator faced in giving substance to this rule. See, for example, recital (62) of the DSA, in which the legislator merely repeats the text of Article 34(1)(c), once again without ever specifying what should be understood as actual or foreseeable negative effects. The final version of the DSA also departs from the Commission's original proposal. The proposal restricted systemic risks against civic discourse and electoral processes to "intentional manipulation of their service..." in its Article 26(1)(c), the provision equivalent to Article 34(1)(c) of the final version, but it specified, in recital (57), through an illustrative list, the facts from which such risks may arise, e.g., "… creation of fake accounts, the use of bots, and other automated or partially automated behaviours, which may lead to the rapid and widespread dissemination of information that is illegal content or incompatible with an online platform’s terms and conditions". These differences from the final version of the DSA reveal a clear insecurity on the part of the European legislator in giving concrete shape to the notion of negative effects on civic discourse and electoral processes, preferring instead to maintain a purely abstract norm.
There is plenty of content on social media with real negative effects on civic discourse and on electoral processes. Consider the entire manosphere movement, built on an idea of male superiority over women and the reclaiming of the social position of "alpha" men, with a confessed proximity to the far right, whose targets are mostly adolescent boys; it has resulted in a significant increase in violence against women on social media, and also in real life. Negative effects on electoral processes also arise in cases such as that of Elon Musk, the current owner of X, who used his own platform to promote and directly support the AfD, the German far-right party, as well as Giorgia Meloni, the far-right Italian prime minister.
If the rule in question existed in isolation from the rest of the DSA, these two cases would raise no doubts as to their inclusion within its scope. However, the DSA also imposes obligations to respect the fundamental rights of platform users, especially the right to freedom of expression, e.g., in the enforcement of terms and conditions (Article 14(4)) or in the mitigation of risks by VLOPs (Article 35(1)). Determining whether these cases fall within the risks to be assessed and mitigated by VLOPs therefore becomes complex and ambiguous, raising questions of its own.
Regarding the manosphere community, we are faced with numerous users who create and publish similar content harmful to civic discourse, but this fact alone does not reveal the existence of a systemic risk to be mitigated by VLOPs. Systemic risks, as follows from Article 34(1), are risks deriving from the design and core functions of the platforms, including their algorithms, so the mere existence of this type of content cannot in itself be considered such a risk. Different are the situations in which the algorithm itself shows a preference for this type of content, something that has begun to be observed. If it turns out that the algorithms are amplifying extremist content, especially on short-form content platforms such as TikTok or YouTube Shorts, pushing users towards these communities, we believe that VLOPs will be obliged to mitigate such systemic risks, on pain of sanctions for non-compliance with the DSA. Unfortunately, given the secrecy surrounding the creation and operation of the algorithms used, it is not difficult for VLOPs to contest any allegation of non-compliance, claiming that such outcomes do not stem from the algorithms' code and that any action against such content would violate the right to freedom of expression. It seems inevitable to us that any case involving the application of Article 34(1)(c) of the DSA will end up before the Court of Justice of the European Union, and it will fall to the Courts to do the difficult work of developing criteria for the application of this rule.
Regarding Elon Musk's active participation in the electoral processes of Member States, a different question arises. Faced with publications on his X account demonstrating support for anti-democratic parties, should we consider them communications from Elon Musk as a natural person, or from Elon Musk as CEO and representative of the platform? For other social media CEOs, with more reserved lives away from public opinion, the answer would be relatively simple; Elon Musk, however, has actively tied his person to his companies, as if one could not exist without the other, making the distinction between them quite dubious. In addition, Musk allegedly forced a change to X's algorithm to promote his posts over those of any other user. Once again, we believe that we are facing systemic risks, in this case with actual or foreseeable effects on the electoral processes of the Member States. Yet, once again, given the abstractness of the rule under analysis, if the supervisory authorities decided to use it to combat this interference in electoral processes, we consider that such a case would also end up being decided by the Court of Justice.
In Article 34(1)(c) the European legislator has demonstrated a clear and commendable desire to protect civic discourse and electoral processes, pillars of the democratic rule of law. However, by refraining from giving substance to this extremely abstract norm, it prevents both the supervisory authorities and the VLOPs themselves from effectively combating the systemic risks present on their platforms. The fate of this rule seems inevitable to us: it will fall to the Court of Justice to give it concrete content. Yet its very abstractness seems to instil in the supervisory authorities a fear of invoking it, delaying the necessary process of jurisprudential concretization.