Striking a Delicate Balance: Moderating free speech online and the case of Twitter

European authorities are concerned about the statements of Twitter's boss, as well as Twitter's drastic cuts to its means of moderating online content, particularly in the shadow of the Digital Services Act (1) and the legal accountability of online platforms for their policies to fight misinformation and hate speech.

Posted Tuesday, June 6, 2023

About the Author: Constantin Pavleas is a lawyer practicing new technologies law. He is the founder and head of the law firm Pavleas Avocats, as well as the coordinator of the Digital Law & Intellectual Property program and head of teaching at the Hautes Études Appliquées du Droit School (HEAD). With his team, he advises French and foreign companies and organizations in the information technology sector, and more generally in the innovation sector, on their intellectual property strategy, the valuation and exploitation of their intangible assets, and the protection of personal data. The firm also assists its clients in negotiating complex contracts.


Since the takeover of Twitter by libertarian billionaire Mr. Elon Musk in October 2022, the company has announced several cuts in its workforce, including the mass dismissal of 4,400 service providers out of a total of 5,500. Among the service providers fired are the platform's content moderators, a category already considered understaffed before the dismissals, with fewer than 2,000 agents for the whole world (2).

This situation has caused the French public authority for audio-visual and digital communication regulation (ARCOM) to sound the alarm. Despite Elon Musk's claims that hate speech had decreased by a third since he bought the platform, ARCOM reminded Twitter of its legal obligations to monitor online content. In particular, the French regulator asked Twitter to declare as soon as possible what "human and technological" resources were devoted to fighting disinformation. ARCOM judged Twitter's answer to be very vague and underlined Twitter's ambiguity in the fight against disinformation. The French authority thus expressed its regret at the platform's "very limited transparency concerning digital data" (3).

At the European level, the Commissioner for the Internal Market, Mr. Thierry Breton, expressed his concerns to the new Twitter boss during an interview regarding the implementation of the Digital Services Act ("DSA"). Breton warned Musk about breaches of the new regulation and made clear that the European Union (EU) will not hesitate to exercise its power to sanction. Underlying all of this is a conflict of values between Europe and Twitter's new boss.

A need for moderation of online content

On the one hand, the "libertarian" Mr. Musk holds up freedom of expression as an absolute principle that must take precedence over any potential censorship. On the other hand, the EU advocates the need to establish a safe online environment in which people can move and express themselves freely, within limits that curb abusive behavior compromising the freedom and safety of others. Under these conditions, effective moderation of online content is necessary to fight misinformation, which presents a real danger to our democratic systems, as well as hate speech and threats that can undermine people's safety. The European legislator has chosen to leave the responsibility for this moderation to online platforms, in compliance with the requirements set by EU regulations.

In France, the government wished to anticipate the entry into force of the DSA with the French Law of August 24, 2021, reinforcing compliance with the principles of the French Republic, in particular through its Article 42 (4)(5). This law imposes on platforms with more than 15 million monthly visitors a set of requirements concerning cooperation with law enforcement agencies, the implementation of systems for reporting and removing illegal, hateful content, and transparency concerning the moderation of such content. ARCOM is the authority with sanctioning powers of up to 6% of the operator's worldwide turnover. For very large online platforms (VLOPs), the requirements imposed by the law combating online hate are even more extensive (such as assessing the risks of the spread of illegal hate content, as well as the obligation to take measures against such dissemination, while preserving freedom of expression). In Germany, the NetzDG law, enacted in 2017, provides a penalty regime for social network hosts, with fines that can range from €5 million to €50 million.


How the DSA is changing the rules

One of the foundations of the DSA is to make behavior that is illegal offline just as illegal online (6). The objectives are clear: to better protect European Internet users and their fundamental rights, but also to help small European businesses develop while making the VLOPs more accountable. By extending legal liability to the latter, the EU wishes to mitigate systemic risks, such as the manipulation of information or misinformation. The DSA will make it easier for users to report illegal content and will oblige online platforms to cooperate with "trusted flaggers". The VLOPs will also have to be transparent about content moderation, including maintaining an internal complaint-handling system, and about their advertising algorithms. "Digital services coordinators" will enforce these obligations in each EU country (ARCOM in France). In case of non-compliance with these obligations, the coordinators or the European Commission may impose fines of up to 6% of an offender's global turnover. The regulation entered into force on November 16, 2022, for all online platforms (except the smallest) and will be applicable as of February 17, 2024; however, the DSA will apply to VLOPs as early as August 25, 2023 (7). Of note, the European Commission has designated a list of VLOPs, which includes Twitter.

One of the difficulties in applying the new regulation will be controlling the quality of moderation, both in its failures and in its excesses. It will be necessary to prevent platforms from exercising systematic censorship for fear of potential sanctions. Several French and foreign politicians have already been censored, having their accounts deactivated or their publications deleted. To avoid the opposite excess, thought must be given to instituting a check on over-moderation, so that moderation does not go too far and end up justifying Elon Musk's libertarian theories... A delicate balance must be found that respects European values in the pursuit of our European digital sovereignty.



- European authorities, like ARCOM, the French authority for regulating online content, have expressed concern about the mass dismissal of moderators within the social network Twitter

- Online moderation of hateful content and "fake news" is crucial to maintaining a safe and trusted online environment

- The advent of the Digital Services Act aims to make online platforms more accountable by imposing new obligations, including moderation and transparency

- In case of non-compliance with their obligations, platforms can be fined up to 6% of their worldwide turnover


*With the help of Edouard Lambert, Master of Law and Business Practice - Hautes Études Appliquées du Droit School (HEAD) and LL.M. at the University of California Los Angeles (UCLA).

1. Regulation (EU) 2022/2065 of 19 October 2022 on a single market for digital services and amending Directive 2000/31/EC (Digital Services Act).
5. These statutory provisions will remain in force until December 31, 2023.
7. Article 92 of the DSA.