X’s moderation policy in question

This summer’s ultimatum went unheeded. On 16 October 2023, Australia fined X (formerly Twitter) $385,000 for shortcomings in its moderation of child sexual exploitation content. The penalty raises essential questions about how the major platforms handle these challenges, and calls into question the supposedly total freedom of expression that these platforms advocate and defend.

The responsibility of algorithms, an old debate

The advent of the participatory web sparked heated debate about the need to open up the algorithms that shape everyone’s access to the internet. The question concerns not only how search engines operate, but also how social media platforms work. For a long time, the web giants hid behind their status as hosts, disclaiming any editorial responsibility. Perceptions have since changed, however, and expectations of transparency and accountability have grown. These platforms are now under pressure to disclose more about their algorithms and moderation policies, in the name of the general interest and the common good.

This shift has played out in a series of recent events that highlight the challenges of online moderation and the responsibility of platforms. Misinformation, the spread of harmful content and their impact on society as a whole have become more pressing issues than ever.

The transformation of Twitter

From the moment he acquired the platform, Elon Musk promised less moderation in the name of greater freedom of expression on his social network. By laying off more than 80% of the staff, including many of the moderators responsible for identifying and removing problematic content, he put his words into action. If further proof of this ambition were needed, eSafety Commissioner Julie Inman Grant, who issued the fine in Australia and is herself a former Twitter employee, deplored ‘hollow’ discussions with X on the subject of moderation.

At a time when a growing body of research shows that algorithms drive the spread and visibility of toxic content, the neutrality of social networks is being called into question. Algorithms optimise not for the reliability of information but for its virality, and in doing so they contribute to disinformation and polarisation, undermining the quality and integrity of online exchanges.

This is a major issue when you consider that 71% of young people now get their news exclusively from social networks, a rate that also rises among other generations in times of crisis (geopolitical, national, health, etc.).

Action taken in Europe and France

Australia’s action and legal victory have undeniably highlighted the inability of digital giants such as X and Meta to control content on their platforms effectively. Against this backdrop, the European Commission has opened investigations into the main social networks, X, Meta and TikTok, seeking information on the measures they are putting in place to combat the dissemination of false information and illegal content.

In France, too, voices are being raised to condemn the lack of moderation on X. A group of journalists led by anti-disinformation specialists Tristan Mendès France and Julien Pain has called for a tweet strike this Friday, an operation dubbed ‘No Twitter Day’. The date was not chosen at random: 27 October marks the first anniversary of Elon Musk’s takeover of the platform.

Given how indispensable the former Twitter had become as a tool for journalists gathering information and testimonies, this initiative is by no means anecdotal. Burying X would nevertheless be premature. After all, the announcement, support and relaying of the strike found their echo mainly on… X. With more than 21,000 mentions of the hashtag in two days, and three times as many engagements, the communications operation was a success. But would it have had as much impact without the social network? The question remains open.

By Inès Hadi