How does Arcom evaluate its first year?
Benoît Loutrel, a member of the Arcom college, heads the “Online platform supervision” working group of the new French audiovisual and digital regulator born from the merger of the CSA and Hadopi in January 2022. Siècle Digital met him on the occasion of the publication of its annual report on the fight against online information manipulation.
In this first part of the interview, he shares his key findings and reviews Arcom’s first year of activity, in particular the presidential and legislative elections. The second part, to be published later, will be devoted to Arcom’s implementation of the European regulation on digital services.
Published at the end of November, this third Arcom assessment of the platforms’ anti-manipulation measures makes an alarming observation, to say the least. Based on the declarations of the twelve platforms subject to the exercise, including Meta, Twitter and Snapchat, it decries “recurring gaps” in the communication of data and key figures. In particular, the “lack of material information” provided by TikTok, Yahoo and, to a lesser extent, Google prevents any assessment of the relevance and effectiveness of the actions taken by these platforms.
Siècle Digital: The findings of the Arcom report on the fight against information manipulation are hardly encouraging. How do you view the platform operators’ responses to your questions about the measures implemented against the manipulation of information?
Benoît Loutrel: The answers of some operators remain insufficient from our point of view. Admittedly, there is progress compared to last year, but we expect more transparency from them. The stakes are high: citizens must be able to trust their information ecosystem. It is an important element of social cohesion.
We would like platforms to provide information on their policies regarding moderation, demonetization, or slowing the spread of problematic content. Platforms also owe this transparency to the research community, which must be able to make its own assessment of moderation: did these measures have the desired effect? Were there unwanted side effects?
Information manipulation is a complex subject that requires drawing on multiple sources, so we closely follow the work of academia and civil society, which feeds our thinking.
SD: In your report, you lament that the platforms are not transparent enough. Do you believe that the legislation in force against information manipulation gives you sufficient latitude to act?
BL: The 2018 law provides that online platform services have an obligation to cooperate with the regulator. They must create tools to combat the manipulation of information. The law requires them to respond to Arcom’s questions, and Arcom publishes these declarations in its annual report.
It was a very good law for bringing these players into the scope of regulation, and it allowed us to build up our expertise. It is now necessary to demand increased transparency from all platforms. This is one of the contributions of the DSA, which will allow us, collectively, to be more ambitious and demanding.
SD: The DSA will be the subject of the second part of our interview. Before that, I would like to know more about your day-to-day cooperation with the platforms. What kind of working relationship do you have with them?
BL: We have a formal discussion framework with the platforms, as with this annual report on the fight against information manipulation. We send them a dedicated questionnaire and set a deadline for their answers. If there is no answer, we say so and do not hesitate to name names. In this report, we identify responses that are unsatisfactory [editor’s note: TikTok, Yahoo and Google in particular] or, conversely, cases where our dialogue is developing, as with Snapchat.
Outside of that formal dialogue, we also have many informal exchanges. This was notably the case during the elections, when digital platforms played an important role in preventing foreign interference and in helping the fact-checkers who verify online information about the campaign.
At one of our meetings, for instance, a platform told us that its moderation algorithm had malfunctioned and mistakenly blocked the accounts of campaign team members. We reminded them of their obligation to correct the problem, but also to be transparent and explain themselves. They then informed the public about this episode.
SD: Arcom recently published a report in which you made recommendations concerning social media during the 2022 elections. Do you think platforms, like audiovisual media, should be bound by an obligation of fairness or equality during elections?
BL: No, that is not what Arcom wants. Television and radio stations control their editorial offering, which in turn influences their audience. They must therefore strive to represent all currents of thought. That is why they are required to respect criteria of fairness and, in the final weeks of the presidential election, equality of speaking time between candidates.
On social networks, users play a more active role. If you subscribe to Mr. Mélenchon’s or Mr. Zemmour’s account, you will see Mr. Mélenchon’s or Mr. Zemmour’s content; you chose to subscribe to those accounts. These are really different types of media. On the other hand, platforms should be transparent with their users about why a given piece of content is “pushed” to them.
SD: What about the users themselves? In your report, you recommend clarifying the responsibilities of users such as campaign teams and influencers.
BL: There are two different themes. The first is to allow candidates and their supporters to campaign freely on social media; this is part of democracy. But, of course, they must do so in accordance with the law, in particular the rules protecting intellectual property. You have no right to publish a work or piece of music without the consent of the copyright holders. If you violate this rule three times on YouTube, your channel is closed. Campaign clips must follow these rules. [editor’s note: Eric Zemmour’s clip announcing his candidacy was granted an exception on YouTube despite numerous reports of intellectual property infringement. His video was finally removed a few months later following his conviction by the Paris judicial court.]
Second, the French Electoral Code also applies to influencers. It provides for a period of silence from Thursday until the closing of the last polling station on Sunday: the media stop covering politics and candidates no longer campaign. This year, the commission overseeing the presidential campaign stepped in to ask platforms to remove content from influencers talking about politics during this period.
This happened for the first time because social networks and influencers have taken on an importance they did not have five years ago.