Posting a YouTube video claiming that hydroxychloroquine works well against corona, posting on LinkedIn that the COVID-19 mortality rate is extremely low, or creating a Facebook page calling for the one-and-a-half-metre rule to be ignored. When do you end open debate because disinformation is being spread?
It's a question that many social media platforms struggle with. In all of these cases, posts were deleted, and sometimes an entire account. According to Google, Facebook and LinkedIn, the posts contained misleading information or threatened public health, which their guidelines do not allow.
The Virus Truth action group, Forum for Democracy and the online video channel Café Weltschmerz, among others, brought lawsuits against the platforms last year because they believe their freedom of expression is being curtailed. They all lost. Today another case was heard in interim proceedings. This time the plaintiff was MP Wybren van Haga, formerly of FVD and now leader of his own Van Haga Group. He wants LinkedIn to reactivate his account.
The account was suspended last December after Van Haga, according to LinkedIn, downplayed the corona mortality rate and claimed that face masks don't work. Van Haga calls it censorship: "The patent on truth should not lie with a company or a government. That makes discussion impossible." According to him, LinkedIn was an important platform where, as a politician, he could debate with supporters and opponents, and that has now been taken away from him.
Experts give him little chance. Lawyer Thomas van Vugt, a specialist in media law, says that by creating an account on LinkedIn, Van Haga agreed to the platform's terms, including its "corona policy". "LinkedIn is also allowed to decide which rules apply on its platform."
The companies behind social media are private. "Someone who rents out halls is not obliged to make their hall available to everyone either," says emeritus professor of media law Wouter Hins.
That doesn't mean you can refuse people on any grounds, says Michael Klos, who researches freedom of speech on the internet at Leiden University. "It is prohibited by law to discriminate on certain grounds, such as descent or gender. So it's not as if they can put anything they like in their terms and conditions."
The plaintiffs in the interim proceedings argue that the social media platforms are monopolists. In today's society, if you want to reach a wide video audience, you need to be on YouTube. If you are denied access, that restricts your freedom of speech. And governments have a legal obligation to ensure that this freedom can be exercised.
However, judges often refer to the Appleby judgment, a 2003 ruling by the European Court of Human Rights. A group wanted to demonstrate against the construction of a park in the English town of Washington, and sought to hand out flyers in a shopping mall, among other places. The mall did not allow that; it wanted to remain politically neutral. The Court ruled in the mall's favour: the protesters had plenty of other locations to hand out their leaflets.
Forum of one's choice
"The right of everyone to freedom of expression therefore does not imply a right to the forum of his or her choice," the court said in an August case brought by the same Van Haga, over the deletion of a YouTube video. Only if "any effective exercise" of freedom of expression is made impossible should a state or a judge intervene.
Hins thinks that is not at issue in this Van Haga case. "If all social media were to form a cartel and delete his accounts, he would have a point." Moreover, the fact that he is a politician gives him even more platforms than an ordinary citizen, Klos says: "MPs can still express their views in parliament."
Is deletion proportionate?
The difference from previous cases is that this one is not about the deletion of a video but of an entire account. Hins: "That adds weight to Van Haga's side of the scale."
Klos is therefore looking forward to this case. "The platforms have a certain responsibility to combat misinformation and are also encouraged to do so by authorities such as the European Commission and the World Health Organization. But you might wonder whether deleting a video, or even an account, is proportionate."
The question is whether interim proceedings are the right way to weigh proportionality. Because of the urgency, the judge makes a quick balancing of the different interests and then issues a preliminary ruling.
Klos: "You can label claims as false, make sure algorithms don't recommend a video, or ensure that accounts posting disinformation can no longer make money. There are other ways to address disinformation than deleting it."
The ruling in the interim proceedings is expected on 6 October.