Who decides the limits of freedom of speech exercised on digital platforms? 

Esther Bodil Huyer

If platforms that enable data sharing become very impactful, in other words very successful, they also become partially responsible for what is shared. This is because the information shared on their platform can have a significant and widespread impact, both good and bad: influencing voters, uniting revolutionaries, spreading fear, spreading hope. Sometimes all of the above happen at the same time, which makes it difficult to assess the quality and extent of that impact. Facebook, for example, became so successful that it is now one of the main platforms for political discourse.

To manage the risk of harmful impact, Facebook recently formulated "community standards". These standards are the framework for deciding what cannot be shared on the platform because of its impact or illegal nature. In many countries, the law limits what opinions can be shared; for example, freedom of speech in Germany is limited when it comes to denying the Holocaust. Beyond Facebook, there are other examples of organisations or occupational groups complementing the law with an ethical code, such as the "Deutsche Presserat", which states the jointly formulated and agreed ethical standards for journalists.
So, what makes Facebook's community standards problematic or worth discussing? 
Recently, Facebook temporarily blocked a newspaper's and a White House representative's accounts to keep them from sharing content considered harmful. In this case, it was an article about the alleged unlawful behaviour of Joe Biden's son. Many might agree that a populistic article from a biased newspaper can be harmful to the Democrats' success, but that does not mean that blocking its publication is lawful or ethical. On the other hand, Mark Zuckerberg only recently agreed to finally ban Holocaust deniers from Facebook, although such denial is already against the law in 18 European countries.
What if we do not agree with the community standards of the biggest community platform because they appear biased and arbitrary to some of us? What can we do if we do not agree with the ethical judgement of a private company like Facebook, Amazon, Microsoft, or Google? We cannot turn to the competition, because many of these companies are monopolies, but we can help them formulate and further develop a community-based, diverse view of ethical behaviour on sharing platforms.

And we can learn from this for when data-sharing intermediaries and platforms grow in reach and impact. We can be prepared with a joint ethical code that states what should be banned to protect the community and what should be fostered to empower it.

Laura Knippenberg and Esther Huyer discussing data sharing matters
Image credit: © January 2020, Eline N. Lincklaen Arriens