
The Centre for Digital\\Jurisprudence

Online platforms such as YouTube, Facebook, and X (formerly Twitter) have become part and parcel of everyday media use. Journalists incorporate posts from politicians into newspaper reports, scientists share their insights in short posts or videos, and the judiciary uses social media to explain its work. Well-known ‘influencers’ even build their livelihoods around the opportunities offered by online platforms. For citizens, social media are not only an important source of information but also a way to connect with policymakers and politicians in an approachable manner. Everyday activities such as finding work, selling belongings, or shopping online also increasingly take place via social media.

What if users of online platforms break the rules?

Online platforms are legally obliged to take down punishable posts such as child sexual abuse material, death threats, or calls for riots. In addition, online platforms draw up their own rules. These house rules (community guidelines) are enforced by the online platforms themselves. Violating the law or a platform’s house rules can result in a warning, the removal of a message, photo, or video, and in some cases even the temporary suspension or permanent deletion of the account. In the worst case, a user is never welcome on the platform again.

There are good reasons to ban more than just illegal content. For instance, it is also important to keep the platform user-friendly for different groups of users. In this context, online platforms could, for example, make efforts to prevent female users from being constantly confronted with sexist comments. Another example is the bill that prohibits sharing others’ private information (also known as doxing), something that is already not allowed on many online platforms.
Good intentions or not, a platform’s house rules can sometimes make it difficult or impossible for users to express an opinion. Rules against disinformation are a case in point: the tension between countering disinformation and protecting freedom of expression became clearly visible during the COVID-19 pandemic. On the one hand, certain platforms prevented the spread of dangerous health advice; on the other hand, the removal of disinformation could also affect criticism of public policy.

However, not every restriction on freedom of expression automatically amounts to a violation of the right to freedom of expression. The first step in this research project is to examine when a restriction imposed by an online platform affects users’ freedom of expression to such an extent that it may constitute a violation. The second step is to examine whether the legal framework applicable to freedom of expression also provides sufficient protection to users of online platforms.

How will the Digital\\Jurisprudence project conduct research?

The Digital\\Jurisprudence project is interdisciplinary in nature and thus goes beyond legal analysis. Within the project, the applicable house rules of very large online platforms are collected and then revisited at different points in time. Where do the norms in these house rules differ? Are the norms of online platforms moving in the same direction (convergence), or are the differences between them growing (divergence)? And what are possible explanations for these norms moving towards, or away from, each other?
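To make the convergence/divergence question concrete, the sketch below shows one hypothetical way such a comparison could be operationalised in Python: each platform’s house rules are reduced to a set of words, and the average pairwise similarity between platforms is compared across two collection moments. The platform names, snapshot texts, and the simple Jaccard measure are illustrative assumptions, not the project’s actual data or method.

```python
# Illustrative sketch only: do the house rules of different platforms grow more
# alike (convergence) or less alike (divergence) between two collection moments?
# Platform names and guideline texts below are hypothetical placeholders.
from itertools import combinations


def word_set(text: str) -> set[str]:
    """Reduce a guideline text to a set of lower-cased words."""
    return set(text.lower().split())


def jaccard(a: set[str], b: set[str]) -> float:
    """Jaccard similarity: shared words relative to all words in either set."""
    return len(a & b) / len(a | b) if a | b else 0.0


def average_pairwise_similarity(snapshots: dict[str, str]) -> float:
    """Mean similarity across all pairs of platforms at one point in time."""
    pairs = list(combinations(snapshots, 2))
    return sum(
        jaccard(word_set(snapshots[p]), word_set(snapshots[q])) for p, q in pairs
    ) / len(pairs)


# Hypothetical snapshots of community guidelines at two moments in time.
t1 = {
    "PlatformA": "no hate speech no doxing misinformation may be labelled",
    "PlatformB": "harassment is forbidden doxing is forbidden",
}
t2 = {
    "PlatformA": "no hate speech harassment or doxing misinformation is removed",
    "PlatformB": "no hate speech harassment or doxing misinformation is labelled",
}

s1, s2 = average_pairwise_similarity(t1), average_pairwise_similarity(t2)
trend = "convergence" if s2 > s1 else "divergence"
print(f"average similarity at t1: {s1:.2f}, at t2: {s2:.2f} -> {trend}")
```

In the actual project the inputs would be full guideline documents collected at many points in time, and a more robust similarity or coding scheme would be needed; the point here is only to illustrate how ‘norms moving towards each other’ can be expressed as a measurable quantity.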

It is expected that the norms of online platforms (partly influenced by European Union legislation) will move towards each other. Especially if online platforms start applying the same norms to content that is not illegal, this could have consequences for the right to freedom of expression. The Digital\\Jurisprudence project tests this expectation and places the development in a broader legal-theoretical framework.
 

About the centre

The Centre for Digital\\Jurisprudence aims to encourage (young) researchers to conduct empirical and legal-theoretical research on digital (legal) norms.
