Platform Practice and Non-Discrimination

Conceptions of Equality, Systemic Risks, and Data Bias in the DSA

Authors

  • Felicitas Rachinger, University of Innsbruck

DOI:

https://doi.org/10.25365/vlr-2025-9-3-124

Keywords:

platform governance, platform regulation, content moderation, non-discrimination

Abstract

Platforms have developed into important communicative infrastructures. Their rules and practices define what happens on a platform, yet these rules and practices are not free from discrimination. This is especially true as technical tools used for content moderation, such as hate speech detection tools or recommender systems, can transfer mechanisms of exclusion into digital spheres and might even reinforce them. The EU legislator seems to recognize the issue and repeatedly emphasizes the relevance of “non-discrimination” in the Digital Services Act (DSA). But what is “non-discrimination” when it comes to platforms? This paper addresses the conceptions of “non-discrimination” and equality within the DSA and asks how the related provisions can be put into practice. The first section of the paper examines the connection between platforms and non-discrimination, discussing how public values are being incorporated into private orders (II.). The second section shows how the term “non-discrimination” and related terms can be understood under the DSA (III.). This is followed by an analysis of specific content moderation-related provisions on the individual (IV.) and the systemic level (V.).

Author Biography

Felicitas Rachinger, University of Innsbruck

Felicitas Rachinger is a research and teaching assistant (pre-doc) at the Department of Legal Theory and Future of Law at the University of Innsbruck, Austria. In her research, she focuses on digital rights, legal gender studies and anti-discrimination law.

Published

2025-10-30