A Crack in the Algorithm’s Façade

Social aid and health resources are scarce, and their fair distribution is a complex task, especially in times of a pandemic. Probabilistic algorithms seem to offer an easy way out, as they allocate resources based on data and mathematics. However, academic research reveals that probabilistic algorithms inherit “data biases”, resulting in disadvantageous effects for marginalized social groups. Built on the promise of efficiency, a façade of statistical neutrality arises behind which the State hands over its responsibility to the algorithm and where inequalities can quietly cement. I argue that fundamental and human rights in the EU crack this façade. Societal data biases translate into what I coin the harm of generalization, which touches upon the fundamental rights to autonomy and to equal treatment, and into the harm of cementing societal biases, which touches upon the right to non-discrimination. To demolish the façade, the enforcement of these rights must be supported by rights-based regulation.