Platform regulation and the conceptual challenges of automation

Social media platforms bring together a set of issues related to governance through algorithms. Platforms match users with news, services and applications. Automation is also integral to content moderation practices: it is used both to proactively remove violent content and to flag potentially harmful material to human moderators. Platforms’ policies and their enforcement mechanisms raise questions about how to regulate such platforms: how much law is needed (e.g. self-regulation, co-regulation or hard law), and what consequences different regulatory approaches would have for rights, for our understanding of law and for the shape of technology. This paper discusses the latest developments within the EU framework. It focuses on the construction of platforms as “passive and neutral”, the way these terms are employed in the case law of the CJEU, and their impact on our traditional legal framework based on the rule of law and procedural guarantees.