Internet platforms: free speech facilitators or state actors?

Recent decisions by European courts (e.g. CJEU, Case C-18/18, Eva Glawischnig-Piesczek v Facebook Ireland) and by national courts signal renewed attention to the debate on the horizontal effect of constitutional rights, i.e. the enforceability of freedom of speech vis-à-vis digital platforms. Concerns about the role of these actors have been raised in light of their growing reliance on algorithmic techniques for the processing of content and data. The key question is whether, based on their dominant market position and on the allegedly “essential facility” nature of the services they operate, some Internet platforms can be equated with state actors. On this view, private entities such as social networks and user-generated-content platforms would be subject to the same obligations to protect free speech that apply to public authorities. The paper argues that such a scenario could trigger unintended consequences, and calls for a more balanced assessment of the role of modern Internet platforms.

Rated by the algorithm: suggestions from the Chinese Social Credit System

Algorithmic analysis is the technical tool that enables citizens’ rating systems to function. The largest such system, in terms of both people involved and data gathered, is the Chinese Social Credit System (SCS). Though still at a preliminary stage, the SCS has been tested in several municipalities with varying success. The SCS starts from an awareness of state deficiencies (weak enforcement of the law, financial scams, corruption, tax evasion) and the need to overcome them in the context of a market economy. To this end, rating and ranking citizens into different classes allows the government and the private sector to assess their trustworthiness and compliance with socially acceptable behaviour. Connected to one’s score is a specific shade of citizenship: a different extent of enjoyment of rights and freedoms and of access to services and benefits. Although the SCS enables control and fosters political conformity, this is not its primary aim; rather, it fits the traditional Chinese preference for social conformity.
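
For illustration only, the tiered mechanism described above can be sketched as a simple score-to-tier mapping. Everything in the following Python sketch (the thresholds, tier labels, benefits, and score adjustments) is a hypothetical assumption, not a reconstruction of any documented SCS rule:

```python
# Purely illustrative sketch of a tiered citizen-rating scheme.
# All thresholds, labels, benefits, and deltas are invented and
# do not reflect any real Social Credit System rules.

TIERS = [  # (minimum score, tier label, associated benefits/restrictions)
    (950, "A", "fast-track permits, preferential credit"),
    (850, "B", "standard access to services"),
    (600, "C", "standard access, additional scrutiny"),
    (0,   "D", "restricted travel, limited credit"),
]

def tier_for(score: int) -> tuple[str, str]:
    """Map a numeric score to the first tier whose threshold it meets."""
    for threshold, label, benefits in TIERS:
        if score >= threshold:
            return label, benefits
    return TIERS[-1][1], TIERS[-1][2]

def adjust(score: int, event: str) -> int:
    """Nudge a score for a recorded behaviour, clamped to [0, 1000]."""
    deltas = {"paid_taxes_on_time": +5, "defaulted_on_loan": -50}
    return max(0, min(1000, score + deltas.get(event, 0)))

score = adjust(870, "defaulted_on_loan")   # 870 - 50 = 820
print(score, tier_for(score))              # 820 ('C', 'standard access, additional scrutiny')
```

The design point the sketch makes explicit is the one the abstract highlights: a single scalar score, updated by observed behaviour, directly gates the “shade of citizenship” a person enjoys.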

Private governance of expressions: excluding citizens from democratic debate

Traditional media outlets have always selected what information to broadcast. Nowadays, much content moderation is performed by social media platforms, raising serious concerns in terms of digital exclusion. In a world where Facebook and YouTube alone moderate billions of expressions, it is time to think about how to mitigate this form of digital exclusion. Social media platforms usually perform content moderation through automated systems that can quickly delete vast amounts of content, sometimes on the basis of racial or gender stereotypes. In this process, social media platforms, as private actors, are not obliged to respect fundamental rights or democratic values, nor to be transparent about their decision-making processes. Apart from recent proposals in the EU framework (e.g. the Copyright Directive), users cannot access the reasons why specific content has been removed. This paper proposes a regulatory framework based on new transparency and accountability obligations for online platforms.
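
To make the transparency argument concrete, the following is a minimal hypothetical sketch of an automated moderation step that records a retrievable reason for every decision, which is what the proposed transparency obligations would require platforms to expose. The toy classifier, the threshold, and the record format are all assumptions for illustration, not any platform’s actual system:

```python
import datetime
from dataclasses import dataclass

# Hypothetical moderation sketch. The word-list "classifier" is a toy
# stand-in for the proprietary ML models platforms actually deploy;
# the threshold and record format are invented for illustration.

BLOCKLIST = {"badword1", "badword2"}   # placeholder terms
THRESHOLD = 0.5

def toxicity_score(text: str) -> float:
    """Toy classifier: fraction of words appearing in the blocklist."""
    words = text.lower().split()
    return sum(w in BLOCKLIST for w in words) / max(len(words), 1)

@dataclass
class Decision:
    removed: bool
    reason: str    # the auditable explanation transparency rules would mandate
    timestamp: str

def moderate(text: str) -> Decision:
    score = toxicity_score(text)
    removed = score >= THRESHOLD
    reason = (f"automated removal: score {score:.2f} >= {THRESHOLD}"
              if removed else f"retained: score {score:.2f} < {THRESHOLD}")
    return Decision(removed, reason,
                    datetime.datetime.now(datetime.timezone.utc).isoformat())

print(moderate("badword1 badword2 hello"))  # removed=True, with a stored reason
```

Under the status quo the abstract criticises, the `reason` field typically never reaches the user; the proposed accountability obligations would make records of this kind accessible.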