Traditional media outlets have always selected what information to broadcast. Today, much content moderation is performed by social media platforms, raising serious concerns about digital exclusion. In a world where Facebook and YouTube alone moderate billions of expressions, it is time to consider how to mitigate this form of digital exclusion. Social media platforms usually perform content moderation through automated systems that can quickly delete vast amounts of content based on racial or gender stereotypes. In this process, social media platforms, as private actors, are not obliged to respect fundamental rights or democratic values, or to be transparent about their decision-making processes. Apart from recent proposals in the EU framework (e.g. the Copyright Directive), users cannot access the reasons why specific content has been removed. This paper proposes a new regulatory framework based on new transparency and accountability obligations for online platforms.
Our 2020 Annual Conference was scheduled to be held at the University of Wrocław in Poland on July 9-11, 2020.
Due to the COVID-19 pandemic, the ICON·S Executive Committee has decided to postpone our 2020 Conference to 2021. Our next Annual Conference will take place on July 8-10, 2021, in Wrocław, Poland.
Procedural details regarding the organization of the 2021 Conference will follow in the months ahead.