Traditional media outlets have always selected what information to broadcast. Today, much content moderation is performed by social media platforms, raising serious concerns about digital exclusion. In a world where Facebook and YouTube alone moderate billions of expressions, it is time to consider how to mitigate this form of exclusion. Social media platforms usually perform content moderation through automated systems that can quickly delete vast amounts of content, sometimes on the basis of racial or gender stereotypes. In this process, social media platforms, as private actors, are not obliged to respect fundamental rights or democratic values, or to be transparent about their decision-making processes. Apart from recent proposals in the EU framework (e.g. the Copyright Directive), users cannot access the reasons why specific content has been removed. This paper proposes a new regulatory framework based on new transparency and accountability obligations for online platforms.
Our next Annual Conference, ICON•S Mundo, will be held fully online from July 6 to 9, 2021.