The Charter of Fundamental Rights of the European Union establishes a number of modern fundamental rights, including the right of every individual to the protection of personal data concerning him or her. This contribution explains the characteristics of this right, in particular its dependence on legislative specification. In response to the societal shift towards an "onlife" world, the General Data Protection Regulation, a legal instrument that has attracted worldwide attention, is now to be supplemented by further regulations within the framework of the EU's data and digital strategy. Against this background, the article addresses key challenges that data protection faces and that must also shape the understanding of the fundamental right to data protection: Is the focus on personal data a suitable criterion? What are the protected interests underpinning data protection? How should data protection be conceptualized within an overarching information and Internet law?
This contribution reviews the state of the art of the European Union (EU)'s legal framework for international data transfers after the Schrems II decision of the Court of Justice of the European Union of 16 July 2020 (case C-311/18). It focuses on the impact of the Schrems legal saga on data transfers to and from Brazil, a State which, unlike its fellow Mercosur members Argentina and Uruguay (both parties to the Council of Europe's Convention 108), has never been the recipient of an adequacy decision by the European Commission, but which has recently approved legislation akin to the EU's General Data Protection Regulation. In Schrems II, the Court of Justice accepted that Standard Contractual Clauses (SCCs) may be used in the absence of an adequacy decision, but held that companies must assess, on a case-by-case basis, whether the national law of the recipient country provides adequate protection under EU law and, where it does not, must provide additional safeguards or suspend data transfers.
A growing number of consequential decisions are subject to algorithmic processes. The danger of discrimination by algorithms challenges present-day societies and their legal responses. Do national legal systems adequately protect individuals from algorithmic bias? And what about EU anti-discrimination law and data protection law?
The presentation analyzes how the current debate on algorithmic discrimination plays out in the case of credit scoring in Brazil. It first discusses the concepts of algorithm and algorithmic discrimination and explains why these concepts are particularly meaningful in a data-driven economy. It then shows how Big Data, combined with algorithms, has fundamentally altered some decision-making processes in our everyday lives, and turns to one application in particular, credit scoring, to discuss the challenges it may pose for Brazilian law, especially the risk of discriminatory outcomes. After analyzing the currently evolving data protection framework in Brazil, including the new General Data Protection Act, the presentation discusses whether the existing or proposed legal tools are sufficient to deal with the challenges of automated decision-making processes and their potentially asymmetric outcomes.
Brazil presents a complex scenario in relation to the use of Information and Communication Technologies: it is a hyperconnected society in which, paradoxically, regulatory vacuums persist, and the personal data protection system still needs improvement, above all through an urgently needed dialogue between legal doctrine, the National Data Protection Authority and the case law on this subject. Important changes are nevertheless under way. Especially after the promulgation of Constitutional Amendment 115, which included the fundamental right to the protection of personal data in the fundamental rights catalog of article 5 of the Federal Constitution of 1988, the debate must be deepened, particularly with regard to the structure and performance of the State, notably the Public Administration, which must be adapted to face the challenges of digital transformation without losing consistency with constitutional principles.
The building elements of neural privacy: from mind privacy and habeas mentes to neurorights, and the place of neurorights within data protection frameworks.