Data Protection Insider, Issue 37

– Commission Publishes Data Governance Act –

On 25th November, the European Commission officially published its proposal for a ‘Data Governance Act’ – although a draft version had previously been circulating. In essence, the Act seeks to facilitate the maximum use of European data such that the value inherent in the data, in its analysis, and in the application of this analysis can be realised. In this regard, the Act specifically aims at: ‘Making public sector data available for re-use, in situations where such data is subject to rights of others…[allowing s]haring of data among businesses, against remuneration in any form…[a]llowing personal data to be used with the help of a ‘personal data-sharing intermediary’, designed to help individuals exercise their rights under the General Data Protection Regulation (GDPR)…[and at] allowing data use on altruistic grounds.’ In terms of content, the Act is split into three key substantive parts – subsequent parts deal with issues such as supervisory authorities and the creation of a ‘“European Data Innovation Board”…to facilitate the emergence of best practices by Member States’ authorities’: i) ‘[c]hapter II creates a mechanism for re-using certain categories of protected public sector data which is conditional on the respect of the rights of others’; ii) ‘[c]hapter III aims to increase trust in sharing personal and non-personal data and lower transaction costs linked to B2B and C2B data sharing by creating a notification regime for data sharing providers’ – including elaboration of the conditions under which data sharing services should operate; and iii) ‘[c]hapter IV facilitates data altruism (data voluntarily made available by individuals or companies for the common good)’ – including elaboration of the conditions for the establishment of ‘Data Altruism Organisations’. Whilst the Act does not solely deal with personal data, personal data is discussed extensively in the Act and provisions are made which aim to ensure that any processing of personal data within the scope of the Act is protected. Whether effective protection can be ensured within the expansive sharing regime foreseen under the Act, however, will surely be the subject of debate as the legislative process moves forward.


– Unuane v the United Kingdom: Assessment Criteria for Deportation Not Compliant with Article 8 ECHR –

On 24th November, the ECtHR ruled in the Unuane v the UK case that the domestic courts had not carried out a fair balancing of the applicant’s behaviour against his private and family life interests before deporting him. As to the facts of the case, the applicant was a Nigerian national living in the UK who had family ties to British nationals – his children. He had been convicted, amongst other offences, of falsifying around 30 applications for leave to remain in the UK. Based on these convictions, the UK immigration authorities deported the applicant. The applicant claimed that the deportation violated his Article 8 ECHR right to private and family life because the domestic courts had failed to conduct a proper assessment of his family life interests. The ECtHR agreed with the applicant. While the Court noted that the deportation had a legal basis and pursued a legitimate aim, it pointed out that, in their assessment, the domestic courts had failed to take into account certain criteria established in the Court’s case law relating to his family life, namely the interests of his children, who are British citizens living in the UK. Thus, it found a violation of Article 8 ECHR, ruling that the interference was not proportionate under Article 8(2). We note that, on its surface, the case might seem to be, strictly speaking, more a family life matter with little relevance for data protection. However, we observe that the case can be translated into the algorithmic context. More precisely, one could argue that this, and similar, judgements set requirements for the fair design of algorithmic criteria, to ensure that all the relevant criteria are included in the assessment of the behaviour of a given individual. This is to ensure that the outcome of the assessment is fair and does not violate the human/fundamental rights of the individuals concerned. This also resonates with the fairness rules on algorithmic decision-making in the GDPR and with the CJEU’s EU-Canada PNR Opinion. In this regard, the case once again shows the importance of data protection for the protection of other fundamental rights.


– The EU and the USA: A New Agenda for Global Change –

On 2nd December, the European Commission and the High Representative for Foreign Affairs and Security Policy released a communication containing an ambitious, multi-faceted agenda for new transatlantic relations. Under Point 4 thereof, the EU proposes cooperation in the field of digital technologies in particular. In this respect, the EU emphasizes the opportunity for a ‘joint EU-US tech agenda’ and for cooperation on tech governance. To achieve this, the communication suggests that ‘the EU and the US need to join forces as tech-allies to shape technologies, their use and their regulatory environment.’ In this regard, the communication focuses on eight measures/topics in particular. First, it suggests working together on 5G, 6G and cybersecurity assets. Second, it proposes joining forces on AI, with a special emphasis on facial recognition. To this end, the EU envisages a Transatlantic AI Agreement, which is supposed to serve as a model for regional – even global – standards based on shared values. At the same time, the EU acknowledges the divergences between the EU and the USA concerning data governance. Third, to overcome these divergences, the EU acknowledges the need to establish adequate safeguards and standards which can enable free and trustworthy data flows. Fourth, the EU and the USA need to join forces on regulating online platforms and Big Tech in order to ensure fair competition, secure the removal of illegal content online, and prevent propaganda created and spread through algorithms. Fifth, fair taxation is high on this new agenda. Sixth, the communication proposes establishing a new EU-US Trade and Technology Council (TTC) in order to ensure that EU and US companies are competitive on the market by setting up common standards for technology. Seventh, the EU and US need to work on protecting critical technologies by focusing on ‘investment screening, Intellectual Property rights, forced transfers of technology, and export controls.’ Last but not least, the EU emphasizes the significance of fair trade and the removal of trade barriers. We note that such cooperation is highly necessary in today’s world, including in upholding our values and freedoms. It remains to be seen, however, which, if any, of the above proposals will be taken up and pursued further. It will also be interesting to see whether the US and the EU will be able to overcome the divergences which have led the CJEU to strike down the two Commission personal data transfer adequacy decisions.


– Committee for Convention 108 Guidelines on Children’s Data Protection –

On 20th November, the Council of Europe’s Consultative Committee for Convention 108 released its Guidelines on ‘Children’s Data Protection in an Education setting’. The rationale for the Guidelines is that there is an increasing amount of information technology involved in the educational context, that this technology involves an increasing amount of processing of children’s personal data, and that this processing can have specific impacts on children’s rights and freedoms – including rights beyond privacy and data protection, such as ‘the right to development, the right to freedom of expression, the right to play and the right to protection from economic exploitation’. The Guidelines specifically seek to provide further clarity as to how the provisions of Convention 108+ should be understood and applied to protect children’s rights in the educational context. In this regard, the Guidelines provide: i) an overview of the principles of Convention 108+ in relation to children in the educational context; ii) a discussion of ‘[the f]undamental principles of children’s rights in an educational setting’; iii) a set of ‘[r]ecommendations for legislators and policy makers’; iv) a set of ‘[r]ecommendations for data controllers’; and v) a set of ‘[r]ecommendations for industry’. The Guidelines are well worth reading as they provide clarity as to the specifics of the protection of personal data in relation to children in the educational context. Of particular interest are the Guidelines’ discussions of the vulnerable position of children – and of specific sub-groups of children in particular – the dynamic nature of childhood, the need to involve children in the processing of their own personal data, and the need for strict controls around all third-party access to children’s personal data processed in an educational context.


– EDPS Opinion on the New Pact on Migration and Asylum –

On 30th November, the EDPS released his Opinion on the New Pact on Migration and Asylum, which consists of five proposals. In his Opinion, the EDPS makes comments and recommendations concerning two of the proposals in particular: the amended EURODAC Proposal (concerning the EU large-scale information system called Eurodac) and the Proposal for a Screening Regulation (concerning the screening, especially of asylum seekers, upon arrival). The EDPS makes nine key observations. First, he recommends that the legislator carry out a detailed fundamental rights and data protection impact assessment of the different proposals. Second, he notes that the proposals would effectively give broad-ranging data processing responsibilities not only to the Member States but also to two EU agencies, namely FRONTEX and EASO. Therefore, he calls on the legislator to clarify in the legislative instruments the data protection responsibilities of the EU agencies and the Member States, so as to avoid blurring these responsibilities and the applicability of the different data protection instruments. Third, he notes that the amended EURODAC Proposal would essentially create linked sequences of the personal data concerning a given individual within the system – e.g. an asylum seeker. This calls for clear regulation of the access rights of the different authorities to the linked data. Fourth, he emphasizes the importance of the coordinated supervision of EURODAC by the Member State authorities and the EDPS, and suggests this be expressed more explicitly in the proposals. Fifth, he recommends that the legislator clarify which data should be stored in the EURODAC central system and which will be stored in the Common Identity Repository to be set up pursuant to the Interoperability Framework – where identifying personal data stemming from the different large-scale databases will be stored. Sixth, with regard to the Screening Proposal, the EDPS notes that the process and criteria are described only at a very general level and should thus be regulated in more detail. Seventh, he notes that the screening will result in a debriefing which will influence the outcome of the asylum application but which will not be subject to judicial review. Therefore, the EDPS emphasizes that the right to rectification – to supplement the debriefing in order to ensure its accuracy – should be guaranteed. Eighth, an adequate retention period should be set for these debriefings. Ninth, the EDPS notes that the other proposals lack substantive data protection provisions, or even a reference to the general data protection instruments, and that this should be rectified. Finally, the EDPS notes that robust data protection provisions are a must because, ‘as already stated in the EDPS Strategy 2020-2024, “[d]ata protection is one of the last lines of defence for vulnerable individuals, such as migrants and asylum seekers approaching EU external borders.”’


– The Connection between Apps, Advertising, and National Security –

On 3rd December, journalist Martin Gundersen, writing for the Norwegian NRKbeta, published an account of his efforts to find out how data from apps installed on his smartphone reached the US company Venntel, which has, in the past, sold data to US security agencies. The journalist begins by describing how he obtained a copy of his personal data held by Venntel, which allowed him to retrace several of his movements with a degree of precision. He then proceeds to try to find out how Venntel may have procured this data and draws a series of conclusions about the data flows likely in place such that this procurement could happen. He paints a picture of a network of data exchange between the apps downloaded onto his phone, advertising intermediaries, data brokers, and Venntel. Interestingly, the journalist finds that precise information as to how Venntel might have procured his information is difficult to obtain, and that the fact that Venntel did procure the data indicates that there were breaches of the GDPR at some point along the chain of data exchange. The information contained in the article will not surprise anyone in the data protection community. The lack of transparency within the app-advertising-data broker system is well documented. The article is interesting, however, in how it draws connections between the different processing sectors and in how it charts the unexpected diffusion of personal data across different contexts.

About

DPI Editorial Team

Dara Hallinan, Editor: Legal academic working at FIZ Karlsruhe. His specific focus is on the interaction between law, new technologies – particularly ICT and biotech – and society. He studied law in the UK and Germany, completed a Master’s in Human Rights and Democracy in Italy and Estonia and wrote his PhD at the Vrije Universiteit Brussel on the better regulation of genetic privacy in biobanks and genomic research through data protection law. He is also programme director for the annual Computers, Privacy and Data Protection conference.

Diana Dimitrova, Editor: Researcher at FIZ Karlsruhe. Focus on privacy and data protection, especially on rights of data subjects in the Area of Freedom, Security and Justice. Completed her PhD at the VUB on the topic of ‘Data Subject Rights: The rights of access and rectification in the AFSJ’. Previously, legal researcher at KU Leuven and trainee at EDPS. Holds LL.M. in European Law from Leiden University.
