Data Protection Insider, Issue 40

– ECtHR Rules on Legal Privilege –

On 21st January the ECtHR delivered its ruling in the case of Kadura and Smaliy v. Ukraine. The case concerned events around the Maidan protests in Ukraine in 2013. In connection with the protests, both applicants were arrested by the police and subsequently subjected to ill-treatment. One aspect of this ill-treatment was the confiscation of certain of the second applicant’s belongings. This confiscation was problematic, in particular, as the applicant was arrested in the course of performing his duties as a lawyer and the confiscated belongings were alleged to contain sensitive information about the applicant’s clients. The applicants complained to the ECtHR about their treatment on several grounds. One of these complaints alleged an Article 8 infringement following from ‘the seizure of [the second applicant’s] telephone and documents containing what he alleged to be confidential information relating to his clients’ cases’ – other complaints related to Article 3, Article 5 and Article 18 of the ECHR. The ECtHR upheld the complaint on the basis that the conduct of the authorities did not fulfil the ‘in accordance with the law’ criterion. In this regard, the Court observed that ‘[an] encroachment on professional secrecy of lawyers may have repercussions for the proper administration of justice and hence for the rights guaranteed by Article 6 of the Convention’ and that, as the authorities confiscated the applicant’s belongings during the applicant’s work as a lawyer, they must have been aware that confidential information might have been seized. The Court noted that the authorities had apparently given little consideration to the professional privilege engaged by the seizure, had given no particular reason for the seizure and had apparently not put any special procedure in place for the ‘proper handling of the information potentially subject to professional privilege’. The Court also observed that: ‘No element of the domestic legal system or practice providing any safeguards in that respect was cited in the domestic proceedings or in the proceedings before the Court.’

 

– EDPB Releases Recommendations on the Adequacy Referential under the LED –

On 2nd February the EDPB released its first recommendations concerning the Law Enforcement Directive (LED). The recommendations focus on adequacy decisions as one of the legal bases for transferring personal data to Third Countries and International Organisations under the LED. Their purpose is to ‘establish (…) the core data protection principles that have to be present in a third country or an international organisation legal framework to ensure essential equivalence with the EU framework within the scope of the LED’. This clarification should assist both the Commission in its adequacy assessments and those Third Countries and International Organisations which aspire to be recognised as adequate for the purposes of data transfers under the LED. The document is structured as follows. First, it explains the concept of adequacy, especially as developed in the CJEU’s case law. Second, it clarifies the procedural aspects of adopting and revising adequacy decisions from the point of view of the role of the EDPB and its members in the process – both advisory and supervisory. Third, the recommendations analyse in more detail the substantive requirements involved in assessing the adequacy of a Third Country or International Organisation’s legal framework. In this regard, the recommendations look at the applicable LED and CFREU provisions, including their main concepts, core principles, the legality of data processing, data subject rights, the processing of special categories of data, automated decision-making and profiling, and the requirements on independent supervision and enforcement. We welcome the recommendations, as they demonstrate that the EDPB is working on providing guidance in relation to the LED as well. In substance, it is also interesting to observe that the EDPB reads the provisions of the LED broadly, especially concerning automated decision-making and profiling, even though Article 11 LED is less explicit as to the safeguards to be provided. Interestingly, the EDPB clearly interprets Article 11 LED in light of Recital 38 LED, which includes the right to obtain an explanation of a decision reached and to challenge that decision.

 

– EDPB Adopts Documents During 45th Plenary Session –

In its 45th Plenary Session, the EDPB had an exchange on WhatsApp’s privacy policy updates and adopted the following documents:

  • ‘Recommendations on the adequacy referential under the Law Enforcement Directive (LED)’;
  • ‘Opinion on the draft Administrative Arrangement (AA) for transfers of personal data between the Haut Conseil du Commissariat aux Comptes (H3C) and the Public Company Accounting Oversight Board (PCAOB)’;
  • ‘Statement on the draft provisions on a protocol to the Cybercrime Convention’;
  • ‘Response to the European Commission questionnaire on processing personal data for scientific research, focusing on health related research’.

All documents, if not already published, will be available on the EDPB website as soon as the requisite checks have been completed.

 

– Council of Europe Releases Guidelines on Facial Recognition –

On 28th January – Data Protection Day – the Council of Europe released a set of Guidelines concerning facial recognition for identification and verification purposes. More precisely, the document contains: i) guidelines for legislators and decision-makers; ii) guidelines for developers, manufacturers and service providers; and iii) guidelines for users of facial recognition technology. The guidelines cover uses of the technology in both the private and public sectors, including for law enforcement purposes. The guidelines are based on the application of the Convention 108+ requirements on data processing relevant to the development and deployment of facial recognition technologies. The separate focus on developers and manufacturers demonstrates that the Convention 108+ requirements clearly apply to these entities as well, which raises the question as to what position these entities should occupy within data protection frameworks. Finally, the guidelines also briefly discuss the issue of data subject rights in relation to facial recognition technologies. While guidance on the issue of facial recognition is much needed, we note that the guidelines remain relatively general. For example, they do not clarify how the right to rectification should apply in any given case – e.g. should it, where relevant, require a new identity document to be issued free of charge? Equally, they do not clarify how the right to have one’s views heard, in the context of automated decision-making, should be exercised in relation to facial recognition, especially where the data subject might not have the necessary technical expertise to state their case.

 

– EDPS Releases ‘Orientations on Manual Contact Tracing’ –

On 2nd February, the EDPS announced the publication of the document ‘EDPS Orientations on manual contact tracing by EU Institutions in the context of the COVID-19 crisis’. As the name suggests, the document is intended to provide guidance to EU institutions regarding their use of manual COVID-19 contact tracing methods – contact tracing methods which rely ‘on health service agents to survey contaminated individuals for their close contacts’ and then contacting those contacts to evaluate ‘their risks of having been infected and spreading the disease’. In principle, the EDPS finds that such contact tracing systems are not incompatible with Regulation 2018/1725. In this regard, the EDPS highlights a number of legal grounds under which relevant personal data may be collected and processed within such systems – including Articles 10(2)(b), 10(2)(h) and 59. The EDPS does highlight, however, that Regulation 2018/1725 is much more permissive regarding the processing of EU staff members’ – and their household members’ – personal data than regarding non-staff personal data. The EDPS then goes on to highlight several obligations European Union Institutions (EUIs) should fulfil in order that contact tracing schemes might be data protection compliant. These include, for example, the obligations that: EUIs should conduct a data protection impact assessment prior to launching a scheme; EUIs should ensure confidentiality in the communication of information; EUIs should make sure a robust privacy by design architecture is in place; and EUIs should make sure data subject rights are respected.

 

– Austrian Administrative Court: Party Affinity is Sensitive Personal Data –

On 26th November 2020 the Austrian Administrative Court ruled that the processing of probabilistic party affinity data constitutes the processing of sensitive personal data. As to the facts of the case: a company collected data about the addresses, names, dates of birth and titles of individuals. Based on anonymous surveys and the results of local elections, it calculated the probability that certain groups of individuals would have an affinity for a certain political party. It then applied these general profiles to individuals in order to calculate the probability that a given individual would have a certain political affinity and would be happy to receive political advertisements from a given party. It then sold these data, in addition to the collected identifiable data, to different political parties. The data processing operations were carried out without the consent of the individuals concerned. The Austrian Data Protection Authority opened investigations against the company, which argued that it was not processing personal data, but merely calculating probabilities. The Austrian Administrative Court ruled that, in applying the abstract profiles to individuals for marketing purposes and calculating the probability of their party affinity, the company clearly processed sensitive personal data – i.e. inferred data are personal data. The Court then highlighted that such processing requires a legal basis and that consent would be the only acceptable option under Article 9 GDPR. The conclusion that political affinity probabilities constitute personal data was reached on the basis of the CJEU’s reasoning in Nowak and the Article 29 Working Party Opinion on the concept of personal data, as endorsed by the CJEU in Nowak. We note that the ruling confirms the calls in academic literature and by the Article 29 Working Party to treat inferential data as personal data, and it is positive that national courts are following this broad approach.
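To make the mechanics concrete, the following minimal Python sketch illustrates the kind of group-to-individual inference described above. It is a purely hypothetical reconstruction – the group keys, probabilities and names are invented, and the judgment does not disclose the company’s actual methodology – but it shows why the output qualifies as inferred personal data: each probability ends up attached to an identifiable individual.

# Hypothetical illustration only: group-level affinity probabilities,
# estimated from aggregate sources (anonymous surveys, local election
# results), are joined back to identifiable individuals.

# Probability of affinity for "Party X" per (postcode, age band) group.
group_profiles = {
    ("1010", "30-39"): 0.62,
    ("1100", "60+"): 0.27,
}

# Identifiable records collected by the company (invented examples).
individuals = [
    {"name": "A. Muster", "postcode": "1010", "age_band": "30-39"},
    {"name": "B. Beispiel", "postcode": "1100", "age_band": "60+"},
]

def infer_affinity(person):
    """Apply the abstract group profile to a named individual."""
    key = (person["postcode"], person["age_band"])
    return {**person, "p_affinity_party_x": group_profiles.get(key)}

for person in individuals:
    # Each output row ties a probability to an identified person,
    # i.e. inferred (and, per the ruling, sensitive) personal data.
    print(infer_affinity(person))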

About

DPI Editorial Team

Dara Hallinan, Editor: Legal academic working at FIZ Karlsruhe. His specific focus is on the interaction between law, new technologies – particularly ICT and biotech – and society. He studied law in the UK and Germany, completed a Master’s in Human Rights and Democracy in Italy and Estonia and wrote his PhD at the Vrije Universiteit Brussel on the better regulation of genetic privacy in biobanks and genomic research through data protection law. He is also programme director for the annual Computers, Privacy and Data Protection conference.

Diana Dimitrova, Editor: Researcher at FIZ Karlsruhe. Focus on privacy and data protection, especially on rights of data subjects in the Area of Freedom, Security and Justice. Completed her PhD at the VUB on the topic of ‘Data Subject Rights: The rights of access and rectification in the AFSJ’. Previously, legal researcher at KU Leuven and trainee at EDPS. Holds LL.M. in European Law from Leiden University.
