Data Protection Insider, Issue 47

– CJEU Offers Guidance on the Law Enforcement Directive

On 12th May, the CJEU rendered one of its first judgments providing guidance on the interpretation and application of the Law Enforcement Directive (LED). As to the facts of the case, the applicant, a German national and resident, was the subject of a red notice entered in the INTERPOL database by the USA. He argued that the processing of his data in relation to the red notice by the German authorities in their national files breached the LED because, in his opinion, he had already been tried and punished in Germany for the same act (the ne bis in idem principle). In addition, the applicant raised the question of whether INTERPOL, as an international organisation, offered an adequate level of data protection as required by the LED. The CJEU decided that, in order to determine whether the processing of the applicant’s data by the German authorities was lawful, an assessment of the level of data protection offered by INTERPOL was not necessary: in casu, the question before the Court did not concern the transfer of personal data by the German authorities to INTERPOL, but the processing of personal data by the German authorities on the basis of INTERPOL data. The Court then examined the processing of the data by the German law enforcement authorities against the background of the LED. This examination provides useful guidance on the following four points: (i) the recording of personal data contained in INTERPOL’s red notices in national files for law enforcement purposes constitutes personal data processing to which the LED applies; (ii) processing of personal data in the framework of INTERPOL pursues a legitimate purpose within the meaning of the purpose limitation principle; (iii) the lawfulness of the processing might be called into question where the processed data concern a case to which the ne bis in idem principle applies; in that respect, however, the Court noted that the processing of the said data by INTERPOL cannot be considered unlawful under the LED, as the LED does not apply to INTERPOL, and that EU law enforcement authorities might need to process the data precisely in order to determine whether the ne bis in idem principle applies in a specific case; and (iv) where it has been established that the ne bis in idem principle applies, the data subject concerned should have the right to request the deletion of the data from the national files of the EU Member States or, at least, an annotation in those files indicating that the ne bis in idem principle applies. We observe that the question of INTERPOL’s adequacy in light of the LED and the Schrems case law is very pertinent, and the Court could have examined it in the present case. As to the examination of the above-mentioned points, we note that the conclusions contain no surprises and provide a useful reference for those working with the LED. It would be interesting to know why the Court offered the annotation of the data as an alternative to their deletion, and whether this would always offer an adequate solution for the data subjects concerned.

 

– Council of Europe Declaration on Children’s Privacy Online

On 28th April, the Council of Europe’s Committee of Ministers adopted a Declaration on ‘the need to protect children’s privacy in the digital environment’. In adopting the Declaration, the Committee highlights a series of risks faced by children in relation to privacy and the digital environment. In this regard, the Committee ‘[c]onsiders that frameworks and measures should guarantee the effective protection of the rights of the child and, to that end, expresses the need to intensify efforts in this area by calling on member States to [engage in certain actions]’. The specific actions elaborated in the Declaration are as follows: i) ‘ratify and implement Convention 108+’; ii) ‘actively promote Recommendation CM/Rec(2018)7 of the Committee of Ministers to member States on Guidelines to respect, protect and fulfil the rights of the child in the digital environment’; iii) ‘actively promote the Guidelines on children’s data protection in an education setting’; iv) ‘develop and promote measures to support critical digital literacy, youth empowerment initiatives and parenting skills; robustly enhance public awareness and improve access to digital citizenship education and knowledge, as proposed in Recommendation CM/Rec(2019)10 of the Committee of Ministers to member States on developing and promoting digital citizenship education’; v) ‘in the context of the Covid-19 pandemic, exercise [increased] vigilance through the implementation of enhanced safety and safeguarding measures as regards the use of technology and processing of children’s data…to minimise potential adverse effects…while ensuring adequate accountability and remedies and take steps to bridge the digital divide among children to ensure full enjoyment of their human rights’; vi) ‘ensure that children are not subjected to arbitrary or unlawful interference with their rights in the digital environment, promote the implementation of regular child rights risk assessments in relation to digital technologies, products, services and policies and apply safety by design, privacy by design and privacy by default as guiding principles when these are addressed to or used by children’; vii) ‘enhance co-ordination and co-operation among data-protection authorities, relevant institutions, entities, communities and other actors’; viii) ‘co-operate with other States and relevant stakeholders to address the risks for and impact on children and their rights posed by the development and use of AI systems and take any further measures to ensure that children’s rights are respected’; ix) ‘invest in research and knowledge development on the rights of the child in the digital environment, and in child and youth participation, taking into account the needs and rights of children’; x) ‘make use of the opportunities that digital technologies, including AI, present for the enjoyment of the rights of the child’; xi) ‘promote and contribute to exchanges of expertise and good practices for capacity building; establish and promote standards, regulations and other practical measures that enable the realisation of children’s rights in the digital environment, through reinforced co-operation at international and regional levels’. Whilst its eventual impact is hard to foresee, the Declaration is nevertheless welcome, as it deals with an important and under-discussed issue.

 

 

– EDPB on the Draft Second Additional Protocol to the Budapest Convention

On 4th May, the EDPB published its ‘contribution to the 6th round of consultations on the draft Second Additional Protocol to the Council of Europe Budapest Convention on Cybercrime’. The EDPB highlights, however, that the contribution constitutes only preliminary feedback, as stakeholders were given only three weeks to respond in this round of consultations. In terms of content, the EDPB’s contribution addresses: ‘[The] Effect of the Additional Protocol on and Interaction with EU Law in the Field of Personal Data Protection’ – including, for example, a discussion of the need to ensure that the level of protection provided for EU citizens’ personal data in the Protocol does not fall below that provided by EU law; ‘Common Provisions and Measures for Enhanced Cooperation (Chapters I And II Of The Draft Protocol)’ – including, for example, discussions of the need for the ‘Systematic involvement of judicial authorities of the requested Party’ and of the definition of, and access requirements for, subscriber data; and ‘Conditions and Safeguards related to the Protection of Personal Data (Chapters III And IV of the Draft Protocol)’ – including, for example, discussions on sensitive data, onward transfers, remedies and oversight.

 

– Germany Pulls the Emergency Brake on WhatsApp

In recent months, the update to WhatsApp’s Terms and Conditions (T&C), which has been argued to grant Facebook new powers to process personal data, has been much debated. Recently, the Hamburg Data Protection Authority (DPA) temporarily banned the planned update of WhatsApp’s T&C by triggering the urgency procedure provided for in the GDPR (Article 66). As a result, the rollout of the new T&C is supposed to be delayed in Germany for three months. In parallel, the Hamburg DPA is said to be pushing for the EDPB to take a binding decision on the matter which would apply across all 27 Member States. The EDPB has confirmed that a DPA may request such a binding decision; this procedure, which derogates from the one-stop-shop principle and the consistency mechanism, would allow the EDPB to make the provisional measure final or to extend it. It has been noted that the Hamburg DPA’s move in practice challenges the inaction of the Irish Data Protection Commission (DPC), which, as the lead supervisory authority for Facebook, has reportedly declined to investigate the issue. In response, the Irish DPC has pointed out that its decision in relation to WhatsApp’s T&C update will be dealt with under the dispute resolution procedure, as the EU DPAs could not reach a consensus on the draft decision, which was sent to the other European DPAs in January. According to TechCrunch, the Hamburg DPA might have felt encouraged to take this decisive step following AG Bobek’s Opinion on the powers of non-lead supervisory authorities, a matter on which the CJEU has yet to render a judgment. Facebook reportedly claims that the decision of the Hamburg DPA is flawed, as it is based on a misunderstanding of the WhatsApp update.

 

 

– Irish High Court Dismisses Facebook Judicial Review Re: Data Transfer 

The Irish High Court has rejected a judicial review procedure brought by Facebook against the Irish DPC. Facebook claimed that a preliminary decision taken by the DPC concerning Facebook’s transfers of personal data from the EU to the US, as well as the procedures used to arrive at that decision, should be declared invalid. Facebook put forward a number of arguments in this regard, including: (i) that the procedure used by the DPC departed from its usual practice – as outlined in its 2018 Annual Report and on its website – and breached Facebook’s legitimate expectations; and (ii) that Facebook was not treated equally in comparison with other controllers. The court, however, whilst accepting that the DPC’s procedure might be subject to judicial review, did not accept any of Facebook’s claims regarding the illegitimacy of the decision or of the procedure. The court highlighted that the preliminary decision itself was legitimate and that, in the circumstances of the case, it was open to the DPC to conduct the process against Facebook as it had. Politico further reports that the court agreed with the DPC’s position that transfers of personal data from the EU to the US should not proceed on the basis of Standard Contractual Clauses (SCCs) in light of the 2020 Schrems II judgment. Facebook will now need to respond to the preliminary decision as the process moves ahead.

 

– Google Analytics to Work without Cookies

Last Thursday, Google announced a new feature: it will allow companies to gain measurement insights in products such as Google Analytics without having to place third-party cookies. This is to be made possible by machine learning and modelling, which enable behavioural reporting in Google Analytics. In effect, ‘if there is incomplete data in a User Acquisition report because cookies are unavailable, Google will use modeling to help fill gaps for a more complete view of the number of new user campaigns acquired. With or without cookies, advertisers will have the ability to improve on their understanding of the customer journey across apps and websites and use those insights to improve campaigns.’ MediaPost notes that, as internet users become more privacy-aware, advertisers are becoming more creative and turning to first-party data. It also notes that the technology will rely on a tagging infrastructure which will allow advertisers to ‘modify and customize tag behavior in response to users’ consent preferences’ (see the sketch below). We observe that this move might be the result of the stricter policies on cookies in the EU following the entry into force of the GDPR and the ongoing e-Privacy discussions, as well as of the debates about the future of the Adtech industry, in which cookies play a central role. It also signals the industry’s technical capability to work around cookie restrictions by making up for the missing information through more sophisticated analytical tools, calling into question the effectiveness of the EU data protection rules. It remains to be seen whether the new analytical tools will be implemented and what stance data protection authorities and experts will take as regards their compliance with the GDPR.
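For readers curious what such consent-responsive tag behaviour can look like in practice, the following is a minimal TypeScript sketch loosely modelled on Google’s Consent Mode API. The announcement discussed above does not name that API, so the mapping is our assumption, and the measurement ID and the onUserConsentChoice() handler are hypothetical placeholders rather than anything Google has published here.

```typescript
// Minimal sketch of consent-aware tagging, loosely modelled on Google's Consent Mode.
// The measurement ID and onUserConsentChoice() are hypothetical placeholders.

// gtag() is provided globally by the gtag.js snippet once it is loaded on the page.
declare function gtag(...args: unknown[]): void;

// Before any measurement runs, default all storage-based signals to "denied",
// so no analytics or advertising cookies are set without consent.
gtag('consent', 'default', {
  ad_storage: 'denied',
  analytics_storage: 'denied',
});

// Configure the (placeholder) Analytics property. With storage denied, the tag
// operates without setting cookies; per the announcement quoted above, modelling
// would then be used on Google's side to fill the resulting reporting gaps.
gtag('config', 'G-XXXXXXXXXX');

// Hypothetical callback wired to the site's consent banner: once the user makes a
// choice, the tag's behaviour is updated to match their preferences.
function onUserConsentChoice(analyticsAllowed: boolean, adsAllowed: boolean): void {
  gtag('consent', 'update', {
    analytics_storage: analyticsAllowed ? 'granted' : 'denied',
    ad_storage: adsAllowed ? 'granted' : 'denied',
  });
}
```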

About

DPI Editorial Team

Dara Hallinan, Editor: Legal academic working at FIZ Karlsruhe. His specific focus is on the interaction between law, new technologies – particularly ICT and biotech – and society. He studied law in the UK and Germany, completed a Master’s in Human Rights and Democracy in Italy and Estonia and wrote his PhD at the Vrije Universiteit Brussel on the better regulation of genetic privacy in biobanks and genomic research through data protection law. He is also programme director for the annual Computers, Privacy and Data Protection conference.

Diana Dimitrova, Editor: Researcher at FIZ Karlsruhe. Focus on privacy and data protection, especially on rights of data subjects in the Area of Freedom, Security and Justice. Completed her PhD at the VUB on the topic of ‘Data Subject Rights: The rights of access and rectification in the AFSJ’. Previously, legal researcher at KU Leuven and trainee at EDPS. Holds LL.M. in European Law from Leiden University.
