Data Protection Insider, Issue 23

– ECtHR Rules on Journalist’s Right to Privacy –

In Khadija Ismayilova v. Azerbaijan (no. 3), the ECtHR dealt with the balance between privacy and freedom of expression. According to the facts of the case, the applicant is an investigative journalist in Azerbaijan who published articles critical of the government, including on government corruption. In 2012, she was secretly filmed in her bedroom with her ex-boyfriend. The videos were published online and received coverage in the pro-government media – events which were the subject of previous ECtHR findings of a violation of Article 8 ECHR. Subsequently, a newspaper published an article which made degrading statements about the applicant’s private and sexual life. The Azerbaijani courts declined to sanction the newspaper, suggesting that doing so would interfere with freedom of expression. The applicant thus took the case to the ECtHR, claiming that the domestic courts had failed to effectively protect her right to private life. In its ruling, the ECtHR accepted that the contested article referred to the previously released videos and the related media coverage, and that its statements were pejorative, and found a violation of Article 8 ECHR for two main reasons. First, the domestic courts had not adequately examined the balance between the two competing human rights (to private life and to freedom of expression). In this regard, the domestic courts had ignored “the importance and scope of the applicant’s right to respect for her private life” and had failed to take into account the ECtHR’s case-law on how to balance the two rights. Second, the examination of the newspaper’s freedom of expression rights was overly brief and lacked an in-depth consideration of “whether the statements made about the applicant were compatible with the ethics of journalism and whether they had overstepped the permissible bounds of freedom of expression.” This case is a reminder that journalistic freedom of expression has limits: not all information which can be made publicly available falls within the protection of Article 10 ECHR, and the legitimate publication of information which does fall within its scope can only follow a careful balancing of the rights which may be affected by publication.


– EDPB Holds 25th and 26th Plenary Sessions –

Over the past two weeks, the EDPB held its 25th and 26th Plenary Sessions. There is little information available on the discussions and outcomes of the 25th session. However, in the 26th session, the EDPB adopted one document and engaged in one significant information exchange:

  • The EDPB adopted a ‘letter in response to requests from MEPs Metsola and Halicki regarding the Polish presidential elections taking place via postal vote’. According to the press release, the letter predominantly concerns the transmission of the national identification database, by one of the national ministries, to the national postal service.
  • The EDPB engaged in a significant exchange as to the state of data protection in Hungary. The Hungarian data protection authority provided the EDPB with an update as to the ‘legislative measures the Hungarian government has adopted in relation to the coronavirus during the state of emergency.’ The EDPB consider that further clarification is needed and will revisit the situation in a subsequent Plenary Session.

Documents not yet available on the EDPB’s website should be made available shortly, following internal checks.


– EDPB Updates Guidelines on Consent –

On 4th May 2020, the EDPB released an update to the Guidelines on consent under the GDPR – originally released on 10th April 2018 by the Article 29 Working Party and subsequently endorsed by the EDPB in its first Plenary Session. The update comes on the back of EDPB recognition of the need for further clarification of two aspects of the original Guidelines. First: ‘The validity of consent provided by the data subject when interacting with so-called “cookie walls”’. Here, the EDPB has updated the original Guidelines’ discussion of conditionality and consent. The EDPB now state: ‘access to services and functionalities must not be made conditional on the consent of a user to the storing of information, or gaining of access to information already stored, in the terminal equipment of a user’. They also provide an example of a service using a cookie wall, for which consent would not be a legitimate basis for processing. Second: ‘on scrolling and consent’. Here, the EDPB has updated the original Guidelines’ discussion of consent and the need for an unambiguous indication of wishes. The EDPB now state: ‘actions such as scrolling or swiping through a webpage or similar user activity will not under any circumstances satisfy the requirement of a clear and affirmative action: such actions may be difficult to distinguish from other activity or interaction by a user and therefore determining that an unambiguous consent has been obtained will also not be possible. Furthermore, in such a case, it will be difficult to provide a way for the user to withdraw consent in a manner that is as easy as granting it.’ These clarifications will likely seem self-explanatory to those familiar with data protection law and with the original Guidelines. One should not underestimate, however, the ability of controllers to come up with peculiar interpretations of data protection law.


– EP: Framework of Ethical Aspects of Artificial Intelligence, Robotics and Related Technologies –

On 21st April, the Committee on Legal Affairs at the European Parliament issued a motion for a European Parliament resolution, containing recommendations to the European Commission concerning artificial intelligence, robotics and related technologies. The motion included a proposal for a Regulation. In its resolution and legislative request, the Committee focuses on the following major issues: the definition of the main terms; human-centric and human-made AI, robotics and related technologies; risk assessment of such technologies; safety, transparency and accountability; safeguards against discrimination and bias; respect for gender balance and social responsibility; privacy and data protection; sustainable and environmentally friendly AI; and governance standards to be issued by designated independent National Supervisory Authorities, which should assess whether the technologies covered by the proposed Regulation are high-risk and monitor compliance with the proposed Regulation. In addition, the resolution proposes that an EU Agency for Artificial Intelligence and a European Certificate of Ethical Compliance be established. It is welcome that the European Parliament takes the ethical deployment of disruptive technologies such as AI and robotics so seriously. However, the content of the resolution and the proposal for a Regulation leave many questions open. In particular, it is unclear how the proposed Regulation would relate to existing legal instruments, such as the GDPR. For example: how would the proposed supervisory authorities interact with existing DPAs, and would the former have any enforcement powers; what would be the added value of the proposed EU Agency for Artificial Intelligence; and, where gaps in data protection rules concerning AI have been identified – for example concerning the transparency and explainability of AI to individuals – what would the proposed Regulation contribute to closing them? On a procedural level, the motion is interesting as an instance of the European Parliament making use of its power under Article 225 TFEU to request that the Commission submit a legislative proposal.


– Developments in AdTech –

The last couple of weeks have brought several developments in the ongoing AdTech investigations. Two deserve discussion. First, the ICO has decided ‘to pause [the] investigation into real time bidding and the AdTech industry’. The ICO suggest their decision is based on their general regulatory approach during the COVID-19 state of emergency – an approach in which data controllers are generally given more leeway – and a desire to avoid putting any ‘undue pressure on any industry at this time’. Despite the announcement, the ICO are clear that their concerns about AdTech remain and that the investigation will continue in the future. The criteria for determining when the investigation will resume, however, are not provided. Second, the IAB have released ‘A Guide to the Post Third-Party Cookie Era’. The document is not specifically a clarification of the legal issues concerning third-party cookies or the legitimacy of their use. Rather, it provides a general discussion of the consequences and options for advertisers resulting from the diminished significance of third-party cookies in the online AdTech system. In this regard, the document highlights changes in the legal environment – particularly in Europe – as a significant cause of this decline. The document makes interesting reading, both in its tacit recognition of the changes in AdTech practice resulting from data privacy concerns and in its evaluation of the consequences and options for online advertising resulting from these changes. Two discussions will be of particular interest to data protection experts: i) the discussion of a potential shift in advertising power to proprietary platforms – i.e. advertising data sources solely controlled by specific organisations, such as large publishers or data companies; and ii) the discussion of alternative approaches available to advertisers to meet the deficit in advertising opportunities created by the demise of third-party cookies.


– EU DPAs: Underresourced? –

According to the results of a recent study by the browser maker Brave, DPAs in EU Member States have not been allocated sufficient financial and human resources to perform their duties. Because of this, Brave suggest that DPAs have not been able to enforce the GDPR properly and that this poses a risk to its effectiveness. The example Brave provide is the high financial risk DPAs must take if they wish to start proceedings against tech giants like Google and Facebook: the lack of sufficient resources to cover the legal costs acts as a deterrent, even in cases where infringements are obvious. In this regard, it is noted that the Irish DPC has not yet issued any fines against any of the big tech companies for which it is the lead supervisory authority in the EU. As a result of the study, Brave have submitted a complaint to the European Commission against all 27 EU Member States for “failing to adequately implement the GDPR” by not providing DPAs with adequate resources. It remains to be seen whether the European Commission will pursue the complaint and whether it will eventually result in proceedings against any Member States before the CJEU. If this does happen, it will be interesting to see how criteria for adequate financial and human resources are defined and how requirements for each Member State could accordingly be established.

About

DPI Editorial Team

Dara Hallinan, Editor: Legal academic working at FIZ Karlsruhe. His specific focus is on the interaction between law, new technologies – particularly ICT and biotech – and society. He studied law in the UK and Germany, completed a Master’s in Human Rights and Democracy in Italy and Estonia and wrote his PhD at the Vrije Universiteit Brussel on the better regulation of genetic privacy in biobanks and genomic research through data protection law. He is also programme director for the annual Computers, Privacy and Data Protection conference.

Diana Dimitrova, Editor: Researcher at FIZ Karlsruhe. Focus on privacy and data protection, especially on rights of data subjects in the Area of Freedom, Security and Justice. Completed her PhD at the VUB on the topic of ‘Data Subject Rights: The rights of access and rectification in the AFSJ’. Previously, legal researcher at KU Leuven and trainee at EDPS. Holds LL.M. in European Law from Leiden University.
