3 December 2019

Data Blast: Germany issues €14.5 million fine for lax data security; UK and French regulators address facial recognition technology; Austrian Postal Service fined for multiple GDPR breaches

German real-estate firm fined €14.5 million over unlawful data storage

On November 5th, the Berlin Commissioner for Data Protection and Freedom of Information (the ‘Commissioner’) announced that it had fined Deutsche Wohnen SE (‘DW’), a German real estate company, €14.5 million for GDPR violations. It is the highest fine issued in Germany in the GDPR era.

Having conducted on-site investigations of DW in 2017 and 2019, the Commissioner found that DW was retaining the personal data of tenants for an unlimited period, without assessing whether doing so was necessary or legitimate. The personal data in question included financial information, such as bank statements and tax documents, as well as health insurance data.

The Commissioner was able to access personal data in DW’s archive going back many years that was no longer being used for the purpose for which it was initially collected. Furthermore, it was discovered that DW’s archiving system did not allow for the removal of data no longer required for that purpose.

The Commissioner noted that, following its 2017 on-site visit, DW made modest improvements to its system. However, these improvements were insufficient to comply with the GDPR principle of data minimisation. An aggravating factor in the Commissioner’s fining decision was that DW had deliberately created the problematic archive and had processed the affected data for several years.

In considering the size of the fine, the Commissioner noted that DW has a global turnover of roughly €1 billion, allowing for a maximum fine of €28 million. However, the Commissioner took into account DW’s cooperation with the investigation, as well as the fact that DW took appropriate steps to remedy the situation when the issue was brought to its attention this year, and therefore did not impose the maximum fine.

The fine is not yet final, and DW may still elect to appeal the Commissioner’s decision.

ICO issues Opinion concerning police use of facial recognition technology

On October 31st, the UK Information Commissioner’s Office (ICO) issued its first ever Commissioner’s Opinion, concerning the use of live facial recognition (LFR) for law enforcement purposes.

The Opinion laid out the ICO’s view of the technology’s deployment as it related to data protection law, noting that:

  • The use of LFR involves processing personal data and therefore data protection law applies, whether for the purposes of a trial or routine operational deployment, and that it is covered by Part 3 of the DPA 2018;
  • The use of LFR for law enforcement purposes constitutes ‘sensitive processing’ (section 35(8)(b) DPA) as it involves the processing of biometric data for the purposes of uniquely identifying individuals;
  • Sensitive processing relates to all facial images captured and analysed by the software; as a result, a Data Protection Impact Assessment and an ‘appropriate policy document’ must be put in place;
  • Sensitive processing occurs regardless of whether an image yields a match for an individual included on a watchlist, or whether the biometric data in question is subsequently deleted within a short timeframe;
  • Data protection law concerns the whole LFR process, from considerations about necessity and proportionality to the processing and retention/deletion of biometric data; and
  • Controllers must identify a lawful basis for LFR use.

The Opinion also states that the ICO plans to consult with relevant authorities to develop a binding code of practice, specifically focused on law enforcement use of LFR and biometric technology, to sit alongside data protection legislation. This will be supplemented with more detailed guides for police compliance with data protection law, in light of the Administrative Court’s decision in September that the use of automated facial recognition by South Wales Police was in fact lawful.

CNIL publishes its contribution to the debate on the use of facial recognition technology

The French data protection authority (CNIL) has published a note setting out its views on the public debate which must take place to agree limits on the use of facial recognition technology. The CNIL states that this will require political choices to be made: on the role of technology, on its impact on the fundamental rights of individuals, and on the place of humans more broadly in the digital era. According to the CNIL, such public debate should aim to determine when the use of facial recognition technology is necessary in our democratic society, and it cautions that the use of such technology may otherwise quietly become embedded in everyday life.

The CNIL note sets out its view of the technical, legal and ethical elements which should frame such debate. The CNIL summarises its aims as follows:

  1. To present, from a technical perspective, what constitutes facial recognition technology. The CNIL believes that this evaluation must be made on a case-by-case basis.
  2. To highlight the technological, ethical, and societal risks linked to such technology. This is said to be necessary in order to determine which uses are not acceptable in a democratic society, and which can be adopted with appropriate safeguards.
  3. To recall the legal framework within which the testing and implementation of such technology is permitted. The GDPR and related national laws regulate the processing of biometric data; those requirements must be met even in the evaluation phase for facial recognition technology.
  4. To set out the role of the CNIL in the future trial deployment of facial recognition technologies. The CNIL notes that its role is not as a decision-maker in such a process; the tasks of defining and carrying out any such experimental deployment will fall to the government and to parliament. The CNIL states that it will strictly maintain its independence in order to be able to fulfil its mandate as the data protection regulator.

Austrian Postal Service fined for multiple GDPR breaches

On October 23rd, the Austrian Data Protection Authority (DPA) announced that it had fined the Austrian Postal Service €18 million for various breaches of the GDPR.

Upon investigation, the DPA found that the Postal Service had unlawfully processed customer data, including ages and addresses, in order to assess those customers’ likely political leanings, and had then sold its findings.

The DPA determined that another violation had been committed, as the Postal Service had processed data regarding package delivery frequency and the frequency of relocations for the purposes of direct marketing. The DPA found a high level of culpability on the part of the Postal Service, which contributed to the considerable fine levied.

The fine is not final, and it may, within four weeks of delivery of the penalty notice, be appealed to the Federal Administrative Court of Austria.

For more information, please contact partner James Tumbridge at jtumbridge@vennershipley.co.uk.