28 February 2023

Data blast: EU-US transfers not safe yet, beware ransoms, Spanish lesson on special category data, Australia update, Norwegian fines for compliance failures, Luxembourg CCTV warning, and Direct Marketing lessons from Experian

Halt, says the European Parliament Committee, to the EU-U.S. Data Privacy Framework

In our recent Data Blast issue of 14 February 2023, available here, we reported on the Draft EU-US Adequacy Decision. On the same day, the European Parliament’s Committee on Civil Liberties, Justice and Home Affairs (the ‘Committee’) published a Draft Motion for a Resolution advising the European Commission not to adopt an adequacy decision based on the proposed EU-US Data Privacy Framework, on the basis that it ‘fails to create actual equivalence’ with the EU in the level of data protection that it provides. So the free flow of data is not looking any easier at present.

The Committee’s objections to the EU-U.S. Data Privacy Framework relate to:

  1. Proportionality and Necessity

These principles are long-standing key elements of the EU data protection regime, but the Committee considers that their substantive definitions in Executive Order 14086 (the ‘EO’) are not in line with their definition under EU law and their interpretation by the CJEU. The Committee notes that the EO requires that signals intelligence be conducted in a manner proportionate to the ‘validated intelligence priority’, which appears to be a broad interpretation of proportionality;

  2. Bulk collection of data

According to the Committee, the EO does not prohibit the bulk collection of data by signals intelligence, including the content of communications, and the list of legitimate national security objectives can be expanded by the US President, who can determine not to make the relevant updates public;

  3. Data Protection Review Court (‘DPRC’)

The decisions of the DPRC (which sits within the executive branch, not the judiciary) will be classified and not made public or available to the complainant. The Committee is also concerned that a complainant will be represented by a ‘special advocate’ designated by the DPRC, for whom there is no requirement of independence, and that the redress process provided by the EO is based on secrecy and does not oblige the DPRC to notify the complainant that their personal data has been processed, thereby undermining their right to access or rectify their data. Further, the redress process does not provide for an avenue of appeal in a federal court and therefore, among other things, offers the complainant no possibility of claiming damages. The Committee concludes that the DPRC does not meet the standards of independence and impartiality of Article 47 of the Charter of Fundamental Rights of the European Union.

  4. Insufficient remedies available in commercial matters

Although the US has provided a new redress mechanism for issues related to public authorities’ access to data, the Committee considers the remedies available for commercial matters under the adequacy decision to be insufficient.

  5. No federal data protection law

Unlike all other third countries that have received an adequacy decision under the GDPR, the US does not have a federal data protection law. The EO is not clear, precise or foreseeable in its application, as it can be amended at any time by the US President; the Committee is therefore concerned that the decision could automatically expire after its entry into force.

  6. Legal certainty

Data transfer mechanisms that may subsequently be invalidated by the CJEU create additional costs for European businesses and are particularly burdensome for micro, small and medium-sized enterprises.

In conclusion, the Committee is of the opinion that the EU-US Data Privacy Framework fails to create actual equivalence in the level of protection, calls on the European Commission to continue negotiations with the US, and ‘urges the Commission not to adopt the adequacy finding.’

A Parliament vote on the Resolution is expected in the coming months but, even if passed, the Resolution will not be binding on the European Commission with respect to its adequacy decision.

We shall continue to monitor developments and report on them in our Data Blasts.

The ICO: Paying Ransoms Not a Reasonable Step

Royal Mail (RM) has been the victim of a ransomware attack in which hackers linked to Russia (the LockBit group) broke into RM software and blocked international shipments by encrypting files crucial to the company’s operations, demanding a ransom of $80m (£67m). RM and LockBit entered negotiations, with the LockBit hacker setting a ransom equal to 0.5% of the company’s revenue, which would cost less than the fine that RM could receive from the ICO (organisations can be fined up to 4% of their annual revenue) if it were to become public that the company had failed to protect personal data. RM refused to pay, and the chat transcript ended up on the dark web together, it appears, with the stolen personal data. The full investigation is, we believe, still ongoing.

The ICO has, however, issued a statement confirming that “paying ransoms to release locked data does not reduce the risk to individuals, is not an obligation under data protection law, and is not considered as a reasonable step to safeguard data.”

“The ICO has clarified that it will not take this into account as a mitigating factor when considering the type or scale of enforcement action. It will however consider early engagement and co-operation with the National Cyber Security Centre (NCSC) positively when setting its response.”

In July 2022, the ICO and the NCSC issued a joint letter to the Law Society to remind its members that they should not advise clients to pay ransomware demands should they fall victim to a cyber-attack. There is a general belief, addressed in the statement, that some firms are paying ransoms in the expectation that they will not need to engage with the ICO as a regulator, or that they will benefit by way of reduced enforcement. According to the ICO, this is incorrect.

For the full statement please refer to the ICO link here.

 

A cautionary tale on Special Categories of Data from Spain  

A Spanish talent acquisition company, Thomas International Systems, S.A. (‘Thomas’), provides behavioural tests and surveys used to assess job candidates on behalf of its clients. Candidates are required to complete two assessments. The second questionnaire was stated to be for the purposes of research and improvement of the evaluations conducted by Thomas. This second survey collected several categories of personal data, including special categories of data such as disability and ethnicity. For each question in this second survey, the candidate was presented with a drop-down menu that included the option ‘I prefer not to answer’ for all questions apart from those under the disability category. The second survey also contained text informing candidates that participation was entirely voluntary and that they could skip questions if they wished.

In February 2021 a candidate filed a complaint with the Spanish DPA (the ‘DPA’) against Thomas for requesting disability and ethnicity data without information on how such data would be used. The DPA held that the information contained in Thomas’s privacy policy was too generic and was limited to citing several legal bases, without specifying which legal basis corresponded to each of the controller’s processing operations. Thomas argued that it could rely on Article 9(2)(j) GDPR (scientific research purposes) to process the special category health data. It also argued that the data subject had the option to consent to the processing of ethnicity and disability data, because they could simply choose to refrain from answering these questions. However, the Spanish DPA found that Thomas processed data relating to ethnicity and disability without justifying the applicability of any of the relevant circumstances or exceptions, and specifically held that the ‘scientific research purposes’ exception did not apply.

With regard to consent, the DPA held that the mere indication of voluntariness does not meet the requirements of Article 9(2)(a) GDPR, which states that consent to the processing of special categories of personal data must be explicit. The fact that the data subject could choose whether to fill in the form could not be accepted as a form of consent.

The DPA imposed a sanction of €50,000 (reduced to €40,000 under the provision of Spanish administrative law that allows a fine to be reduced for voluntary payment) and ordered the controller to stop collecting personal data relating to ethnicity and disability through the survey.

 

Australia moves closer to EU-style data protection legislation

The Australian Attorney-General has released the Privacy Act Review Report (the ‘Report’), which contains 116 proposals for reforming the Privacy Act 1988 (the ‘Privacy Act’) to make it ‘fit for purpose’, to ‘adequately protect Australians’ privacy in the digital age’ and to enhance cross-border data flows with Australia.

The Report acknowledges that there have been significant developments in data protection laws internationally in recent years, including in the European Union and the United Kingdom, in response to technological developments in personal information handling. The proposals are designed to better align Australia’s laws with global standards of information privacy protection and to properly protect Australians’ privacy. We will monitor developments; here is a selection of observations (not a full review) on the proposals:

  1. Proposal 4.1: Broader definition of personal information by changing the word ‘about’ in the definition of personal information, to ‘relates to’ (that is, ‘information or an opinion that relates to an identified individual…’)
  2. Proposal 7: Additional obligations when handling employee records. In particular, obligations relating to transparency of collection and use of employee information, protection against unauthorised access or interference, and eligible data breach reporting.
  3. Proposal 11: Amended definition of consent, which must be voluntary, informed, current, specific and unambiguous.
  4. Proposal 12: The requirement to act fairly and reasonably when collecting, using and disclosing personal information
  5. Proposal 22: Introduction of the concept of processors and controllers in Australian law.
  6. Proposal 18.3: Introduction of a right of erasure.
  7. Proposals 19 and 20: Regulation of the use of personal information in automated decision making and in targeted advertising.
  8. Proposal 25: Greater enforcement powers and penalties.

 

The Norwegian DPA fines fitness chain ‘SATS’ close to €1,000,000 for breach of GDPR

Four members of the fitness chain SATS lodged complaints with the Norwegian DPA (Datatilsynet) between October 2018 and December 2021, alleging violations by SATS of various GDPR rules: failing to respond in a timely way to two separate access requests; failing to take prompt action and erase certain personal data without undue delay; failing to duly inform data subjects about its data retention policy concerning the personal data of banned members and the relevant legal basis for the processing; and failing to rely on a valid lawful basis to process the training history data of the members of its fitness centres.

According to one of these members, SATS had transferred data to Facebook, located outside the EU/EEA, without a proper legal ground, and their access request remained unanswered. Other members complained that SATS refused to comply with erasure requests. SATS responded to the DPA, arguing that it was wrong to find a violation of the GDPR based on a failure to respond to an access request submitted around a month after the GDPR became applicable in Norway, ‘as at that time many companies experienced challenges in applying the new rules.’ The DPA rejected this argument, confirming that other companies’ non-compliance was not a valid justification for a violation of the GDPR that started to occur in September 2018.

The DPA issued an administrative fine of NOK 10,000,000 (ten million) (approximately €916,000).

 

In Luxembourg, a bank has been fined €10,000 in relation to the use of CCTV on its staff

The Luxembourg DPA (CNPD) investigated a bank to verify the compliance of its video surveillance systems with the GDPR. The investigation showed that surveillance cameras were installed in safe rooms, meeting rooms, the reception desk, the cash desk, offices, a computer room and a room where employees take breaks. The CNPD found that this amounted to permanent surveillance of employees, who could not escape the cameras; it was disproportionate to the purpose and could create psychological pressure. The investigation also showed that employees were informed about the use of cameras, but only by way of a pictogram and an old CNPD authorisation sticker at the entrance door and at a passageway closed to the public, which the CNPD considered unsatisfactory. According to the CNPD, this information was incomplete because it did not provide, among other things, the retention period, the purposes of the processing, or the rights to rectification and erasure.

The DPA imposed a fine of €10,000 and ordered corrective measures: 1) to stop filming the employees’ workstations and, where this cannot be avoided, to arrange for their faces to be blurred; 2) to obscure the public area within the cameras’ field of vision; and 3) to provide a single place where all the information required by Article 13 is available.

 

Tribunal Rules on UK ICO Direct Marketing Case Against Experian

In 2018, the ICO began investigating how Experian and two other credit reference agencies used the personal data they received for direct marketing. The ICO issued an enforcement notice in October 2020. Experian appealed this notice to the First-Tier Tribunal. In a mixed ruling, the Tribunal largely sided with Experian, but did support the ICO in certain respects.

The Tribunal’s Substituted Decision Notice (‘SDN’) requires Experian to do the following:

  1. Within three months, set up a system to provide data subjects whose information is obtained from the Open Electoral Register, the Registry Trust Limited or Companies House with a GDPR-compliant privacy notice.
  2. Within a year, provide such a privacy notice to all existing data subjects in that category.

The Notice does not require Experian to provide a privacy notice where Experian:

  • Obtains personal data from its credit reference agency business, consumer services business or third-party commercial suppliers;
  • Limits the processing of personal data to the retention or sale of data from the Open Electoral Register;
  • Processes personal data solely in connection with its directory enquiry or suppression databases; or
  • Ceases to process personal data about a data subject who would otherwise be sent the privacy notice for direct marketing purposes within 12 months of the Decision Date.

The Tribunal’s SDN requires Experian to provide privacy notices on a much smaller scale than the original ICO enforcement notice. In issuing the SDN, the Tribunal stated that it ‘must stand in the shoes of the Information Commissioner and ask whether the Information Commissioner should have exercised her discretion differently.’ The Tribunal found that the ICO ‘fundamentally misunderstood the actual outcomes of Experian’s processing.’ A major factor was the consequences of the processing of the data. The Tribunal agreed with Experian that the ‘worst outcome of Experian’s processing is that an individual is likely to get a marketing leaflet which might align to their interests rather than be irrelevant.’

The ICO is currently deciding whether it will appeal the Tribunal’s decision.