21 March 2024

Data Blast: European Commission announces AI Office, facial recognition problematic for vending machines and employee monitoring, UK Ministry of Defence fined for data breach, record data protection fine in Spain, and more!

European Commission announces plan to create AI Office

The European Commission has unveiled the Commission Decision formally establishing the European AI Office, marking another key step in the development of its regulatory framework for AI.

The AI Office will be integrated into the administrative framework of the Directorate-General for Communications Networks, Content and Technology. Notably, the Office is designed to complement rather than supersede the powers and competences of national competent authorities, EU bodies, offices, and agencies involved in overseeing AI systems, as outlined in the newly adopted AI Act.

The AI Office will have the following key functions and tasks:

  • Execution of AI Act tasks: The AI Office is charged with carrying out the tasks assigned to it under the AI Act, which are central to the regulation and governance of AI systems within the EU.
  • Facilitating trustworthy AI systems: A primary focus of the AI Office is to support the accelerated development, deployment, and uptake of trustworthy AI systems and applications, with the stated aim of delivering societal and economic benefits and enhancing the EU’s overall competitiveness and economic growth.
  • Market and technological monitoring: The AI Office is mandated to monitor the evolution of AI markets and technologies.
  • Capability evaluation tools: The AI Office will develop tools, methodologies, and benchmarks for assessing the capabilities of general-purpose AI models.
  • Risk mitigation strategies: In line with responsible AI governance, the AI Office will proactively monitor for unforeseen risks stemming from general-purpose AI models, including responding to alerts from its scientific panel.

The establishment of the AI Office introduces a pivotal regulatory player in the realm of AI governance within the European Union. Legal professionals operating in the technology and regulatory spheres will need to monitor developments closely, as the AI Office’s activities are poised to shape the future legal landscape surrounding AI systems.

The EU’s commitment to fostering trustworthy AI aligns with the broader global discourse on responsible and ethical AI practices. An ever-increasing number of voices are contributing to the AI ethics and safety space, with recent publications including:

  • The UN AI Advisory Body’s interim report, Governing AI for Humanity;
  • The Council of Europe’s Committee on Artificial Intelligence draft Framework Convention on AI; and
  • The UK AI Safety Institute’s outline of principles and procedures for its forthcoming International Scientific Report on Advanced AI Safety, which is anticipated in Q2 of 2024.

ICO orders Serco to cease facial recognition and fingerprint scanning for employee attendance monitoring

The UK Information Commissioner’s Office (‘ICO’) has issued enforcement notices directing Serco Leisure, Serco Jersey, and seven affiliated community leisure trusts to halt the use of facial recognition technology (‘FRT’) and fingerprint scanning for monitoring employee attendance. The ICO’s investigation concluded that more than 2,000 employees across 38 leisure facilities were subjected to the unlawful processing of biometric data for attendance checks and payment.

The investigation found that Serco Leisure failed to justify the necessity and proportionality of employing FRT and fingerprint scanning when less intrusive alternatives, such as ID cards or fobs, were available. Notably, employees were not offered clear alternatives and were compelled to undergo biometric scans as a prerequisite for payment. This power imbalance between employer and employees inevitably raised concerns about whether biometric data was submitted voluntarily.

In response to these findings, the ICO has issued enforcement notices instructing Serco Leisure and the affiliated trusts to cease all processing of biometric data for attendance monitoring. Additionally, they must, within three months of receiving the notices, destroy all biometric data that they are not legally required to retain.

John Edwards, the UK Information Commissioner, emphasised the uniqueness and sensitivity of biometric data, and especially the risks associated with inaccuracies or security breaches. Edwards criticised Serco Leisure for prioritising business interests over employee privacy, stating that the absence of a clear opt-out mechanism and the mandatory nature of biometric data submission were neither fair nor proportionate under data protection laws.

This enforcement action coincides with the ICO’s release of new guidance for organisations contemplating the use of biometric data. The guidance outlines compliance measures with data protection laws and underscores the need for organisations to carefully consider potential risks, including accuracy errors and bias, associated with biometric technologies.

University of Waterloo removes vending machines with facial recognition technology

The University of Waterloo (‘UoW’) in Ontario, Canada, is dismantling 29 vending machines equipped with facial recognition technology following student-led opposition and concerns about privacy violations. The controversy emerged after a Reddit user noticed an error message on one of the machines, bringing attention to the questionable use of facial analysis technology.

In an article for the campus newspaper, River Stanley, a computer science student, questioned the necessity of facial recognition technology in vending machines and exposed the lack of transparency surrounding its implementation. UoW promptly responded by requiring the removal of all 29 facial recognition-enabled vending machines, manufactured by the Switzerland-based company Invenda, ‘as soon as possible’, with the software to be disabled in the interim.

Invenda stated that its technology employs facial analysis rather than facial recognition, emphasising that it does not store data or photos; its sole purpose is to detect when a person is in front of the vending machine so that the screen can transition from standby mode to sales mode.

Critics argue that the distinction between facial analysis and facial recognition does little to address the underlying privacy concerns. Stanley highlighted the absence of any markings indicating the presence of cameras on the vending machines. The President of the Privacy and Access Council of Canada, Sharon Polsky, asserted that customers should be informed and given the choice to opt in to any such processing. She noted that the demographic attributes collected, such as age and gender, could aid retailers in determining product preferences.

While Invenda insists that its technology is limited to people detection and basic demographic analysis, privacy concerns persist, and the incident recalls past controversies, such as the development of similar technology in Canadian shopping centres. UoW’s response reflects a wider debate about the ethical use of biometric technologies and the need for transparent and consensual implementation. As privacy concerns continue to escalate, calls for stricter legislation and comprehensive investigations into such technologies are likely to gain further momentum.

UK ICO fines Ministry of Defence for data breach

The ICO has imposed a £350,000 fine on the UK Ministry of Defence (‘MoD’) for a data breach involving the personal data of individuals seeking relocation to the UK after the Taliban took control of Afghanistan in 2021. The breach occurred when an email containing personal information of 245 people was sent to a distribution list using the ‘To’ field, inadvertently disclosing the data to all recipients.

The breached data, managed by the UK’s Afghan Relocations and Assistance Policy (‘ARAP’) team, could have posed a threat to the lives of individuals had it fallen into the hands of the Taliban. The MoD’s response included contacting affected individuals, requesting the deletion of the email, and implementing internal measures to prevent future breaches.

The ICO found that the ARAP team lacked appropriate technical and organisational measures at the time of the incident: it relied on ‘blind carbon copy’ (BCC) emails, which carry an inherent risk of human error such as the ‘To’ field being used by mistake, and had no secure data transfer service in place. The ICO stressed that organisations handling sensitive personal data should instead use secure methods, such as bulk email services or mail merge, to minimise the risk of similar errors.
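By way of illustration only, the sketch below shows the kind of per-recipient sending that such guidance points towards, using Python’s standard library. The SMTP host, sender, and recipient addresses are placeholder assumptions, not details from the ICO’s notice; the point is simply that each notice is sent as a separate message, so no recipient’s address ever appears alongside another’s and the ‘To’-versus-BCC failure mode cannot arise.

```python
# Illustrative sketch only (placeholder host, sender, and recipients), not the
# MoD's actual system. Sending one message per recipient means addresses are
# never aggregated in a shared 'To' or 'CC' field.
import smtplib
from email.message import EmailMessage

SMTP_HOST = "smtp.example.org"          # placeholder SMTP server
SENDER = "notifications@example.org"    # placeholder sender address
RECIPIENTS = ["alice@example.org", "bob@example.org"]  # placeholder list

with smtplib.SMTP(SMTP_HOST) as smtp:
    for addr in RECIPIENTS:
        msg = EmailMessage()
        msg["From"] = SENDER
        msg["To"] = addr  # exactly one recipient per message
        msg["Subject"] = "Important update"
        msg.set_content("Notice text goes here.")
        smtp.send_message(msg)
```

A bulk email service or mail-merge tool achieves the same separation at scale; the design choice in either case is to remove the possibility of a single mis-addressed message disclosing the whole distribution list.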

John Edwards, the Information Commissioner, expressed concern over the breach, stating, “This deeply regrettable data breach let down those to whom our country owes so much.” He emphasised that the consequences of data breaches could be life-threatening, underscoring the need for robust data protection measures, especially when dealing with vulnerable populations. The fine, reduced from £1,000,000 to £350,000, took into account the MoD’s cooperation with the ICO, remedial actions taken, and the challenges faced by the ARAP team.

Austrian Data Protection Authority fines controller €5,900 for breach notification failures

The Austrian Data Protection Authority (‘ADPA’) has levied a fine of €5,900 against a controller for inadequacies in reporting a personal data breach, accompanied by a failure to cooperate effectively during the investigation process. The incident in question occurred on 6 March 2023, when a ransomware attack targeted the IT infrastructure of the controller’s company, resulting in the encryption of data. Despite the attack happening in early March, the managing director only notified the ADPA on 24 April, well over a month later.

Under Art. 33(1) GDPR, controllers must report such incidents to the supervisory authority without undue delay and, where feasible, within 72 hours of becoming aware of the breach. The delayed notification, however, was only the beginning of the controller’s shortcomings. The notification provided to the ADPA lacked essential details mandated by Art. 33(3) GDPR, including specifics about the nature of the breach, the categories of affected individuals, and the measures taken to mitigate its impact. Furthermore, when the ADPA sought supplementary information, the controller’s responses were unclear and uncooperative. Despite repeated requests for clarification, the controller failed to provide adequate explanations, with one response consisting of nothing more than an empty email with a previous statement attached.

The ADPA emphasised that the controller’s notification appeared designed to serve the purpose of satisfying insurance requirements rather than fulfilling the obligation to provide a comprehensive report for assessment by the ADPA. In addition to the breach notification failures, the controller neglected to conduct a risk assessment to evaluate the impact on the rights and freedoms of the affected data subjects, as required by Art. 34 GDPR.

Greek data protection regulator imposes fine on controller for DPO’s non-response to questionnaire

The Greek Data Protection Authority (‘HDPA’) has recently fined a controller €5,000 following the failure of its Data Protection Officer (‘DPO’) to respond to a questionnaire sent by the authority. In 2023, as part of an initiative led by the European Data Protection Board (‘EDPB’), the HDPA, alongside other EDPB members, began examining ‘the designation and position of the data protection officer.’ As part of this assessment, the EDPB developed a questionnaire, adopted by the HDPA, which was subsequently distributed to 31 public bodies in Greece, including the Municipality of Athens, the entity in question.

Despite multiple attempts by the HDPA to elicit a response, the DPO of the Municipality of Athens failed to submit the questionnaire within the specified deadlines. Even after numerous extensions of time, efforts to resolve technical issues, and the summoning of the controller to appear before the HDPA, the response remained outstanding.

The HDPA, after thorough investigation, determined that the controller’s actions constituted a breach of its obligations under Art. 31 GDPR, which mandates cooperation with supervisory authorities. The authority concluded that the explanations provided by the controller regarding technical issues were invalid, as the submission link had been operational within the given timeframe. In response to the breach, the HDPA imposed a fine of €5,000, aiming both to restore compliance and to deter future violations.

Italian health service provider fined for biological data breach

In a recent decision, the Italian data protection regulator (‘IDPA’) fined a local health service provider €18,000 for the loss of biological data, highlighting serious lapses in data processing operations and security measures. The case originated from a complaint filed by a data subject against the local health unit, citing the loss and unlawful destruction of genetic biological data contained in histological slides stored in medical records. The IDPA found the health service provider, the data controller, in violation of several key provisions of the GDPR.

Firstly, the IDPA emphasised that the health service provider failed to adhere to the accountability principle outlined in Art. 5(2) GDPR. The provider was unable to adequately demonstrate its data processing operations, particularly its decision-making on the deletion or destruction of biological samples after the minimum 10-year storage period set out in the guidelines of the Italian Ministry of Health. Moreover, the lack of traceability across the processing stages hindered the identification of the recipients within the Medical Records Office.

Secondly, the IDPA found breaches of Arts. 5(1)(f) and 32 GDPR, which mandate the implementation of appropriate technical and organisational measures to ensure the security of data processing. The health service provider’s failure to take into account pending legal proceedings, where the samples would have been evidence, led to the destruction of stored personal data that should have been maintained.

Addressing the controller’s argument that human tissue samples should not be classified as biometric data, the IDPA firmly stated that, irrespective of this classification, the samples fell under special categories of personal data under Art. 9 GDPR. The biological materials contained information referring to the identity of individuals and revealed details about health care services, categorising them as health data under Art. 4(15) GDPR.

Record fine imposed on Open Bank in Spain for data protection violations

The Spanish data protection regulator (‘AEPD’) has imposed a substantial €2.5 million fine on Open Bank for serious breaches of data protection regulations. The case originated from a complaint filed by a data subject with the Bavarian Data Protection Authority on 5 August 2021. The data subject alleged that Open Bank, in its efforts to comply with anti-money laundering regulations, had compelled them to provide proof of the origin of funds in their bank account. The crucial issue arose when Open Bank failed to offer a secure method for the data subject to submit this information, resorting instead to unencrypted emails.

The AEPD, acting as the lead supervisory authority due to Open Bank’s headquarters in Spain, found Open Bank in violation of Art. 25 GDPR. The controller was criticised for an incomplete data protection impact assessment (‘DPIA’) that lacked consideration for personal data processing related to anti-money laundering verifications. This failure resulted in the absence of necessary technical and organisational measures to ensure data protection compliance.

The AEPD stressed that having protocols or templates alone was insufficient for compliance with data protection principles. Merely conducting mandatory DPIAs did not meet the requirements of privacy by design outlined in Art. 25 GDPR. Open Bank failed to communicate alternative secure methods to clients and did not implement remedial actions promptly, aggravating the violation. Furthermore, the AEPD held Open Bank in breach of Art. 32 GDPR, emphasising that standard email communication did not provide an adequate level of security for transmitting personal financial data. Consequently, the AEPD imposed a hefty fine of €2.5 million on Open Bank, comprising €1.5 million for the violation of Art. 25 GDPR and €1 million for the breach of Art. 32 GDPR.

Spain halts Worldcoin’s orb-scanning project amid data privacy concerns

In further Spanish news, Spain’s data protection regulator, the AEPD, has taken action against Sam Altman’s Worldcoin cryptocurrency project, demanding an immediate cessation of the collection of personal data through eyeball-scanning ‘orbs.’ The regulator expressed particular concern about the company gathering information about minors, prompting Spain to become the first European jurisdiction to act against Worldcoin.

The AEPD issued a precautionary measure, giving Worldcoin 72 hours to demonstrate compliance with the order. AEPD director Mar España Martí emphasised the need for coordinated action across European countries, asserting that the issue affects citizens throughout the European Union. Worldcoin has faced controversy globally for offering its cryptocurrency tokens in exchange for individuals consenting to have their eyes scanned by the orb. The scans serve as a means of identification, aiming to create a reliable mechanism to distinguish between humans and machines, especially as artificial intelligence advances.

Spain’s crackdown is the latest setback for Worldcoin, which has encountered challenges in various jurisdictions. The cryptocurrency token has avoided launching in the US due to regulatory concerns, while it remains unavailable in major markets like China and India. Kenyan authorities ordered the shutdown of Worldcoin’s operations, and the UK’s Information Commissioner’s Office has previously expressed its intention to investigate the project.

The Worldcoin project gained attention in Spain, where queues formed at shopping centres offering cryptocurrency in exchange for eyeball scans. The AEPD had previously issued a warning about the GDPR’s biometric data protection rules and the need for a risk assessment related to Worldcoin’s eye-scanning technology. España Martí underscored concerns about Worldcoin’s compliance with biometric data laws, emphasising the necessity for users to receive adequate information about data usage and their right to erase it. Sharing biometric data, she warned, exposes individuals to risks such as identity fraud, breaches of health privacy, and discrimination.