23 February 2024

Data Blast: UK government calls for views on cyber governance, House of Lords questions facial recognition technology, European Commission initiates formal proceedings against TikTok, and more!

UK government calls for views on cyber governance code of practice

In line with the £2.6 billion National Cyber Strategy, aimed at fortifying the UK’s online landscape, the Department for Science, Innovation and Technology (‘DSIT’) has issued a call for views on a proposed Cyber Governance Code of Practice (‘CGCoP’). Recognising the importance of C-Suite engagement on cyber security, this initiative is designed to empower directors and business leaders to navigate cyber threats effectively. The code seeks to streamline critical areas that leaders must address, simplifying the understanding of necessary actions to manage cyber security risks.

The proposed CGCoP is open for feedback from stakeholders across various sectors until March 19th 2024. The focus of this call for views is to ensure that the guidance on cyber risks is communicated in a straightforward manner and can be practically implemented.

Despite existing regulatory requirements and available guidance, many organisations grapple with the complexity of the cyber landscape and seek additional resources that illustrate ‘what good looks like.’ This sentiment has been echoed strongly in the government’s engagements on governing cyber risk over the past year with various organisations, including auditors and industry bodies.

Recognising that there is no one-size-fits-all approach to governing business risks like cyber risk, the government has proposed the Cyber Governance Code of Practice. This code aims to consolidate critical governance areas for directors in a simple, accessible format, applicable to organisations of all sizes. It would formalise the government’s expectations of directors for governing cyber risk, treating it similarly to other significant business risks.

The call for views centres on three key areas:

  1. Design of the Cyber Governance Code of Practice: Soliciting feedback on the clarity and practicality of actions outlined for directors to govern cyber risk and identifying areas where additional guidance may be required for effective implementation.
  2. Promotion and uptake of the Code: Seeking insights on strategic placement and promotion of the Code to ensure it reaches directors and becomes an integral aspect of their risk management knowledge. Exploring the potential role of other bodies in supporting Code implementation.
  3. Assurance Process for the Code of Practice: Assessing the demand for an assurance mechanism to support Code implementation, exploring who might benefit from an independently assured ‘badge,’ and evaluating associated risks.

The full Call for Views document with the proposed Cyber Governance Code of Practice is available here.

German Higher Administrative Court rules University of Hamburg’s student election data processing violates GDPR

In a recent appellate decision, a German Higher Administrative Court ruled that the University of Hamburg’s processing of personal data related to the students’ parliament election failed to comply with Art. 5(1)(c) of the GDPR.

A candidate for the student parliament election at the University of Hamburg contested the rejection of his candidacy after refusing to provide his date of birth on the candidacy registration form, which was required under the Hamburg University Election Act (‘Wahlordnung,’ or ‘WahlO’). The university argued that collecting the candidates’ date of birth was necessary to identify them unambiguously and verify their legal age. The Administrative Court of Hamburg initially held that the data subject’s incomplete form justified the rejection, prompting an appeal to the Higher Administrative Court of Hamburg (‘OVG Hamburg’).

The OVG Hamburg first established that the GDPR applied to the case, as the date of birth constituted personal data, and its use in the election process qualified as data processing. The court determined that the only legal basis for this processing could be Art. 6(1)(e) GDPR, as it was performed in the public interest, not Art. 6(1)(c) GDPR (legal obligation), as the WahlO merely constitutes an internal regulation of the University of Hamburg.

Assessing the necessity of processing the date of birth, the court invoked the data minimisation principle under Art. 5(1)(c) GDPR, emphasising that personal data should be limited to what is strictly necessary. Following CJEU judgment C-175/20, the court applied a strict standard in determining necessity.

The OVG Hamburg concluded that the university’s requirement for candidates to provide their date of birth, as mandated by s. 6(5) WahlO, was not in accordance with Art. 6(1)(e) GDPR. The court found that the purpose of unambiguously identifying candidates could be achieved without processing their date of birth, rendering the obligation excessive and against the principle of data minimisation.

As a result, the OVG Hamburg overturned the previous decision, instructing the university to accept the data subject’s candidacy and include him on the electoral list. This decision serves as a significant interpretation of GDPR principles in the context of university elections, reinforcing the importance of considering fully what personal data is strictly necessary for stated purposes.
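The court’s data-minimisation reasoning can be illustrated with a minimal sketch: a registration handler that retains only the fields assumed (purely for illustration, these names are not drawn from the WahlO) to be strictly necessary to identify a candidate, silently discarding superfluous data such as a date of birth.

```python
# Hypothetical sketch of data minimisation (Art. 5(1)(c) GDPR):
# collect and store only fields strictly necessary for the stated purpose.
# The field names below are illustrative assumptions, not the WahlO's.

NECESSARY_FIELDS = {"name", "matriculation_number"}

def register_candidate(submission: dict) -> dict:
    """Keep only necessary fields; reject if any are missing."""
    missing = NECESSARY_FIELDS - submission.keys()
    if missing:
        raise ValueError(f"missing required fields: {sorted(missing)}")
    # Superfluous data (e.g. date_of_birth) is simply never stored.
    return {field: submission[field] for field in NECESSARY_FIELDS}

record = register_candidate({
    "name": "A. Example",
    "matriculation_number": "1234567",
    "date_of_birth": "2000-01-01",  # supplied but not retained
})
```

The point of the sketch is the court’s strict necessity test: if the purpose (unambiguous identification) is achievable without a data point, that data point must not be processed at all.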

Legal concerns mount as UK lawmakers question facial recognition technology

Recent developments in the United Kingdom have brought to light growing concern among members of the House of Lords regarding the legality of live facial recognition (‘LFR’) technology used by law enforcement agencies. The Justice and Home Affairs Committee has warned of the absence of a clear legal foundation for the deployment of LFR and is urging the government to introduce legislation for parliamentary scrutiny.

LFR technology, employed by police in England and Wales since at least 2015, lacks specific legislation governing its use, unlike other biometrics such as fingerprints and DNA profiles. The House of Lords committee expressed deep concerns about the expansion of LFR without proper scrutiny and accountability, and has called on the government to enact legislation that would regulate the technology, providing a basis for parliamentary approval.

Aside from legal concerns, questions about the accuracy of LFR have been raised. An independent study funded by the Metropolitan Police found that 81% of individuals flagged as matches by the system were incorrectly identified. Civil liberties activists worry about the significant surveillance capability of LFR, with its potential to trawl through CCTV feeds and identify individuals in public without robust safeguards in place.
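To unpack what an 81% inaccuracy figure means in practice, it is best read as a false discovery rate: of the matches the system flags, roughly four in five turn out to be the wrong person. A minimal sketch, using illustrative counts rather than the study’s actual data:

```python
# Illustrative false discovery rate for a face-matching system:
# the share of flagged "matches" that turn out to be wrong.
# These counts are assumptions for illustration, not the study's figures.

flagged_matches = 42    # alerts raised by the system
correct_matches = 8     # alerts later verified as the right person

false_discovery_rate = (flagged_matches - correct_matches) / flagged_matches
print(f"{false_discovery_rate:.0%} of flagged matches were inaccurate")
```

Note that this metric says nothing about the people the system never flags; a system can have a high false discovery rate even while missing most of the individuals it is looking for.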

The UK government’s proposed Data Protection and Digital Information Bill has sparked controversy, including concerns over the removal of safeguards around biometrics and public space surveillance. The bill would abolish the role of independent oversight previously held by the Biometrics Commissioner and replace it with a Forensic Information Database Strategy Board. This has raised concerns of a potential lack of independence due to the appointment process for board members, as well as the Secretary of State’s ability to change oversight using statutory instruments, bypassing parliamentary approval.

Alarming revelation: 127 million financial records potentially breached in the last five years

A recent disclosure under the Freedom of Information Act (‘FOIA’) by the UK Information Commissioner’s Office (‘ICO’) has revealed that over the last five years, an estimated 127 million data subjects may have been affected by personal data breaches involving economic or financial data. This startling revelation, based on notifications made to the ICO by data controllers, raises concerns about the widespread vulnerability of financial personal data in the UK.

Under Art. 33 of the UK GDPR, controllers are required to notify the ICO of a ‘personal data breach’ unless it is deemed unlikely to result in a risk to individuals’ rights and freedoms. Despite occasional concerns expressed by the ICO about potential over-reporting, a significant number of notifications are still made. In response to a FOIA request, the ICO disclosed that, since October 1st 2019, there have been notifications regarding 127,147,851 data subjects involving financial and economic data.
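Taken at face value, the disclosed total implies a substantial yearly average. A back-of-the-envelope calculation over the disclosure window, taking 1 October 2019 as the start and (as an assumption for illustration) the date of this bulletin as the cut-off:

```python
from datetime import date

# Back-of-the-envelope average of reported affected data subjects per year.
total_subjects = 127_147_851       # figure disclosed by the ICO
period_start = date(2019, 10, 1)   # start of the disclosure window
period_end = date(2024, 2, 23)     # assumed cut-off, for illustration only

years = (period_end - period_start).days / 365.25
print(f"~{total_subjects / years / 1e6:.0f} million subjects per year")
```

On those assumptions the window is roughly 4.4 years, averaging out at around 29 million affected data subjects per year, although individuals affected by multiple breaches will be counted more than once.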

It is important to note that the occurrence of a ‘personal data breach’ does not necessarily equate to a breach of legal obligations; a breach can occur even when the data controller has complied with all its obligations. Additionally, the term ‘involving economic or financial data’ used in the disclosure is derived from the ICO’s reporting form, providing examples such as credit card numbers or bank details; this term, however, is not explicitly defined in the data protection laws.

The revelation of 127 million financial records potentially breached over the last five years underscores the critical need for a heightened focus on data security measures, and highlights the timeliness of the government’s proposed Cyber Governance Code of Practice (see above).

European Commission initiates formal proceedings against TikTok for potential digital services act violations

The European Commission has formally launched proceedings to investigate whether TikTok, a designated Very Large Online Platform (‘VLOP’), may have violated the Digital Services Act (‘DSA’) in areas concerning the protection of minors, advertising transparency, data access for researchers, and the management of addictive design and harmful content. This article provides an overview of the proceedings and the key areas of focus for the investigation.

The Commission’s investigation will centre on the following key areas:

Systemic risks and behavioural addictions

Assessing TikTok’s compliance with DSA obligations related to the assessment and mitigation of systemic risks, specifically focusing on potential negative effects stemming from the design of TikTok’s system, including algorithmic systems that may induce behavioural addictions or create ‘rabbit hole effects.’

Examining the effectiveness of mitigation measures, including age verification tools, to prevent access by minors to inappropriate content.

Privacy, safety, and security for minors

Evaluating TikTok’s adherence to DSA obligations in implementing appropriate and proportionate measures to ensure a high level of privacy, safety, and security for minors, especially concerning default privacy settings within recommender systems.

Advertising transparency

Scrutinising TikTok’s compliance with DSA obligations to provide a searchable and reliable repository for advertisements presented on its platform.

Data access and scrutiny

Investigating TikTok’s efforts to enhance the transparency of its platform, with a specific focus on potential shortcomings in providing researchers with access to publicly accessible data, as mandated by Article 40 of the DSA.

Next steps include gathering evidence through requests for information, interviews, or inspections. The Commission has the authority to take enforcement steps, including interim measures and non-compliance decisions. TikTok may propose commitments to address the concerns raised during the proceedings, which the Commission may accept.

As the investigation progresses, TikTok will be closely monitored for compliance with DSA obligations. The duration of the inquiry will depend on various factors, including case complexity and the cooperation of the company. The European Commission’s actions highlight the increasingly complex regulatory landscape for digital services providers. In the UK, TikTok could face similar scrutiny of its compliance with data protection laws and with the Online Safety Act.

Legal crackdown on Lockbit: NCA and FBI target Russia-linked hackers behind Royal Mail cyber attack

In a significant development, the UK National Crime Agency (‘NCA’) and the American Federal Bureau of Investigation (‘FBI’) have taken coordinated action against the Russia-linked cyber gang Lockbit, responsible for last year’s cyber attack on the Royal Mail. The law enforcement breakthrough comes over a year after Lockbit attempted to extort £66 million in cryptocurrency from the Royal Mail and disrupted its international delivery service for weeks.

Lockbit is a criminal gang that specialises in deploying ransomware to compromise victims’ IT systems by encrypting data, and then demanding payment for its release. In the case of the Royal Mail, the gang sought a ransom of £66 million in cryptocurrency; when the Royal Mail refused to pay, Lockbit proceeded to leak some stolen files, although most were deemed innocuous and lacked sensitive or customer information.

The consequences of the cyber-attack were severe for the Royal Mail (and its customers), reportedly causing an expenditure of £10 million for repairs and upgrades to its IT systems, and disrupting Royal Mail’s international deliveries for several weeks.

This week, Lockbit’s website was taken down, replaced by a notice indicating that it is now “under the control of law enforcement.” The NCA, working in collaboration with the FBI and an international law enforcement task force under Operation Cronos, has taken charge of the situation. The joint effort involved UK authorities, Europol, multiple European police agencies, and forces from Australia, Japan, and Canada.

Lockbit’s reach extended beyond the Royal Mail, with its ransomware having targeted other high-profile entities, including semiconductor manufacturer TSMC, aerospace company Boeing, and the Industrial and Commercial Bank of China, one of the world’s largest lenders. The cyber gang’s activities highlight the global nature of cyber threats and the need for international cooperation in addressing these challenges.

Electric car charger withdrawn over national security concerns

In a move aimed at safeguarding the national grid from potential cyber threats, the UK’s Office for Product Safety and Standards has compelled electric vehicle charger company Wallbox to withdraw its Copper SB model from sale. The withdrawal followed concerns that this model of charger could be exploited by hostile foreign hackers in a way that could pose a significant risk to the UK’s critical infrastructure.

The concerns centred on the adequacy of the cybersecurity protections of the Wallbox Copper SB charger, which is Internet-enabled and can be controlled via a smartphone app. Cybersecurity experts had warned that, because the chargers’ protections fell below the mandated regulatory requirements, they could be exploited by hostile nations to orchestrate large-scale attacks on the National Grid, potentially leading to electricity blackouts. The UK’s intelligence, security and cyber agency, GCHQ, has previously warned about foreign hackers targeting critical infrastructure, including the electricity grid. Despite the regulatory intervention, Wallbox is permitted to continue selling the charger until the end of June 2024.

Whilst noting that there is no evidence of a specific flaw in the Copper SB product, Wallbox informed its customers that the device does not comply with electric vehicle charger regulations enacted in 2021. This latest regulatory intervention serves as a reminder of the increasingly complex landscape of rules for networked devices, and recalls earlier headline cybersecurity events posing risks to critical infrastructure, including those addressed in our past commentary here and here.