Data Blast: UK government publishes plans for post-Brexit reform of UK GDPR…

See below for the latest Data Blast from our legal team: UK government publishes plans for post-Brexit reform of UK GDPR; English court decision sparks concerns over use of CCTV and video doorbells for home security; UK ICO addresses emergency data sharing by universities; South Korea nears approval for EU data transfers…
UK GDPR to be overhauled in planned reforms
The Department for Digital, Culture, Media and Sport (DCMS) has launched a major consultation on proposed changes to the UK’s data protection regime. The government is keen to stress that in the post-Brexit environment, it is not seeking to ‘water down’ the UK version of the GDPR, but instead it wants to “create a new world-leading data regime that unleashes the power of data across the economy and society.” The consultation and proposed reforms are wide-reaching, but some of the major proposals include:
- The elimination of Article 22 concerning automated decision making, and instead permitting the use of solely automated AI systems on the basis of legitimate interests or public interest;
- Introducing a list of recognised ‘legitimate interests’ which allow for data processing, removing the need for a balancing test to be undertaken;
- Clarifying when processing of personal data is necessary for reasons of substantial public interest (a lawful basis for processing special category data);
- Introducing clearer tests to determine if personal data can be regarded as anonymous;
- Expanding the ability of non-commercial entities, including political parties and charities, to send email and text communications by relying on the ‘soft opt-in’ consent of individuals who have engaged with those organisations in the past;
- Clarifying the ability of public and private bodies to use health data for scientific and medical research;
- Reducing barriers to trade by using alternative transfer mechanisms, for example by establishing adequacy regulations for groups of countries, regions and multilateral frameworks which have shared, harmonised or common data protection rules;
- Permitting the use of analytics cookies and other similar technologies without requiring the consent of users to reduce excessive ‘cookie pop-up’ notices; and
- A fee regime applicable to individuals to address the time and costs incurred by organisations in handling subject access requests (SARs).
DCMS states that it expects that it will be possible for the UK to retain its EU adequacy decision on the basis that “European data adequacy does not mean verbatim equivalence of laws.”
As part of the proposed changes, the government is also seeking to reform the structure of the Information Commissioner’s Office (ICO), including establishing an independent board and a chief executive, so that it resembles the governance of other regulatory bodies such as the Financial Conduct Authority (FCA) and the Competition and Markets Authority (CMA).
In its response to the consultation, the ICO states that it welcomes many aspects of the DCMS proposals, for example making it easier to use, share and re-purpose data for research, and doing more to tackle unsolicited direct marketing calls and fraudulent calls. The ICO has also said that it welcomes the proposal to introduce a more commonly used regulatory governance model for the ICO, while noting some concerns around accountability and independence in light of the proposal to allow the Secretary of State to approve ICO guidance and to appoint its chief executive.
It remains to be seen which of the government’s proposals will ultimately be implemented. The consultation remains open until 19 November 2021, and we will provide further updates in due course.
The DCMS consultation paper can be read here.
The ICO response to the consultation paper can be read here.
County Court judgment against homeowner for intrusive home surveillance systems attracts widespread misreporting on data protection aspects of the case
A judge sitting in the Oxford County Court issued a 49-page decision on 12 October 2021 in which she held that the defendant’s use of home security technologies, including a Ring doorbell with a video camera and microphone, amounted to harassment and to a breach of the Data Protection Act 2018 and the UK GDPR.
The judgment was widely reported following an initial misleading headline in a tabloid publication suggesting that the claimant was likely to receive a £100,000 damages award due to the defendant’s breach of data protection law (the decision notably contains no mention of the damages which might be awarded following further submissions).
In contrast to those headlines, the judgment of HHJ Melissa Clarke (who also sits as a judge in the Intellectual Property Enterprise Court) reveals an extreme set of circumstances which was held to amount to a sustained campaign of harassment against the claimant; this included attempted intimidation by alleging that a video of the claimant had been passed to the police (among whom the defendant also claimed to have ‘friends’). The defendant’s sustained surveillance activities were also held to be unfair processing of the claimant’s personal data, in breach of data protection law. Essential to the context of the judgment is the judge’s pointed finding, early on, that the defendant’s ‘evidence was dishonest, exaggerated or otherwise incredible.’
There was no dispute that the defendant had processed the claimant’s personal data by capturing her image on CCTV cameras installed at his property but covering an area beyond its boundaries. Similarly, the Ring doorbell was configured to capture sound beyond the defendant’s property, including on the claimant’s property.
The judge considered whether the processing of the claimant’s personal data by the defendant’s security cameras and doorbell was justified by his legitimate interests in securing his property. In relation to incidental images captured by cameras focussed principally on the defendant’s property, the judge held that the claimant’s own privacy interests did not outweigh the defendant’s interest in securing his property. Where the defendant’s cameras were configured such that they would capture more than ‘incidental’ images of the claimant, the defendant could not rely on his legitimate interests as a legal basis for capturing and storing such images; those uses breached data protection law.
The judge then focussed on the principle of data minimisation, which requires that personal data processed ‘shall be adequate, relevant and limited to what is necessary in relation to the purposes for which they are processed.’ The judge concluded that the defendant’s security cameras, including his Ring doorbell camera, served principally to capture video images, and that their security function could be served without audio recording also being enabled. Accordingly, the capture of audio from beyond the defendant’s property amounted to unfair processing and a breach of data protection law.
It is surprising that the judgment contains no mention of article 2(2)(a) of the UK GDPR, which provides that personal data processed for purely personal or household purposes falls outside the scope of the regulation. The absence of any mention of the exemption is perhaps due to the defendant having conceded before trial that his processing of the claimant’s personal data fell outside the bounds of the exemption.
Far from being a decision that places every homeowner operating a Ring doorbell or other security device in breach of data protection law, the judgment highlights that the processing of personal data for domestic purposes must not exceed what is genuinely necessary to maintain the security of one’s home.
ICO aims to raise universities’ awareness of sharing personal data in an emergency
Earlier this month, and just in time for the new academic year, the ICO published a blog entitled “Sharing personal data in an emergency – a guide for universities and colleges.” The blog stresses that concerns over data protection should never prevent educational institutions from sharing personal data in emergency situations.
The blog reminds readers that data protection legislation allows controllers to share personal data in emergencies in order to prevent loss of life or serious physical, emotional or mental harm, and urges institutions to take whatever steps are necessary and proportionate to protect students in an emergency. The blog outlines 5 practical steps for universities and colleges to consider in order to facilitate the sharing of personal data in an emergency:
- Put an emergency plan in place;
- Consider what types of personal data may need to be shared in an emergency, and how it can be shared securely;
- Implement data sharing agreements where there is a need for institutions to share students’ data on a more frequent basis;
- Train staff on how to handle personal information in an emergency; and
- Make full use of the ICO’s data sharing resources and guidance, including the data sharing code of practice and the data sharing information hub.
The ICO states that by ‘busting data sharing myths’ it can work with universities, educational bodies and parents to reassure them that data protection law enables data sharing to save lives and protect young people.
The ICO’s blog can be read here.
EDPB adopts opinion on draft South Korea adequacy decision
The European Data Protection Board (EDPB) has adopted a favourable opinion on the European Commission’s draft adequacy decision for South Korea, noting that South Korea provides a level of data protection essentially equivalent to that of the GDPR. The GDPR applies to controllers and processors within the EEA and restricts transfers of personal data to third countries unless adequate safeguards, such as standard contractual clauses, are in place. However, the European Commission (EC) has the power to determine whether a third country offers an adequate level of data protection and to issue an adequacy decision.
In adopting its opinion on the EC’s adequacy decision, the EDPB noted that there are key areas of alignment between the EU and South Korean data protection frameworks, such as in relation to the concepts of data processing; the grounds for lawful processing and transparency; and data retention, security and confidentiality.
However, despite its broadly positive opinion, the EDPB also concluded that certain aspects require further clarification and/or assessment before the EC adopts the adequacy decision. For example, the EDPB has asked for clarification concerning the concept of pseudonymisation and the various exemptions that apply when processing pseudonymised information (i.e. in relation to individual data subject rights and data retention). The EDPB has also called for further assessment of the impact of the limited grounds under Korean law for withdrawing consent.
Once the EDPB’s outstanding concerns have been addressed, Member States will vote on whether to finally adopt the decision. To date, the EC has recognised the following jurisdictions as offering an adequate level of protection: Andorra, Argentina, Canada, Faroe Islands, Guernsey, Israel, Isle of Man, Japan, Jersey, New Zealand, Switzerland, Uruguay, and the UK.
The EDPB’s opinion can be read here.
Council admits vehicle registration plate recognition software ‘blooper’ and sets aside driver’s fine
In what ultimately proved to be an amusing blunder by Somerset council CCTV cameras, a vehicle owner living in Surrey, about 120 miles from Bath, received a penalty notice for having driven into that city’s central congestion charge zone.
The penalty notice was accompanied by images of the ‘offence’ recorded by Bath’s CCTV; rather than showing the recipient’s vehicle entering the congestion charge zone, the image showed a woman wearing a Covid face covering walking in the road in a t-shirt with the word ‘KNITTER’ emblazoned on its front. The council’s congestion charging software incorrectly interpreted the lettering on the pedestrian’s top as the vehicle registration plate ‘KN19TER.’
It was reported that the automated fining decision was confirmed by an initial human review; the council later admitted this was an error and set aside the fine.
For more information please contact Partner, James Tumbridge at jtumbridge@vennershipley.co.uk