19 September 2023

Data Blast: Hungary fines municipality for vaccine survey, French MP challenges EU-US Data Agreement, Enforcement begins on EU Digital Services Act, Irish DPC fines TikTok, and more!

Hungary fines municipality for vaccine survey

The Hungarian data protection regulator (‘HDPA’) has fined a local municipality for unlawfully surveying the vaccination status of its residents. The survey in question was not carried out under the instructions of the central government in Budapest.

The data controller, the municipality of Pestszentlőrinc‑Pestszentimre, aimed to create anonymised statistics upon which to base its pandemic decisions. The data controller lawfully obtained the names and postal addresses of all adults in the municipality. However, the questionnaire sent to residents was found to breach the GDPR: it did not make clear that answering the survey was optional, including the provision of the recipient’s name, telephone number and email address, and it did not state that this data would be processed alongside sensitive health data (Covid vaccination status). The final issue was that the data subjects were only able to consent to the processing (consent being the only legal basis upon which the municipality could rely) after the questionnaire had been completed and the data submitted to the data controller.

The HDPA found that the municipality was an independent controller under Art. 4(7) of the GDPR, and that it failed to provide adequate information about the data processing to the data subjects, in contravention of Arts. 12(1) and 14 GDPR. Processing personal data which the data subjects did not know was optional to provide amounted to excessive processing under Arts. 5(1)(b) and 5(1)(c) GDPR. Because the optionally provided data had been mixed with the health data, the data subjects’ consent no longer covered the processing, and the municipality therefore had no legal basis for it. Accordingly, the HDPA fined the municipality 3 million Hungarian Forint (c. EUR 7,800).

Spanish Media Company fined for use of private Instagram photograph

The Spanish data protection regulator (‘SDPA’) has fined a media company for posting a photograph of a rival journalist on a blog, along with her name and age, without her consent. The photograph had been posted on the journalist’s private Instagram page, which could only be accessed by people she authorised to do so.

When the journalist became aware of the blog post, she complained to the SDPA. The data controller (the owner of the blog) sought to argue that the photograph was not personal data, because it only showed the legs of a woman, and the data subject had not proved that they were her legs (ignoring that the photograph was clearly stated to be the legs of the complainant journalist). It also tried to argue that this was a question of freedom of expression/freedom of the press and that the intent was to criticise the other journalist for her trivialisation of the profession. However, the controller did admit that it took the photograph from her private Instagram account.

The SDPA found that a picture of someone (even just their legs) may constitute personal data under Art. 4(1) GDPR. In this case, the data subject was identified by name and age, as well as by the photograph. Taking the photograph from the data subject’s private Instagram page was not an act of freedom of expression, but was rather an unauthorised publication of personal data. Accordingly, the SDPA fined the controller EUR 20,000 under Art. 6(1) GDPR.

French MP Challenges EU-US Data Agreement before EU Court

Philippe Latombe, a member of the French Parliament, has announced that he is challenging the recent transatlantic data deal in the European Union’s General Court, a constituent court of the Court of Justice of the European Union (‘CJEU’) which hears actions brought against institutions of the European Union by individuals and member states.

This challenge has come sooner than some might have expected, less than two months after the European Commission and US government agreed a deal on personal data transfers from the EU to the US (the ‘Privacy Framework’). The CJEU had struck down the previous data transfer agreement, the Privacy Shield, in 2020, following a second successful legal challenge by privacy activist Max Schrems, whom many had expected to be the first to bring a legal challenge against the Privacy Framework.

In his statement regarding his challenge to the new regime, Latombe said: ‘The text resulting from these negotiations violates the Union’s Charter of Fundamental Rights, due to insufficient guarantees of respect for private and family life with regard to bulk collection of personal data, and the General Data Protection Regulation.’

Latombe has in fact filed two separate challenges – one to suspend the Privacy Framework immediately (similar to an interim injunction) and another to invalidate the Privacy Framework’s text itself. The threat of an immediate suspension of the Privacy Framework will come as a disappointment to businesses seeking certainty with regard to transfers of personal data between the EU and the US, particularly those which have already taken steps to self-certify under the Privacy Framework.

UK Government releases interim report on the Governance of AI

The House of Commons Science, Innovation and Technology Committee has released its ninth report, an interim report on the governance of artificial intelligence (available here). The report highlights the following challenges for AI governance as it relates to personal data, which lawmakers must seek to meet when designing AI regulation:

  • The Privacy challenge: AI can allow individuals to be identified and personal information about them to be used in ways beyond what the public wants.
  • The Misrepresentation challenge: AI can allow the generation of material that deliberately misrepresents someone’s behaviour, opinions or character.
  • The Access to Data challenge: The most powerful AI needs very large datasets, which are held by few organisations.
  • The Black Box challenge: Some AI models and tools cannot explain why they produce a particular result, which is a challenge to transparency requirements.
  • The Liability challenge: If AI models and tools are used by third parties to do harm, policy must establish whether developers or providers of the technology bear any liability for harms done.
  • The International Coordination challenge: AI is a global technology, and the development of governance frameworks to regulate its uses must be an international undertaking.

Data protection is a central concern for the regulation of AI: many algorithms are trained on personal data, and many AI applications are directed at individuals, both of which engage data protection law. The UK is eager to play a leading role in AI regulation; in July 2022, the government first published its plan for an innovation-friendly approach to AI regulation (which we covered here), setting out an approach characterised as distinctly ‘light touch’ in contrast to the EU’s proposed AI Act.

Local Council able to deny Freedom of Information Act Requests where release of information could lead to or assist criminal activity

South Tyneside Council (the ‘Council’) received a Freedom of Information Act (‘FOIA’) request to reveal: which areas of their Town Hall were open to public access; which were restricted; who made those decisions; and when. Due to the passage of time, the Council no longer had the information about who made the decisions or when. However, the Council did have the floor plans that set out which parts of the Town Hall were open to the public.

The Council refused to release the floor plans, relying on s31 of the FOIA, which states that information likely to prejudice the prevention of a crime is ‘exempt information’ and therefore does not have to be released to the public. The Council also relied on s38, which states that information that risks the health and safety of any individual is exempt information.

The complainant (i.e. the person who made the FOIA request) appealed to the Information Commissioner’s Office (‘ICO’). The ICO sided with the Council, having been persuaded by the Council’s arguments that:

‘disclosing the floor plans of the town hall, which show public and restricted areas, and entry and exit points, would weaken its security and make its staff, and any members of the public on the premises, more vulnerable to crime.’

The Council explained to the ICO how knowledge of the layout of the Town Hall could be used for criminal purposes, and was able to provide examples of past instances when its security had been breached by people entering the Town Hall for antisocial or criminal purposes, which could have been even more damaging, had the perpetrators had perfect knowledge of the layout of the Town Hall.

The ICO case officer had to weigh opposing public interests in this case: the general presumption that information about the public sector should be available to the public, against the increased risk to the Council’s staff should that information be made public. The case officer concluded that:

‘the very real risk that the floor plans would be misused, and of the risk to people’s safety and security that this would represent, the Commissioner is satisfied in this case that the public interest in maintaining the exemption provided by section 31(1)(a), clearly outweighs the public interest in disclosure.’

Accordingly, the case officer agreed that the Council was entitled to rely on s31 of the FOIA to refuse to disclose the floor plans. In the light of that decision, it was unnecessary to consider the s38 arguments.

Enforcement begins on EU’s Digital Services Act

Since late August, some of the largest tech companies have had to comply with the EU’s Digital Services Act (‘DSA’). Meta, Google, and X (formerly Twitter), amongst others (a total of 19 so-called ‘Very Large Online Platforms’ or ‘VLOPs’), are required to submit annual risk assessments to the European Commission and to take steps to limit the spread of illegal content. Among other obligations, they must also have user-friendly terms and conditions, offer users an option to receive recommendations that are not based on profiling, and allow researchers to access platform data where the research contributes to the detection, identification and understanding of systemic risks in the EU.

Much work remains for the European Commission before the DSA can take full effect; in particular, there are a number of legal uncertainties still to be addressed before the new rules can be fully enforced. The Commission has the challenging task of agreeing cooperation arrangements with national regulators, EU agencies (including Europol and the newly created European Centre for Algorithmic Transparency (‘ECAT’)) and competence centres by February 24th 2024, the date from which smaller companies will also have to comply with the DSA.

There is also uncertainty around some parts of the DSA, with Amazon and Zalando objecting to their classification as VLOPs; the Commission is reported to be planning to address these uncertainties by adopting delegated acts and secondary legislation in the near future. A VLOP is currently defined as a platform with more than 45 million monthly active users in the EU. VLOPs face stricter rules than smaller companies and, if they do not comply with the DSA, can be fined up to 6% of worldwide turnover. Both Amazon and Zalando are currently challenging the European Commission’s classification of them as VLOPs before the EU’s General Court. We will provide updates as those cases develop.

Irish Regulator announces EUR 345 million fine for TikTok

The Irish Data Protection Commission (‘DPC’) adopted a final decision in its inquiry into TikTok Technology Limited (‘TikTok’), which provides the social media platform across the EU.

The scope of the inquiry was to examine how well TikTok had complied with its obligations under the GDPR in relation to its processing of personal data relating to child users of the platform in the context of:

  1. Certain TikTok platform settings, including public-by-default settings as well as the settings associated with the ‘Family Pairing’ feature; and
  2. Age verification as part of the registration process.

Having concluded its investigation, the DPC submitted a draft decision to all concerned supervisory authorities (i.e. other EU data protection regulators) in September 2022, as required by Art. 60(3) GDPR. There was broad consensus on the proposed findings of infringement, but objections were also raised by the data protection regulators in Italy and Germany, the latter of which wished to add a finding of ‘unfairness’ under Art. 5(1)(a) GDPR.

The DPC was unable to reach a consensus with those regulators, and therefore referred the objections to the European Data Protection Board (‘EDPB’) for a binding determination under Art. 65 GDPR. The EDPB directed that the DPC must amend the draft decision to include a finding of unfairness, as requested by the German regulators (those in Berlin and Baden-Württemberg), and extend the scope of the existing order to bring processing into compliance with that section of the GDPR as well.

The final decision, adopted on September 1st 2023, includes the following:

  • A reprimand;
  • An order requiring TikTok to bring its processing into compliance with the GDPR by taking specified actions within a period of three months from the date on which the DPC’s decision is notified to TikTok; and
  • Administrative fines totalling EUR 345 million.

TikTok has said that it is considering its position; we shall provide an update if the DPC decision is appealed.