Data Blast: Italy Bans ChatGPT, Texas Approves Privacy Law, Further EU/US Pitfalls, ICO Investigates Use of Meta Pixel in Healthcare, and More
AI and Privacy: The Italian Regulator stops ChatGPT and Replika
As many news outlets have reported, the Italian data protection regulator (the ‘GPDP’) has banned ChatGPT until it can demonstrate compliance with privacy law. The ban comes at a time when how to regulate AI is a major focus for legislators and regulators alike.
ChatGPT stands for ‘Chat Generative Pre-Trained Transformer.’ It was developed by OpenAI and according to its developer, ChatGPT is ‘an artificial intelligence trained to assist with a variety of tasks.’ It is a large language model (‘LLM’), a type of AI software capable of processing and simulating human-like conversations.
OpenAI, which has no office in the EU but has designated a GDPR representative in the EEA, must inform the GPDP within 20 days of the measures undertaken to implement the GPDP’s requirements; otherwise it risks a fine of up to EUR 20m or up to 4% of annual global turnover.
In the same wave of investigations, the GPDP also banned the chatbot ‘Replika’ for violating the GDPR by failing to respect the principle of transparency and by unlawfully processing the personal data of minors. In particular, the processing could not be based, even implicitly, on a contract that the minors in question were legally unable to enter into.
Replika is a software application with a written and vocal interface which uses AI to generate a ‘virtual friend.’ The GPDP found that the app presents concrete risks for minors, including giving them answers that are unsuitable for their level of development and often clearly in breach of the enhanced protections that must be ensured for minors and other vulnerable data subjects. Replika also had no age verification mechanism: to open an account, a user only needed to provide a name, e-mail address and gender.
As with OpenAI, Luka Inc. (the US developer of Replika) must cease processing the data of Italian data subjects and must inform the GPDP within 20 days of the measures taken to conform to the GPDP’s requirements. Failing to do so risks a fine of up to EUR 20m or up to 4% of annual global turnover.
Chinese International Data Transfers: An Update
Earlier this year, the Cyberspace Administration of China (‘CAC’) released the final version of the Measures for the Standard Contract for Outbound Cross-border Transfer of Personal Data (‘Measures’) and of the Standard Contract itself, which will come into force on June 1st 2023.
The Standard Contract sets out the contractual clauses to be entered into between the data controller transferring personal information out of China and the overseas recipient.
The Standard Contract can be adopted only if all of the following criteria are met:
- The controller is not a critical information infrastructure operator;
- The controller processes the personal data of less than one million individuals;
- Since 1 January of the previous year, the controller has cumulatively provided overseas recipients with the personal data of fewer than 100,000 individuals and the sensitive personal data of fewer than 10,000 individuals.
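Taken together, the three criteria amount to a simple threshold check. A minimal sketch of that logic in Python (the function name, signature and simplified thresholds are our own illustration, not an official test; the actual assessment involves further legal nuance):

```python
def standard_contract_available(
    is_ciio: bool,
    individuals_processed: int,
    individuals_transferred: int,
    sensitive_individuals_transferred: int,
) -> bool:
    """Rough eligibility check for the CAC Standard Contract route,
    based on the three criteria summarised above. Illustrative only:
    transfer counts are measured since 1 January of the previous year."""
    return (
        not is_ciio                                # not a critical information infrastructure operator
        and individuals_processed < 1_000_000      # processes data of fewer than 1m individuals
        and individuals_transferred < 100_000      # fewer than 100k individuals' data sent overseas
        and sensitive_individuals_transferred < 10_000  # fewer than 10k individuals' sensitive data
    )

# Example: a non-CIIO controller within all thresholds
print(standard_contract_available(False, 500_000, 50_000, 5_000))  # True
```

A controller that fails any one of the checks would need to rely on one of the other transfer mechanisms listed below.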
One of the following conditions must be met by the controller before personal information may be transferred out from China:
- Passing the security assessment organised by the CAC;
- Obtaining a personal information protection certification from a specialised institution approved by the CAC;
- Entering into the Standard Contract with the overseas recipient; or
- Meeting other specific conditions provided in laws or administrative regulations or by the CAC.
Further, the Measures specifically require that a data controller collecting data in China and transferring it out of China must file the Standard Contract for record with the CAC within 10 working days of its effective date, and that adoption of the Standard Contract must be accompanied by a personal data protection impact assessment.
It is expected that further rules and guidelines will be published by the CAC.
Texas Approves Comprehensive Privacy Law
The Texas House and Senate have both approved the state’s comprehensive privacy law, which will cover its nearly 30 million consumers. The law can potentially have extra-territorial effect, as it applies to any company that conducts business in Texas or with Texas residents, sells, shares, or transfers personal data, and is not defined as a ‘small business’ by the United States Small Business Administration. The law will come into force on July 1st, 2024.
The Texan privacy law requires businesses under its remit to comply with numerous privacy and security requirements, including:
- Obtaining consent from individuals before collecting or processing their personal information. Note that, unlike in some other jurisdictions, consent can be obtained through a variety of methods, including opt-in, opt-out, and implied consent;
- Implementing reasonable security measures to protect personal information from unauthorised access, use, or disclosure;
- Providing individuals with access to their personal information and the ability to correct or delete it, responding to requests to access, amend and delete such data within 45 days; and
- Notifying affected individuals within 72 hours of discovering a data breach.
Unlike California, but in line with Colorado, Connecticut, and Virginia, the law does not give individual Texans a private right of action; enforcement will be undertaken by the Texas Attorney General.
Further EU Regulators Crack Down on Transfers to the US
Following the huge fine imposed on Meta Ireland, reported in our last data blast, several other EU data protection authorities have recently taken steps to ban transfers of personal data from their EU Member State to the US.
The Belgian regulator has banned the transfer of tax data of US citizens resident in Belgium from Belgium to the US. It found that the agreement signed between Belgium and the US under the US Foreign Account Tax Compliance Act (‘FATCA’) was not in line with the EU GDPR, and the Belgian tax authority ought to have conducted an impact assessment on it.
The Belgian FATCA agreement required Belgian banks to inform the US authorities if a US citizen living in Belgium held a Belgian bank account. One data subject, who had been informed that his bank was legally obliged to disclose to the tax authorities that he held an account, together with his name, address, jurisdiction of residence, tax identification number, date of birth, account balance, account number and other information relating to his bank assets, filed a complaint with the Belgian regulator, together with the Association of Accidental Americans, requesting that the transfer of data between the Belgian and US authorities cease.
The Belgian tax authority relied on Article 96 GDPR, given that the FATCA agreement predated the GDPR. The Belgian regulator found that Article 96 GDPR does not shield such processing from the GDPR; rather, actions taken under earlier international agreements must still be reviewed against the principles of the GDPR, including purpose limitation, data minimisation, and necessity. Following the decision in Schrems II, the regulator banned any further data processing pursuant to the Belgian FATCA agreement, finding that, without an adequacy decision and a framework for transfers to the US, there was no way such processing could comply with the GDPR.
Norway Bans Mass-Processing for Official Statistics
Statistics Norway (‘SSB’), Norway’s national statistics institute, instructed the main grocery chain stores in Norway (covering 99% of the Norwegian grocery market) to submit purchasing data on a regular basis, including:
- Name of item
- Price per item
- Total amount of the receipt
- Payment method
- Amount per payment method
- Start and end time of the purchase
- ID of returns
- ID for terminated purchase
- ID of offers/discounts
SSB wanted this data from the grocery stores’ point-of-sale systems so that it would receive the information in real time. While the purchasing data itself is not personal data, the plan was to link it with transactional data, which would allow purchases to be connected with specific individuals.
SSB claimed that this processing was lawful under the Norwegian Statistics Act’s duty to provide information, which states that ‘any person must provide the data that are necessary to develop, produce or disseminate official statistics if so ordered by Statistics Norway.’ SSB had undertaken data protection impact assessments with regard to this processing in January 2021 and between October 2021 and June 2022.
After one of the grocers complained to the regulator, the regulator examined the impact assessments. It found that information about nearly all grocery purchases by everyone in Norway would be collected and stored indefinitely, without the data subjects being able to exercise their rights (due to exceptions in the national regulations).
The regulator found that, despite its operational independence, SSB was part of the Norwegian state, as it is funded from the national budget. SSB therefore had to comply with the European Convention on Human Rights, including the right to privacy, and the real-time, continuous nature of this processing breached that right. Accordingly, the regulator found that, in this context, SSB could not rely on the Norwegian Statistics Act as a legal basis to process the purchasing data, and that doing so breached the GDPR. The impact assessments were inadequate, and SSB’s level of data protection expertise meant it should have done better. The regulator therefore banned SSB from further processing of purchase data taken in real time from point-of-sale systems.
UK ICO Investigating NHS Trust/Charity use of Meta Pixel
After an investigation by The Observer newspaper, it has come to light that 20 NHS Trusts and seven mental health charities used the Meta Pixel (a small piece of code that creates a transparent pixel on the webpage) to track usage on their respective websites. However, this tracking tool also transferred personal data about individuals (including sensitive health data) to Meta. Meta uses the data that the Pixel provides to improve its targeted advertising.
The Observer found that the quality of the privacy policies on the charities’ websites varied considerably; some did not even mention the use of the Meta Pixel. Even where it was mentioned, however, data transfer occurred as soon as the landing page loaded, before the user had a chance to decline cookies. When a user clicked a link referring to ‘support with abuse,’ ‘support with suicidal thoughts’ or ‘support with self-harm,’ the fact that they had clicked that link was matched with their IP address and often with their Facebook account ID, making them easily identifiable.
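The mechanics are simple: a tracking pixel is just an image or script request whose URL carries the address of the page being viewed, so it fires as soon as the page loads, regardless of any cookie banner, and delivers the visited URL alongside the requester’s IP address and any cookies already set. A minimal sketch of how such a request leaks page context (the endpoint and parameter names below are hypothetical, loosely modelled on common tracking-pixel query strings, not Meta’s actual API):

```python
from urllib.parse import urlencode

# Hypothetical third-party tracking endpoint (illustration only)
PIXEL_ENDPOINT = "https://tracker.example/tr"

def pixel_url(page_url: str, event: str = "PageView") -> str:
    """Build the 1x1-image URL a page would embed. The query string
    carries the full address of the page being viewed, so a visit to
    e.g. a 'support with self-harm' page is disclosed on load."""
    return PIXEL_ENDPOINT + "?" + urlencode({"ev": event, "dl": page_url})

print(pixel_url("https://charity.example/support-with-self-harm"))
```

Because the browser fetches this URL automatically when rendering the page, the disclosure happens before any consent dialog can be answered, which is exactly the behaviour The Observer described.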
All the NHS Trusts and most of the charities have now removed the Meta Pixel from their webpages. Facebook has stated that organisations should not use the Meta Pixel to collect or share sensitive data, and claims to have filters to weed out sensitive data it receives by mistake. However, past research into these filters suggests they do not always work, and Facebook itself has admitted that the system does not catch everything. The ICO has already opened a formal investigation into the NHS Trusts’ use of the Meta Pixel, and has ‘noted the new findings’ with regard to the charities’ use. A spokesperson for the ICO said:
‘Organisations must provide clear and comprehensive information to users when using cookies and similar technologies, especially where sensitive personal information is involved. We are continuing to review the findings and to investigate the potential extent of any personal data collected and shared with third parties via the use of pixels.’