23 August 2022

Data Blast: French Guidance on Cookie Walls; France Widens Fining Powers for CNIL and more…

See below for the latest Data Blast from our legal team: French Guidance on Cookie Walls; France Widens Fining Powers for CNIL; Remember to Restrict Access of Ex-Employees; Big Brother is Not Allowed to Watch you Have a Break; Action Needed When Users Unsubscribe; and ICO Revises Approach to Public Sector Enforcement…

French Guidance on Cookie Walls

As we mentioned in a previous data blast, the French data protection regulator (the CNIL) had attempted to exert a blanket ban on ‘cookie walls’ (i.e. requiring a user to agree to allow cookies to be downloaded on their computer to access a free website) through draft guidance. When using cookie walls, websites often give the consumer the option to pay a subscription fee (known as a ‘paywall’) to avoid the cookie wall so that the website owner is compensated for the loss of advertising revenue.

The Conseil d’État (the legal adviser of the executive branch and the highest court for administrative justice in France) held on June 19th 2020 that the CNIL could not create binding law by way of a ‘soft law’ instrument such as their draft guidance.

In the wake of the Conseil d’État ruling, the CNIL has now published preliminary criteria for the assessment of the legality of cookie walls. The criteria focus on the most commonly observed practices, but each case must be considered on its own facts.

1. Does the Internet user who refuses cookies have a fair alternative to access the content?

a) The publisher has to avoid creating an imbalance to the detriment of the internet user and, to this end, must ensure that the user can easily access this alternative. An imbalance could arise where the publisher has exclusivity over the relevant content/service, or where the internet user has few or no alternatives to the service and so has no real choice as to the use of cookies, e.g. in the case of dominant or essential service providers.

2. If there is an alternative paywall, is the price reasonable?

a) The fee must not be so high as to deprive internet users of real choice. The reasonable fee level should be set on a case-by-case basis, and the CNIL encourages publishers to be open about their analysis and the reasoning behind their pricing.

b) The structure of the payment must also be reasonable – sometimes micropayments from a virtual wallet could be more appropriate than a subscription requiring the registration of card data.

c) Where the user must set up an account, the purpose of that account must be specific and transparent – e.g. to allow the user to benefit from a subscription on multiple devices.

3. Can the cookie wall cover “all” cookies indiscriminately?

a) Users should be able to refuse cookies based on their purpose; if a user can only accept or refuse all cookies as a whole, this can undermine freedom of choice and the validity of consent.

4. The user chooses paid access without consenting to cookies: in what (limited) cases can cookies still be deposited?

a) In principle, no non-essential cookies should be placed when the internet user has opted for paid access. However, the publisher may request, on a case by case basis, that the customer accept cookies. Such cookies could include those needed to play content hosted on a third-party site, or for a service requested by the user.

The use of cookies in France must still comply with the GDPR. The cookies must therefore still have a legal basis (and it is not clear that consent will always be applicable), be subject to limited data retention periods, and benefit from proper safeguards if the data will be transferred abroad.

While the guidance of the CNIL (and any changes that are made to such by the Conseil d’État) only applies to France, it may well have a knock-on effect. Companies operating in multiple jurisdictions are unlikely to want to have cookie policies that vary massively between different states. It is much easier for such companies to set the bar for their cookies to that of the most stringent regulator and apply that across the board.

The current situation is that, while cookie walls are not wholly prohibited in France, their use requires careful prior assessment. Unfortunately, the CNIL preliminary criteria do not provide much certainty for websites looking to implement cookie walls. A number of points in the criteria will require further information, including the notion of a ‘fair price’ for paywalls, and which specific websites will be considered ‘dominant or essential service providers.’

France Widens Fining Powers for CNIL

Staying with France, the Conseil d’État last month ruled that the CNIL can impose fines outside the so-called OSS (‘One-Stop-Shop’) enforcement mechanism under the GDPR.

In late 2020, the CNIL issued Amazon a EUR 35 million fine for placing advertising cookies on the computers of its customers without consent or satisfactory information. The CNIL found that Amazon had breached Article 82 of the French Data Protection Act (the domestic legislation implementing the e-Privacy Directive). The Conseil’s ruling confirmed that the fine was appropriate, given the scale and seriousness of the breaches and Amazon’s financial position.

Commenting on this fine being outside the GDPR OSS, the judgment states that the CNIL is:

‘competent to sanction breaches of Article 82 of the French Data Protection Act, even in cases where the data controller is not established in France, but has an establishment on French territory involved in activities related to the processing carried out, in this case the promotion and marketing of advertising tools by the company Amazon Online France.’

Remember to Restrict the Access of Ex-Employees

The Danish DPA (‘DDPA’) has reprimanded a municipality for breaching Article 32(1) GDPR by not restricting a terminated employee’s access to its electronic case and document management system (‘SBSYS’).

In the summer of 2021, the DDPA conducted inspections of a number of municipalities, focussing on how they administered access rights to the personal data of children and young people, especially in relation to schooling. With regard to one municipality, it found that, while the municipality had a comprehensive information security handbook setting out procedures for access rights management, in one instance it had not followed its own procedures: the municipality did not routinely review whether terminated employees still had an active user account for SBSYS. The DDPA therefore had to assume that the user in question retained access to SBSYS even after their resignation as an employee.

The DDPA found that the data controller must always identify risks with regard to data processing and implement adequate security measures to protect data subjects against such risks. Such measures should ensure that only users with a work-related need can access SBSYS. In addition to a procedure for removing access for employees upon the termination of their employment, the DDPA found that there must be a control procedure to check that the access was in fact removed.

The DDPA found that, under Article 32(1) GDPR, the municipality did not take appropriate technical and organisational measures to ensure an appropriate level of security regarding the risk involved in processing the personal data. This is because, firstly, it did not remove access to its document management system from an employee after that employee resigned, and secondly, it did not carry out checks to ensure that ex-employees no longer had rights to access the system.

It is important for all businesses to have adequate policies in place when it comes to the termination of employees. However, it is just as important to ensure, and to be able to demonstrate, that those policies were actually put into practice, so that no breach of the GDPR occurs.  

Big Brother is Not Allowed to Watch You Have a Break

The Hungarian DPA (‘HDPA’) has fined a car repair shop HUF 500,000 (c. EUR 1,300) for failing properly to inform its employees about CCTV surveillance and for using such surveillance in areas intended for work breaks. The complaint came from a co-owner of the property, who also operated a car repair shop there. The complaint said that the other co-owner had installed six CCTV cameras in the shop and would neither remove them nor give information about their operation.

The data controller responded to the HDPA saying that the cameras were to protect the tools and valuable objects in the shop. It said that the cameras were not used for surveilling public areas. One of the cameras was in the kitchen; this area was not used for work, but did have the company’s safe deposit box. Another was in the office/customer waiting room, in which the controller claimed administrative work was carried out. That room also had the cash register, bankcard reader and cash desk.

The HDPA found that, as the controller had stated in its privacy policy that the legal basis for the data processing was not consent, it ought to have conducted an assessment of whether there was a legitimate interest for the use of cameras; its failure to do so was in breach of Article 6(1)(f) GDPR. However, the HDPA noted that the degree of danger of the work carried out and the high value of assets on the premises could justify the use of CCTV surveillance. The HDPA ordered the controller to amend the privacy policy to reference explicitly the legitimate interest of the controller. The HDPA also ordered the controller to remove references to irrelevant legislation and to legislation no longer in effect.

The HDPA also required the controller to change the angles of the cameras in the office/customer waiting room and the kitchen so that they do not result in unjustified surveillance of employees. This is in line with the principles of purpose limitation under Article 5(1)(b) GDPR and data minimisation under Article 5(1)(c) GDPR. In particular, the HDPA was concerned that the camera in the kitchen was pointed toward the dining table (rather than the safe deposit box), so it was not being used for the protection of individuals or property.

Finally, the HDPA required the controller to amend the privacy notice of the camera system in line with Article 13(1) GDPR and Article 13(2) GDPR. The controller had sent the HDPA a photograph showing that on May 8th 2021 it had placed a notice on the entrance to the building alerting entrants to the camera system, but as the system went live on March 5th 2021, the requirement of adequate information prior to processing was not met at the outset.

The level of fine took a number of factors into account. The aggravating factors included the fact that the infringement was still in progress during the investigation and that the infringement concerned fundamental privacy rights. However, the mitigating factors were that only two of the six cameras were facing problematic areas, the infringements only affected a limited number of people (six employees in total), the data subjects did not suffer specific harm or damage due to the infringement, there was nothing to suggest that the violation was intentional, and it was the controller’s first offence. The mitigation was for the most part successful, hence the extremely modest fine by data protection standards.

Action Needed When Users Unsubscribe

The Danish DPA has recommended a fine of DKK 1 million (GBP 115,350) against the largest publishing house in Denmark, Gyldendal, for failing to delete the personal data of c. 685,000 unsubscribed members of Gyldendal’s book clubs that had been stored for longer than needed.

The publisher had the data stored in a so-called ‘passive database’ and had no policy or procedures for deleting it. Had the DDPA not intervened, that data would never have been deleted. Some of the information had been stored in the database for more than 10 years.

The DDPA found that Gyldendal had kept the data for longer than necessary, in violation of the GDPR principles of storage limitation and accountability. In Denmark the DPA does not issue fines directly but gives recommendations to the police who then investigate and bring a charge, with the eventual fine being issued by a court.

ICO Revises Approach to Public Sector Enforcement

The ICO announced a new approach to fines involving the public sector, placing an emphasis on the Commissioner’s discretion to reduce the impact of fines on public bodies, on the basis that fines against public sector authorities effectively ‘just move tax-payer money around’. This discretion will be coupled with better engagement with public bodies, including publishing the lessons learned and sharing good practice from which other institutions can learn. This new approach is set to be trialled over the coming two years.

The ICO’s press release states that in practice this will mean fewer fines for public bodies, and more use of the ICO’s wider powers, including warnings, reprimands and enforcement notices, with fines reserved for the most serious cases.

When a fine is imposed, the ICO will publish the amount of the fine that would have been levied for a private company, so that others know what the level of wrongdoing was and the expected punishment. That amount is likely to be reduced for the public body.

An example of the scale of reduction of the fines for public bodies was given in the treatment of The Tavistock and Portman NHS Foundation Trust, which was found to have disclosed the email addresses of 1,781 adult gender identity patients by using the ‘cc’ field rather than the ‘bcc’ field. With the revised approach, the ICO reduced the fine from £784,800 to £78,400.

For more information please contact Partners, James Tumbridge at or Robert Peake at