24 September 2020

Data Blast: Salesforce and Oracle face class actions; UK firm fined £130K for marketing calls and more…

See below for the latest Data Blast from our legal team: Salesforce and Oracle face class actions; UK firm fined £130K for marketing calls; UK increases oversight of law enforcement data sharing with US; Portland adopts strictest US limits on facial recognition technology; Brazil's data protection law goes live…

Salesforce and Oracle face class action suits in the Netherlands and UK

On August 14th, privacy campaign group ‘the Privacy Collective’ announced plans to pursue class action suits against Salesforce and Oracle for alleged GDPR violations, including the aggregation of information collected from websites and the profiling of individuals, which could cost the two companies up to £10 billion in damages. One of the actions is to be launched in the Netherlands, with plans for a similar claim to be commenced as a collective action in the UK.

It appears the Privacy Collective’s claims are being underwritten by a litigation funder, Innsworth Advisors. Ian Garrard, managing director of Innsworth, said in a statement ‘the development of class action regimes in the UK and the availability of collective redress in the EU/EEA means Innsworth can put money to work enabling access to justice [for] millions of individuals whose personal data has been misused.’ The international law firm Cadwalader is acting in relation to the UK claim.

The Privacy Collective alleges that Oracle’s BlueKai platform and Salesforce DMP (formerly called Krux) misused consumers’ data by aggregating information collected from myriad websites in order to create profiles of individuals for the purpose of facilitating targeted advertising, all without a proper legal basis. It is also alleged that the firms deployed specially developed cookies in order to track individuals’ activities and create user profiles.

The issues raised in the claims are tied to the Real Time Bidding (RTB) process that underpins online and mobile advertising, which has increasingly come under the spotlight of privacy activists and regulators. We previously reported (here) on the Norwegian Consumer Council’s complaint concerning the dating app Grindr and the collection and sharing of sensitive personal data with advertising networks. The UK’s Information Commissioner has also previously stated her belief that the RTB process is not currently compliant with data protection law, though no enforcement action has been taken and the ICO announced in May 2020 that their inquiries into RTB were being paused indefinitely.

Oracle and Salesforce have disputed the claims by the Privacy Collective, and we will report further as formal responses are made before the courts.

UK ICO fines Welsh firm £130,000 for unauthorised pension marketing calls

On September 10th, the UK Information Commissioner’s Office (ICO) issued a £130,000 fine to Swansea-based CPS Advisory for making more than 100,000 unauthorised direct marketing calls to people regarding their pensions.

Amendments made in 2019 to the Privacy and Electronic Communications Regulations (PECR) stipulate that firms may only make live calls to individuals regarding occupational or personal pension schemes if they are authorised by the Financial Conduct Authority (FCA) or are a manager or trustee of a pension scheme. In either case, the recipient’s consent must have been given, or a prior relationship with the caller must be in place.

In a statement regarding the fine, the ICO said ‘[u]nwanted pension calls can cause real distress and even significant financial hardship to often vulnerable people… Businesses making direct marketing calls are responsible for understanding their responsibilities under the legislation, ignorance is no excuse.’

An ICO investigation revealed that between January 11th and April 30th 2019 CPS Advisory made 106,987 calls to people without the right to do so, as the firm was neither a trustee nor a manager of a pension scheme, and was not FCA authorised. In addition, the evidence showed that CPS Advisory had purchased the contact information used for the calls, and could not demonstrate that the individuals had provided consent to receive such calls. The ICO considered the calls to be a significant intrusion into the privacy of the recipients.

UK Statutory Instrument concerning data transfers to the United States comes into force

On September 18th, a statutory instrument came into force which expands the responsibilities of the UK’s Investigatory Powers Commissioner. The change follows the invalidation of the Privacy Shield by the Court of Justice of the European Union (CJEU) in the Schrems II decision this summer (previously covered here), and in the lead up to the end of the Brexit transition period.

Statutory Instrument 2020/1009 adds an obligation for the Investigatory Powers Commissioner to keep under review the compliance of public authorities with the terms of the UK-US Bilateral Data Access Agreement 2019, which is intended to facilitate data sharing for the purpose of combatting serious crime. This addition to the Commissioner’s duties appears to be in response to commentary by the CJEU in the Schrems II decision, and also the observations raised in an open letter by the European Data Protection Board (EDPB) in June concerning the likelihood of a post-Brexit adequacy decision for the UK (previously covered here). According to the EDPB, a main obstacle to a post-Brexit UK adequacy decision is whether safeguards in the Withdrawal Agreement covering personal data would apply to US law enforcement requests under the US-UK agreement.

As we have previously noted, under EU data protection law, foreign jurisdictions are subject to greater scrutiny than Member States when assessing whether data is adequately protected from inappropriate access by law enforcement bodies. It is that heightened level of scrutiny which the UK will face in seeking an adequacy decision from the European Commission, so that personal data can continue to flow from the EU without additional safeguards; SI 2020/1009 will be one element of the UK’s effort to secure adequacy.

Portland, Oregon becomes first US city to ban private use of facial recognition technology in public places

On September 9th, the city of Portland, Oregon adopted the most wide-reaching restrictions to date in the US on the use of facial recognition technology (FRT), banning private-sector use of FRT in public places within the city. The ban, which takes effect on January 1st 2021, was passed unanimously by the Portland City Council, which cited issues surrounding gender and racial bias in FRT and the over-surveillance of marginalised communities.

Under the ban, private entities may not use FRT in places of public accommodation within Portland, save where it is necessary (a) to comply with federal, state or local laws, (b) for user verification purposes when accessing a user’s personal or employer-issued communication device, or (c) for social media applications utilising automatic face detection services.

Importantly, the new law also creates a private right of action for damages arising from a private entity’s violation, calculated as either the damages actually sustained or US$1,000 per day for each day of the violation, whichever is greater. Portland City Council previously passed an ordinance banning city government and police use of FRT, similar to laws passed in Boston and San Francisco.

Brazilian data protection law signed into force

We previously reported (here) that the Brazilian senate had refused the Brazilian President’s proposed delay of the country’s data protection law until May 2021. On September 18th 2020, the President signed a bill bringing the country’s Data Protection Act into force, with retrospective effect from August 16th 2020. The sanctions provisions for breaches of the law will not take effect until August 21st 2021.

For more information please contact Partner James Tumbridge at