30 September 2018

Human error main cause of data breaches, British Airways hacked, Brazil adopts new rules

New data on self-reported breaches might surprise you

New statistics regarding self-reported UK data breaches indicate that human error is seven times more likely than hacking events to be the reported cause of a data breach.

Data released under the Freedom of Information Act indicates that of the 2,124 data breaches reported in the UK in 2017-18, only 292 were deemed to be cyber attacks; the remainder were attributed to human error. These statistics combine instances of groups self-reporting data breaches to the ICO, as well as macro data from annual reports. The most common breaches resulting from human error were sending data to the wrong recipient, either by email or by post, as well as loss or theft of paperwork containing personal data. 

The ICO has stated that self-reported data breaches in 2017-18 totalled 3,156, up almost a third from the previous year. It attributes this increase to the GDPR’s mandate to report serious breaches, as well as to greater awareness of what constitutes a data breach.

The analysis found that the average fine handed out by the ICO for breaching the 1998 Data Protection Act was £85,000. However, these fines are expected to increase under the 2018 Data Protection Act, as organisations may now be fined up to 4% of annual global revenue or €20 million, whichever is higher, for failure to comply with the new law.
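The “whichever is higher” rule can be illustrated with a short calculation (the revenue figures below are hypothetical, chosen only to show when each limb applies):

```python
def gdpr_max_fine(annual_global_revenue_eur: float) -> float:
    """Upper bound on a GDPR fine for the most serious infringements:
    the greater of EUR 20 million or 4% of annual global revenue."""
    return max(20_000_000.0, 0.04 * annual_global_revenue_eur)

# For a company with EUR 100m global revenue, 4% is only EUR 4m,
# so the EUR 20m floor applies.
print(gdpr_max_fine(100_000_000))    # 20000000.0

# For EUR 1bn revenue, the 4% figure (EUR 40m) is higher.
print(gdpr_max_fine(1_000_000_000))  # 40000000.0
```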

On top of self-reported incidents, the ICO is also tasked with investigating data subject complaints, the total number of which was greater than 21,000 during 2017-18. This is up from the 18,350 made during 2016-17.

BA suffers breach of customer financial data                                        

On September 7th, British Airways (BA) chief executive Alex Cruz announced that the airline had suffered a hack of customer personal and financial data. All told, roughly 380,000 customer transactions were affected.

The breach, which has reportedly been resolved, occurred between the evenings of August 21st and September 5th. BA is said to have notified the ICO and affected customers. Given that financial data was compromised, Cruz explained that “our number one purpose is contacting those customers that made those transactions to make sure they contact their credit card bank providers.”

Cruz noted that the breach was first discovered on the evening of September 5th, at which point BA began notifying affected customers and the relevant authorities. However, several customers have expressed concern over BA’s response, stating that despite making bookings in the timeframe outlined by BA, they were not contacted. The situation worsened for BA when it was announced that the airline now faces a £500m group action suit over the breach, brought on behalf of affected customers.

Cyber-security firm RiskIQ claims that the hack could have been caused by a “skimming” script, similar to the recent cyber attack on Ticketmaster’s website. Such attacks are becoming increasingly common, as large websites frequently source and embed code from third parties, into which malicious code may have been inserted.
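One widely used defence against tampered third-party scripts is Subresource Integrity (SRI), in which the embedding page pins a cryptographic hash of the expected script in the `integrity` attribute of the `<script>` tag; if the hosted file changes, the browser refuses to run it. A minimal sketch of generating such a value, using Python’s standard library (the script content here is a hypothetical stand-in, not BA’s or Ticketmaster’s actual code):

```python
import base64
import hashlib

def sri_hash(script_bytes: bytes) -> str:
    """Produce a Subresource Integrity value (sha384 digest,
    base64-encoded) suitable for a <script integrity="..."> attribute."""
    digest = hashlib.sha384(script_bytes).digest()
    return "sha384-" + base64.b64encode(digest).decode("ascii")

# Hypothetical third-party payment widget content:
content = b"console.log('payment widget');"
print(sri_hash(content))
```

If an attacker later injects a skimmer into the third party’s copy of the script, the recomputed hash no longer matches the pinned value and the script is blocked, which is precisely the class of compromise described above.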

The ICO has begun an inquiry into the breach, and it is possible BA could face a considerable fine, particularly in light of the ICO’s enhanced fining powers under the 2018 Data Protection Act. As one of the first high-profile investigations to fall entirely under the new data protection rules, the ICO’s approach will be closely watched as an indication of how data security will be enforced.

UK ICO confirms that students can request exam details under the GDPR

The Information Commissioner’s Office (ICO) has confirmed in guidance that under the General Data Protection Regulation (GDPR), students can now make Subject Access Requests (SARs) to obtain certain information concerning their exams for free from examination bodies. Such exams include GCSEs, A-levels, Scottish Highers and university exams. The information that students will be able to request includes examiners’ comments, marks and the minutes of appeals panel hearings. However, the ICO is keen to point out that the GDPR does not give students the right to receive copies of their marked papers, nor does it provide a right to request the re-marking of examination scripts.

It has been predicted that examination boards may be inundated with requests under the new rules. The Data Protection Act 2018 requires examination boards to provide the requested information within 40 days if exam results have already been published. It is therefore likely that in many cases students will not receive the requested information before the examination boards’ end-of-September deadlines for requesting a review of the marking.

Brazil adopts new data protection legislation with similar provisions to the GDPR   

The National Congress of Brazil has adopted new data protection legislation, the General Data Protection Law (LGPD), which comes into force in February 2020. The LGPD (Law 13,709/2018), which has many similarities to the EU’s GDPR, protects personal data regardless of how it is stored or collected.

Although the LGPD provides ten bases under which personal data may be lawfully processed, obtaining express consent will be the default requirement, with individuals being given an opportunity to opt-out. The other bases include performance of a lawful agreement, compliance with legal obligations, legitimate interests, and for the protection of life. Similar to the GDPR, there are further restrictions on the processing of ‘sensitive data’, which the law defines as personal data concerning racial or ethnic origin, religious or political beliefs, genetic or biometric data, or data relating to health or sex life.

As with the GDPR, the law also gives data subjects the right to access their personal information and to rectification of any inaccuracies. Sanctions for non-compliance with the new law include fines of up to 2 per cent of a company’s annual turnover, limited to 50 million Reals per individual breach. In contrast, fines under the GDPR are more severe at 20 million euros or 4 per cent of annual global turnover (whichever is greater).
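The contrast between the two regimes’ ceilings can be made concrete: unlike the GDPR, where €20 million acts as a floor, the LGPD’s 50 million Reals figure is a cap (turnover figures below are hypothetical):

```python
def lgpd_max_fine(annual_turnover_brl: float) -> float:
    """Upper bound on an LGPD fine: 2% of annual turnover,
    capped at BRL 50 million per infraction."""
    return min(50_000_000.0, 0.02 * annual_turnover_brl)

# 2% of BRL 1bn is BRL 20m, below the cap:
print(lgpd_max_fine(1_000_000_000))   # 20000000.0

# 2% of BRL 5bn would be BRL 100m, so the BRL 50m cap bites:
print(lgpd_max_fine(5_000_000_000))   # 50000000.0
```

The practical effect is that for very large companies the GDPR’s exposure keeps growing with turnover, while the LGPD’s is bounded per breach.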

The LGPD will not apply to inbound data collected abroad that is not shared with processors in Brazil. However, it will have extraterritorial effect and will apply to foreign entities if they process personal data collected in Brazil, or personal data related to individuals located in Brazil. The LGPD also limits the transfer of data outside of Brazil with cross-border data transfer mechanisms closely resembling those available under the GDPR.

As Brazil is the largest economy in South America, the introduction of the LGPD is sure to have a significant impact on business operations throughout Latin America and will almost certainly affect European organisations with Brazilian operations. Although companies already operating under the GDPR will be well placed, they should use the next 18 months to identify any gaps in internal policies and procedures to ensure they will be LGPD-compliant when the time comes.

No right to be forgotten for murderers

The European Court of Human Rights (ECtHR) has ruled that two half-brothers, Manfred Lauber and Wolfgang Werle, convicted 25 years ago of the murder of a famous German actor, do not have the right to require that their names be deleted from online reporting of the case.

The court’s decision in ML and WW v Germany represents the latest conflict between articles 8 and 10 of the European Convention on Human Rights as they relate to ‘the right to be forgotten.’ The judgement indicates a preference for freedom of expression (article 10) over the right to privacy (article 8), despite the opposite approach having been taken by the CJEU in Google Spain four years ago. In the ML decision, the court unanimously held that Lauber and Werle’s article 8 rights had not been violated.

The case has been ongoing for the past 10 years, as the half-brothers’ release date has approached. The 1990 murder and subsequent trial attracted considerable media attention, and the half-brothers, citing their article 8 privacy rights, brought proceedings against several media organisations demanding that archived documents be anonymised.   

The ECtHR was tasked with considering whether prior German court decisions took proper account of online search engines, and found that decisions about what information should be published in news reports were for journalists to make, provided they comply with the industry’s ethical norms. Furthermore, the ECtHR held that any obligation to re-assess a report’s lawfulness following a later individual request would risk the press failing to preserve archives, or omitting identifying information from the outset.

Questioning Local Councils’ Use of Data Analytics

We have previously reported on the use of artificial intelligence systems by police forces in determining matters such as offenders’ eligibility for certain early release initiatives. This week, it has been reported that local councils in the UK have been working with service providers to develop ‘predictive analytics’ systems which employ algorithms to identify families that might require the attention of the council in respect of child services. The report states that the data of at least 377,000 individuals has so far been incorporated into the predictive systems.

The range of data which councils are considering including in the systems covers school attendance and exclusion records, housing association data on repairs and arrears, and police records on anti-social behaviour and domestic violence. It is also reported that Brent Council is currently developing a system to predict vulnerability to gang exploitation.

At this early stage, the Information Commissioner’s Office has indicated that it will be asking questions of local councils reported to be using predictive analytics, in order to ensure that such uses comply with data protection law. We shall be monitoring developments and will report further in a future update.
