The 10 data privacy fails of the decade – and what we learnt from them

Today marks one of the most important days in the calendar for data professionals: Data Privacy Day!

As we enter the 2020s, let’s take a look back over the data privacy fails that shaped the previous decade – and what we learnt from them – so we can make the next 10 years a decade of privacy wins.

1. Data privacy fails happened in the most unexpected of places…

Imagine buying an app-controlled, Bluetooth-connected vibrator to spice up your love life when your partner isn’t in town. It’s all fun and games until you discover your partner hasn’t been controlling it... it’s actually been hacked by a total stranger.

Believe it or not, this actually happened in 2016, when it was discovered that anyone with a Bluetooth connection could hijack certain sex toys and control them because of their total lack of security protection. 

And if that’s not off-putting enough, it turned out the company was collecting and storing personal data gathered by the vibrator’s app – without its users’ consent. When paired with a toy, the app tracked its temperature and vibration intensity – so essentially, the company ended up with large data files detailing the exact sexual stimulation requirements of its customers.

There is definitely such a thing as too much information...

Vibrators are not the only unusual objects that were hacked over this past decade. In 2017, cybercriminals managed to hack into a casino in North America through its internet-connected fish tank!

The aquarium in the lobby was fitted with a smart thermometer to regulate the tank’s temperature. It was through this device that the hackers were able to exploit a vulnerability and gain a foothold in the network. Once there, they managed to access the high-roller database of gamblers and pull it back across the network, out through the thermometer, and up into the cloud. You could say they went fishing...

What have we learnt?

People should be able to buy things as personal as vibrators and as innocuous as fish tanks in safety. It’s simply astonishing that a vibrator was left so insecure when the risk of assault was so obvious. And it was even worse that the company was behaving so invasively as to capture such personal data without consent. While you could argue that the casino should have known better than to put a smart fish tank inside its security perimeter, the risk of attackers exploiting a vulnerability in one device to gain access to other systems has been well known for years, and the fish tank manufacturer simply should not have put its clients at such risk.

As the Internet of Things continues to grow, more devices will begin to come online, and these devices will come in many shapes and sizes. It’s crucial that the manufacturers of these devices follow a Privacy by Design model, and ensure that privacy and security are baked into products right from the start of the development lifecycle – not tacked on at the end. It’s far less hassle to think about data privacy at the beginning, and work it into a product, than to fix security flaws later down the line – if that’s even possible.

The adoption of IoT technology means cybercriminals can be more imaginative with their attacks, and these incidents are compelling reminders that IoT devices are vulnerable to being hacked or compromised. The problem often occurs when manufacturers focus solely on the performance and usability of IoT devices and ignore security measures and encryption mechanisms. Simple cybersecurity measures such as authentication (through OAuth, for example), secure storage, penetration tests, and regular audits should be standard for internet-connected devices.
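
To make that a little more concrete, here’s a minimal sketch of device-level authentication – a simple challenge-response check written in Python, using only the standard library. It’s an illustration under our own assumptions, not any vendor’s actual protocol, and every name in it is made up:

```python
import hashlib
import hmac
import os

# Hypothetical pairing key, provisioned securely when the controller app
# and the device first pair. All names here are invented for illustration.
SHARED_KEY = os.urandom(32)

class Device:
    """A connected device that only obeys authenticated commands."""

    def challenge(self) -> bytes:
        # A fresh random nonce per command stops old messages being replayed.
        self._nonce = os.urandom(16)
        return self._nonce

    def execute(self, command: bytes, mac: bytes) -> bool:
        # Run the command only if the sender proves knowledge of SHARED_KEY.
        expected = hmac.new(SHARED_KEY, self._nonce + command, hashlib.sha256).digest()
        if hmac.compare_digest(expected, mac):
            print(f"executing {command!r}")
            return True
        print("rejected: sender is not the paired controller")
        return False

class Controller:
    """The paired app, which signs each command against the device's nonce."""

    def sign(self, nonce: bytes, command: bytes) -> bytes:
        return hmac.new(SHARED_KEY, nonce + command, hashlib.sha256).digest()

device, controller = Device(), Controller()
nonce = device.challenge()
device.execute(b"vibrate:low", controller.sign(nonce, b"vibrate:low"))  # accepted
device.execute(b"vibrate:max", b"guessedmac" * 3)                       # rejected
```

Even a check this simple would have stopped a stranger with a Bluetooth radio from issuing commands – the 2016 toys accepted them from anyone.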

It’s also important for consumers to remember that any object, no matter how innocuous, that can connect to the internet has the potential to be hacked. Be vigilant, keep your operating systems and software up to date, use strong passwords, and, if at all possible, keep Internet of Things devices separated from important data.

2. The data privacy fail that stopped Harry from having his surname on his schoolbook…  

No-one wants to lose their identity, but an overenthusiastic reading of the GDPR in 2019 nearly led to just that. A primary school banned the use of children’s surnames on workbooks, in order to comply with (its perception of) the GDPR.

The bizarre situation led to a young boy, Harry Szlatoszlavek, being labelled ‘Harry2’ by his classmates. ‘Harry2’ even received a Christmas card from another boy which read: ‘To Harry2 from Jack2’ – and Harry’s mum claimed the school was trying to take away his identity.

The numeral was used so teachers could differentiate between Harry and another boy in his class, also called Harry. The school believed that the use of surnames on children’s workbooks would fall foul of the GDPR – if the books were taken out of the classroom during parents’ evenings, say – so it opted to ban surnames completely.

However, a spokesperson for the Information Commissioner’s Office set matters straight: ‘While data protection law gives special status to the data of children, it does not prevent a child’s full name being written on a schoolbook.’

What have we learnt?

Educating all employees on privacy regulations is important – especially if you regularly handle special category data or personal data that belongs to children. Mistakes can be just as bad when they cause organisations to take privacy too far as when they don’t go far enough.

It can also be helpful to have specialist support available to answer questions and check decisions are appropriate. Not all organisations need to have Data Protection Officers in place to do this, but every organisation should think about how it will get the right privacy advice when it needs it. As this case shows, identities can be lost in more ways than one. 

3. The data privacy fail that tried to ignore the ICO

With any data privacy incident, the one person you really, really don’t want to get on the wrong side of is the Information Commissioner. And ignoring her is a pretty good way to do just that. 

In May 2018, someone submitted a Subject Access Request (SAR) to Hudson Bay Finance. It was ignored. After about five months, having still not heard from the company, the person reported the incident to the ICO – which then tried to contact the company to get it to complete the SAR.

The ICO wrote several letters to Hudson Bay Finance, which came back marked ‘return to sender’. The company didn’t just ignore the letters, it actively sent them back – the audacity!

It also hung up the phone several times when the ICO tried to call to discuss the SAR, and eventually stopped answering altogether. Unsurprisingly, that did not go down well, and led to Hudson Bay Finance being named and shamed on the ICO’s website and served with an enforcement notice. Which I presume the company hasn’t ignored, seeing as failing to comply with one is a criminal offence...

What have we learnt?

Ignoring the ICO was never going to be a good idea.

SARs can be a real headache and complicated to complete – especially if you handle a lot of personal data. However, this is all the more reason to make sure you have the correct processes in place so you can handle them effectively. 

It’s equally important to train all of your staff so they can recognise a SAR and respond accordingly. Additionally, once your protocols are in place, it’s a good idea to get an external company in that can test them on a regular basis to make sure your processes work. SARs are often connected with a complaint or a concern and so they should be taken very seriously.  
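
To make ‘the correct processes’ a little more concrete, here’s a minimal sketch of a SAR deadline tracker in Python. It assumes the GDPR’s usual timescales – one calendar month to respond, extendable by a further two months for complex requests – and the class and field names are our own invention, not any particular tool’s:

```python
from dataclasses import dataclass
from datetime import date

def add_months(d: date, months: int) -> date:
    """Add calendar months, clamping to the last day of shorter months."""
    month_index = d.month - 1 + months
    year, month = d.year + month_index // 12, month_index % 12 + 1
    for day in (d.day, 30, 29, 28):  # e.g. 31 Jan + 1 month -> 28/29 Feb
        try:
            return date(year, month, day)
        except ValueError:
            continue

@dataclass
class SubjectAccessRequest:
    requester: str
    received: date
    extended: bool = False  # complex requests may be extended by two months

    @property
    def deadline(self) -> date:
        return add_months(self.received, 3 if self.extended else 1)

    def is_overdue(self, today: date) -> bool:
        return today > self.deadline

# Roughly the Hudson Bay Finance timeline: received in May, chased in October.
sar = SubjectAccessRequest("J. Doe", received=date(2018, 5, 14))
print(sar.deadline)                        # 2018-06-14
print(sar.is_overdue(date(2018, 10, 14)))  # True -- four months overdue
```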

4. The data privacy fail which didn’t fully understand the concept of “anonymising”… 

Would you say your mobile number is a pretty good identifier of you? A Danish taxi company didn’t think so. In the autumn of 2018, Taxa 4x35 was fined DKK 1.2 million for ‘anonymising’ personal data – which sounds unexpected, until you find out its method of ‘anonymising’ was to keep people’s personal phone numbers and simply erase their names. This meant information on a customer’s taxi trips (including location histories) could still be traced back to them via the phone number, and the data remained identifiable for three years longer than necessary.

What have we learnt?

Data only becomes anonymous when it can no longer be associated with an individual, even in combination with other data. That is very difficult to achieve in practice and Taxa 4x35 is not the only company to confuse ‘anonymous’ with ‘pseudonymous’ data.
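
The difference is easy to demonstrate in code. In this sketch (with entirely invented data and field names), deleting the name but keeping the phone number leaves a join key behind, so anyone holding a contact list can re-identify the ‘anonymised’ trips:

```python
from collections import Counter

# Invented trip records, "anonymised" by deleting the name only --
# pseudonymisation at best, because the phone number remains a key.
trips = [
    {"phone": "+45 1234 5678", "pickup": "Kastrup Airport", "dropoff": "Nørrebro"},
    {"phone": "+45 8765 4321", "pickup": "Tivoli", "dropoff": "Østerbro"},
]

# Any other dataset keyed on phone numbers re-identifies the trips instantly.
contacts = {"+45 1234 5678": "Anders Andersen"}

for trip in trips:
    name = contacts.get(trip["phone"])
    if name:
        print(f"{name} travelled from {trip['pickup']} to {trip['dropoff']}")

# Truly anonymous data severs the link to individuals entirely --
# for example, keeping only aggregate route counts:
route_counts = Counter((t["pickup"], t["dropoff"]) for t in trips)
print(route_counts)
```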

Taxa 4x35 had also fallen into the trap of publishing a privacy notice that complied with the law, but did not reflect its actual practices. While it had correctly identified appropriate retention periods for the data, its systems did not allow it to erase all personal data at the end of those periods. Combined with the failure to fully understand the requirements for ‘anonymous’ data, this led it to pay lip service to privacy instead of complying with the law and keeping its customers safe.

5. The data privacy fail that spied on football fans…

What happens on a lads’ day at the football, stays at the football, right?

Well, not if you had the La Liga app. In the summer of 2019, La Liga was fined €250,000 for spying on its app’s users. While the app did ask for permission to use the microphone and track users’ locations, it did not explain that these functions would be turned on remotely – as often as once a minute – in order to establish whether users were watching matches in locations that did not have the correct licences to show them.

The company wanted to make sure sports bars weren't showing matches by using cheaper home cable or TV packages, rather than more expensive commercial subscriptions. It was certainly a devious method for keeping an eye out for pirates, but as the Spanish regulator ruled, it was also a huge breach of privacy.

What have we learnt?

Companies need to be able to demonstrate that the processing they do is necessary, proportionate and balanced against the impact on individuals. In this case, it is hard to see how spying on users simply to protect La Liga’s revenues could meet these tests. 

Under the GDPR, consent needs to be freely given, specific, informed and unambiguous. In this case, whilst the company did ask for permission to use the microphone, it wasn’t clear about its intentions or how it planned to use the access – so consent was not informed.

When collecting and managing any form of personal data, you need to ensure your privacy notices detail exactly what you intend to do with it and how you will use it – without resorting to lengthy jargon that will put your users off reading them.
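
One way to keep consent specific and informed is to record it per purpose, stated in plain language, rather than as a single blanket opt-in. Here’s a minimal sketch of what that might look like – the names and structures are hypothetical, not a reference implementation:

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass(frozen=True)
class ConsentRecord:
    user_id: str
    purpose: str        # exactly what the data will be used for, in plain words
    granted: bool
    recorded_at: datetime

ledger: list[ConsentRecord] = []

def record_consent(user_id: str, purpose: str, granted: bool) -> None:
    ledger.append(ConsentRecord(user_id, purpose, granted,
                                datetime.now(timezone.utc)))

def has_consent(user_id: str, purpose: str) -> bool:
    """The most recent decision for this exact purpose wins; the default is no."""
    for record in reversed(ledger):
        if record.user_id == user_id and record.purpose == purpose:
            return record.granted
    return False

record_consent("u1", "Use the microphone to detect which match is playing, "
                     "sampled up to once a minute while the app is open", True)

# Consent to one stated purpose does not cover a different, unstated one.
print(has_consent("u1", "Share recorded audio with third parties"))  # False
```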

If you don’t specifically state how you intend to use the data, it’s usually because you know what you’re doing is a breach of privacy. We might speculate that La Liga didn’t specifically tell users how often it was going to access the microphone because most people don’t like the idea of a stranger listening in on their private conversations in the pub. If you can’t be honest with your customers about your practices, you probably shouldn’t be doing them.

6. The data privacy fail that left special category data out for all to see…

Imagine a total stranger stumbling onto your medical history because it’s been dumped outside a pharmacy for all to see. That could have happened to you if you got your pills from Doorstep Dispensaree. The pharmacy was found to have dumped approximately 500,000 documents in unlocked crates, disposal bags and a cardboard box in a rear courtyard of its premises.

These documents included names, addresses, dates of birth, NHS numbers, medical and prescription information. Unsurprisingly, the ICO fined the pharmacy £275,000 for its negligent handling.

What have we learnt?

It’s important to keep in mind that data isn’t always digital, and you should take just as much care of your paper documents as your digital files. In the case of Doorstep Dispensaree, not only were the paper files accessible to anyone who could get into the courtyard, many of them were badly water-damaged – meaning information that Doorstep Dispensaree itself needed may no longer have been available. In the case of health records, that could have serious consequences.

If you haven’t already, create a paper document destruction policy and make sure that it is completely and clearly explained to staff, as well as emphasising why they need to follow it. 

7. The data privacy fail that came from the inside…

If you work at oil giant Shell, you probably don’t want your details to get into the hands of people who are completely opposed to the company’s activities. 

In 2010, a very disgruntled former employee of Shell emailed a database containing the contact details of 170,000 Shell workers to campaign groups opposed to the company’s activities, including Greenpeace and the anti-Shell campaign website royaldutchshellplc.

The database contained names, telephone numbers and further details on permanent and contract employees and could have had serious consequences for those individuals.

What have we learnt?

You can build a wall, set up perimeter defences, and spend huge sums of money maintaining it all. But if your enemy is within, that wall is not going to help. So how do you fight a rogue employee?

Prevention is always better than cure. You should start by considering what practical steps you can take in connection with disgruntled employees, as part of any wider ongoing data compliance programme. Access controls preventing individuals from downloading large data sets or accessing data without a valid business reason can make these types of breaches much more difficult. 
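
As an illustration of what such a control might look like, here’s a hypothetical export gate that blocks bulk downloads unless a business reason has been approved in advance. The threshold and every name in it are ours, not any product’s API:

```python
# Hypothetical export gate: bulk extracts need a pre-approved business reason.
BULK_THRESHOLD = 1_000  # records; tune to what each role legitimately needs

approved_extracts = {
    ("analyst.a", "Q3 payroll reconciliation"),
}

def export_records(user: str, reason: str, record_count: int) -> bool:
    if record_count <= BULK_THRESHOLD:
        return True  # routine access within normal working limits
    if (user, reason) in approved_extracts:
        return True  # large extract, but pre-approved for a stated purpose
    # Deny and flag for review: a 170,000-record pull with no approval is
    # exactly the pattern the Shell leak would have shown.
    print(f"DENIED and flagged: {user} tried to export {record_count} records")
    return False

export_records("analyst.a", "Q3 payroll reconciliation", 5_000)  # allowed
export_records("ex.employee", "none given", 170_000)             # denied
```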

However, a skilled and motivated employee can circumvent even the best controls. Data Loss Prevention technologies are a good way to stay on top of everything even if you do suffer a data breach through an unhappy employee. Platforms such as BreachTrak allow you to keep an eye on your data even when it leaves the building and moves beyond your direct control, meaning you can monitor your employees’ data usage for suspicious activities and act immediately if you spot any red flags. 

8. The data privacy fail that revealed a little too much…

We can’t talk about data breaches without referencing the breach that caused more red faces than most in the past decade...

Ashley Madison, as it was then called, was a website that famously helped married people have affairs. This obviously didn’t sit well with quite a lot of people – including a group of hackers. In 2015, the organisation became a target and suffered a massive security breach that exposed over 300 GB of user data, including users’ real names, banking data, credit card transactions, and secret sexual fantasies. A cheating spouse’s worst nightmare.

What have we learnt?

Ashley Madison knew it had huge amounts of very sensitive personal data and that keeping it secret was critical in order for its business to work. It should also have realised that its business model put it at enhanced risk of targeted attacks. As a result, its risk assessment should have told it that it needed leading edge cybersecurity practices in order to protect its users’ privacy.  

That was not what happened. Ashley Madison’s use of the MD5 hashing algorithm to protect users’ passwords was an obvious error, but it wasn’t the only mistake the company made. The subsequent audit revealed that the entire platform had serious security problems, which had not been fixed because they were the work of a previous development team.
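
The MD5 point is worth spelling out. MD5 is fast and, used on its own, unsalted – identical passwords produce identical hashes, and a leaked database can be attacked at enormous speed. Here’s a minimal sketch of the difference using only Python’s standard library; PBKDF2 stands in for whichever modern, salted, deliberately slow scheme a team chooses, and the iteration count is illustrative rather than a recommendation:

```python
import hashlib
import hmac
import os

password = b"correct horse battery staple"

# A fast, unsalted hash of the kind Ashley Madison was criticised for:
# commodity hardware can test billions of MD5 candidates per second.
weak = hashlib.md5(password).hexdigest()

# A salted, deliberately slow key-derivation function. The per-user salt
# defeats precomputed tables; the iteration count slows brute force.
salt = os.urandom(16)
stored = hashlib.pbkdf2_hmac("sha256", password, salt, 600_000)

def verify(attempt: bytes) -> bool:
    candidate = hashlib.pbkdf2_hmac("sha256", attempt, salt, 600_000)
    return hmac.compare_digest(candidate, stored)  # constant-time comparison

print(verify(b"correct horse battery staple"))  # True
print(verify(b"password123"))                   # False
```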

Companies with lots of sensitive information should make strenuous efforts to protect it, including making sure that action plans to identify and fix vulnerabilities are tracked at board level. It shouldn’t be possible to lose track of serious vulnerabilities simply because key individuals have moved on. 

You should also hire an external company to test your security measures via penetration testing and vulnerability assessments. Whilst a data breach is not something any organisation wants to go through, it is something you need to be prepared for, just in case. It’s also important to make sure your Incident Response plan includes Crisis Communications, so you’re not scrambling around when the hackers strike. 

9. Data privacy fails that occurred through pure laziness...

In the space of six months, a security researcher found thousands of files from dozens of computers, phones and flash drives, most of which contained personal data from their previous owners. And all he had to do was scour second-hand stores for donated and refurbished tech!

The experiment revealed how problematic discarded technology can be. The researcher bought 85 devices for $650, and found over 366,300 files, including images and documents. After an analysis of each device, the researcher found email addresses, dates of birth, Social Security and credit card numbers, driver’s license data and passport numbers. 

However, it’s not just phones and computers that contain a wealth of personal information which needs to be disposed of. It was recently brought to the Vehicle Remarketing Association’s attention that used cars often still have satnav data and phone records, listing addresses and personal phone numbers.

What have we learnt?

It is increasingly common for organisations to allow employees to use their own devices at work, and this fail shows just how problematic that can be. Where organisations control the acquisition and disposal of devices, they are usually (though admittedly not always!) careful to ensure any personal data is removed. However, when individuals control their own devices it is far more likely that they will miss this essential step. Organisations remain responsible for the personal data they control, no matter who owns the hardware that stores it. 

Organisations must have a policy which outlines what happens to phones and laptops when they are either no longer in use, or are being given to another member of staff. Ensuring a device is stripped of all personal information before it’s rehomed is a process organisations need to have in place – especially if they don’t want that data falling into the wrong hands. Also remember, it’s not just phones – think of everything an employee has that can connect to the internet; company cars need to fall under this protocol too.

10. The data privacy fails that happened because of earwigging robots…

In 2018 a woman complained to Amazon after an Alexa device recorded a private conversation between her and her husband and sent it to a random number in their address book – all without their permission.

But Alexa isn’t the only robot that’s been up to no good. Imagine being woken up several times during the night by your hotel’s concierge. This is what happened to guests in a hotel in Japan. 

The hotel had to ‘fire’ half of its robot staff because they had so little regard for guests’ privacy. The in-room concierge would wake up loudly snoring guests: the snoring triggered the robot to ask “Sorry, I couldn’t catch that. Could you repeat your request?” several times a night.

What have we learnt? 

When introducing new technologies, it is important to fully test them to ensure they work as intended and are no more privacy intrusive than whatever they replace. 

Training your team in Privacy by Design ensures that privacy is baked into the products and services you choose and use right from the start of a project lifecycle. Introducing new technologies alone is not enough to solve this challenge – the crucial first step is changing employee mindsets towards privacy so they understand and believe in its importance.

We’ve learnt a lot about data privacy over the past 10 years, and it’s important that we learn the lessons from those who failed this time round so these mistakes aren’t repeated. The new privacy laws are a good first step, but good privacy management is an ongoing process that needs consistent review and updating – it’s not a tick-box exercise. So whilst it’s on your mind, double-check your security systems, refresh your Incident Response plan, refine your SAR process or book your employees’ next privacy training session. And remember, it’s never a good idea to ignore the ICO!

Let’s look back on this next decade as one full of privacy wins. 

Camilla Winlo is Director of Consultancy Services at DQM GRC

Camilla Winlo

She leads the Consultancy Services division of DQM GRC (part of GRC International), which was established in 1996 to give organisations “Confidence in Data”. Her phenomenal team of commercially minded, data-savvy specialists supports organisations to deliver privacy-conscious solutions to data-driven innovation challenges.

She has almost a decade of experience of commercialising regulatory change, including as part of the leadership team developing and launching three new financial services businesses.