Coronavirus tracking app has major security flaws


Serious security flaws have been discovered in the contact-tracing app currently being trialled by the NHS to help stop the spread of Covid-19.

A team of senior security researchers found several issues that could affect user privacy and even sabotage the app itself.

The app is currently undergoing testing in the Isle of Wight before a potential nationwide rollout, and has been touted by the UK government as a key weapon to help halt the outbreak.

NHS coronavirus app

The team behind the report was made up of independent researcher and lecturer Dr Chris Culnane and Vanessa Teague, CEO at Thinking Cybersecurity.

Among the "varied" issues uncovered by the pair were multiple weaknesses in the registration process that could allow attackers to steal encryption keys. 

This could then allow intruders to stop users from being notified if one of their contacts had tested positive for Covid-19, or even to send out fake alerts.

The app was also found to be storing unencrypted data on handsets that could potentially be used by law enforcement agencies to determine when two or more people met.

The team also found that the app generates a new random ID code for each user only once a day, whereas the rival system developed by Apple and Google generates a new code every 15 minutes for greater security. 

Apple and Google's system is designed to work across Android and iOS devices, using low-energy Bluetooth signals to build up a record of the people a user has been in contact with.
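To illustrate why rotation frequency matters, here is a minimal sketch of how frequently-rotating ephemeral IDs can be derived from a single daily key, loosely inspired by the Apple/Google design. The function names and the HMAC-based derivation are illustrative assumptions, not the actual protocol:

```python
import os
import hmac
import hashlib

ROTATION_SECONDS = 15 * 60  # a fresh broadcast ID every 15 minutes

def daily_key() -> bytes:
    """Generate a fresh random key for the day (hypothetical helper)."""
    return os.urandom(16)

def ephemeral_id(key: bytes, interval: int) -> bytes:
    """Derive the broadcast ID for a given 15-minute interval of the day.

    Because the derivation is one-way, observers who collect the broadcast
    IDs cannot link them to each other or recover the daily key.
    """
    return hmac.new(key, interval.to_bytes(4, "big"), hashlib.sha256).digest()[:16]

key = daily_key()
# Consecutive intervals produce unrelated-looking IDs, so a passive
# eavesdropper can only track a device for at most one interval.
assert ephemeral_id(key, 0) != ephemeral_id(key, 1)
```

Under a once-a-day rotation, by contrast, a single captured ID identifies the same device for a full 24 hours, which is the tracking weakness the researchers highlight.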

Teague and Culnane recommend that the NHS move from its current "centralised" approach, in which contact data is uploaded to and matched on a central server, to a "decentralised" one, in which the matching occurs on users' devices.

"There can still be bugs and security vulnerabilities in either the decentralised or the centralised models," Teague noted.

"But the big difference is that a decentralised solution wouldn't have a central server with the recent face-to-face contacts of every infected person.

"So there's a much lower risk of that database being leaked or abused."


The team said it had shared its findings with the National Cyber Security Centre (NCSC), which in turn, told the BBC it was already aware of most of the issues raised and was in the process of fixing them.

"It was always hoped that measures such as releasing the code and explaining decisions behind the app would generate meaningful discussion with the security and privacy community," an NCSC spokesperson said in a statement.

"We look forward to continuing to work with security and cryptography researchers to make the app the best it can be."

“This app was never going to be perfect right from the outset, but it is refreshing to hear that the government are listening to independent research and taking on the suggestions with the next revisions," commented Jake Moore, Cybersecurity Specialist at ESET.

"Like with many apps, the first version is rarely even of any use but it gets it onto people's phones, where they can easily roll out newer versions.

"Once the majority of people have the app, then its intentions are clearly going to have better effects. However, the biggest issue is around the distinct lack of legislation protecting this data. Not knowing if and how the data could be used in the future – or even if it will be deleted – is important to the users. It is vital that the public’s privacy is at the core. Without this, the public may turn its back on the app before it has had enough time to roll out to the correct amount of people and come into any sort of effect.”


Mike Moore
Deputy Editor, TechRadar Pro

Mike Moore is Deputy Editor at TechRadar Pro. He has worked as a B2B and B2C tech journalist for nearly a decade, including at one of the UK's leading national newspapers and fellow Future title ITProPortal, and when he's not keeping track of all the latest enterprise and workplace trends, can most likely be found watching, following or taking part in some kind of sport.