Could brainwaves replace your passwords?


Moving beyond the traditional password in search of stronger security has led us down the path of biometrics, and even as we travel that avenue, things are branching out swiftly into the use of the brain itself as a method of authentication – although there are some major privacy concerns here.

Yes, forget fingerprint sensors, iris scanning or facial recognition logins: the future could see folks authenticating with their brainwaves via an EEG (electroencephalogram) headset on their noggin. This is an extremely accurate way of verifying a user's identity, and it has other benefits besides.

Abdul Serwadda, an assistant professor in the department of computer science at Texas Tech University, is carrying out research into how brainwaves can be used for effective authentication – and how they might be misused.

On the plus side, one of this system's major strengths is that it can continuously verify a user. With a typical login, you authenticate with your password, fingerprint or whatever, and that's it – away you go.

If, for example, you leave your computer suddenly for an emergency and someone else hops onto the machine, they'd have full access. But with brainwaves we're talking about behavioural authentication: the system continually reads the brain to verify the user, so it always knows that the person in front of the device is the rightful owner.
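
To make the idea of continuous verification concrete, here's a minimal sketch in Python. The EEG capture and feature extraction are stubbed out with random numbers, and the matching rule is a simple distance threshold against an enrolled template; it illustrates the re-checking loop, not any particular published system.

```python
import time
import numpy as np

# Illustrative sketch only: EEG capture and feature extraction are faked with
# random data, and the matching rule is a simple distance threshold rather
# than any specific published method.

FEATURE_DIM = 32        # features extracted from one short EEG window
MATCH_THRESHOLD = 7.0   # arbitrary cut-off, tuned only for the fake data below


def read_eeg_window():
    """Stand-in for capturing a short window of EEG data from a headset."""
    return np.random.randn(FEATURE_DIM)


def enroll(num_windows=20):
    """Build a simple template: the mean feature vector over several windows."""
    samples = np.stack([read_eeg_window() for _ in range(num_windows)])
    return samples.mean(axis=0)


def window_matches(template, features):
    """Accept the window if it lies close enough to the enrolled template."""
    return np.linalg.norm(features - template) <= MATCH_THRESHOLD


def continuous_verification(template, interval_seconds=1, checks=5):
    """Re-check the user every few seconds instead of authenticating just once."""
    for _ in range(checks):
        if not window_matches(template, read_eeg_window()):
            print("User no longer verified - locking session")
            return
        print("User still verified")
        time.sleep(interval_seconds)


if __name__ == "__main__":
    user_template = enroll()
    continuous_verification(user_template)
```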

In case you think this all sounds rather ambitious for contemporary times, Serwadda points out that several research groups in the US have recently shown off systems using brainwave authentication, and he notes: "The EEG has become a commodity application. For $100 (around £80, AU$130) you can buy an EEG device that fits on your head just like a pair of headphones."

Of course, any kind of widespread adoption for authentication purposes is still a good way off, and that's possibly just as well, given the aforementioned privacy concerns around the data that these sorts of setups record from users.

Template troubles

The worry is that the EEG 'authentication templates' built up around a user could be hacked from the system, and the data therein could potentially reveal information about the user's emotional state, characteristics and so forth – such as whether they might be an alcoholic, an example cited by the assistant professor.

It's not just hackers, or indeed brainwave authentication specifically, that we need to worry about, but also simple apps. As Serwadda told Digital Trends: "You don't even have to go to hackers to find who will abuse this. The app developer who posts a brain measuring app on the market is the first guy who might abuse this. The app can do whatever it likes with the data and communicate over the internet to send 'its findings' to the malicious app developer."

Brain-sensing apps of this sort are already out there, available to download.

At any rate, it's clear that this is an area where researchers and developers need to tread carefully. Serwadda's research is all about emphasising this fact, and underlining that EEG authentication systems need to be designed to perform their intended task – user verification – while collecting as little information as possible that could reveal personal details about the user.

The Texas Tech University article outlining his findings notes that these brain-reading systems can be tweaked to hugely reduce the amount of potentially sensitive data they collect, with only a slight reduction in the performance of the authentication itself.
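
The article doesn't spell out exactly how that tweaking works, but the general idea can be sketched as a feature-selection trade-off: score each EEG feature by how useful it is for telling users apart and by how much it leaks about sensitive traits, then build the authentication template only from features that stay within a privacy 'budget'. The scores and the selection rule below are illustrative placeholders, not Serwadda's actual method.

```python
import numpy as np

# Illustrative sketch only: the per-feature scores are random placeholders,
# and the budget rule is a generic heuristic, not the method from the study.

rng = np.random.default_rng(0)
NUM_FEATURES = 64

# How much each feature helps distinguish one user from another (higher = better).
auth_usefulness = rng.random(NUM_FEATURES)

# How strongly each feature correlates with sensitive traits (higher = worse).
privacy_leakage = rng.random(NUM_FEATURES)


def select_features(auth_scores, leak_scores, leak_budget=0.3):
    """Drop features that leak too much, then rank the rest by usefulness."""
    allowed = np.where(leak_scores <= leak_budget)[0]
    return allowed[np.argsort(auth_scores[allowed])[::-1]]


selected = select_features(auth_usefulness, privacy_leakage)
print(f"Using {len(selected)} of {NUM_FEATURES} features for authentication")
print("Top feature indices:", selected[:10].tolist())
```

Shrinking the feature set this way is where the 'slight reduction' in performance would come from: fewer features means a little less discriminating power, but far less private information flowing through the system.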
