Ever get the feeling that your voice assistant of choice isn't doing what you actually want it to do? Well, your dodgy accent might not be to blame: a group of researchers has found ways to manipulate the likes of Siri and Alexa using white noise and commands inaudible to us puny humans.
According to the New York Times, researchers at the University of California, Berkeley, have revealed how they managed to embed commands directly into music or spoken word that could potentially get a voice assistant to add something to a shopping list or control parts of a phone.
So far this has only been done in lab conditions, but speaking about the findings, which follow on from a 2016 study by the same team, Nicholas Carlini, one of the paper's authors, said: “My assumption is that the malicious people already employ people to do what I do.”
Carlini went on to note: “We want to demonstrate that it’s possible, and then hope that other people will say, ‘Okay this is possible, now let’s try and fix it.’”
Apple, Amazon and Google have all responded to the New York Times' article, saying that steps are always taken to make sure their smart speakers are secure. Apple, for instance, noted that the HomePod is designed to "prevent commands from doing things like unlocking doors," and that "iPhones and iPads must be unlocked before Siri will act on commands".
The research does highlight the potential issues with voice assistants going forward, though, and how important it is that companies make sure voice control goes through the same security measures as any other operating system.
- The best Amazon Echo: all the smart speakers compared