Criminals could use 'skill squatting' to hijack your smart speaker

(Image: Amazon Echo speaker)

Researchers have demonstrated how crooks could use the idiosyncrasies of voice recognition to carry out unwanted commands on a smart speaker. 

A team from the University of Illinois showed that by giving a malicious Alexa skill a name that sounds the same as a legitimate one, criminals could trick a device into triggering it – a tactic the researchers call 'skill squatting'.

The words didn't even have to be exact homophones. Results varied depending on the speaker's accent and gender, but the team found that 'coal' was easily misinterpreted as 'call', 'dime' as 'time' and 'wet' as 'what'.

There are already signs of this happening on the Alexa Skills Store: both 'cat facts' and 'cat fax' give information about cats, but from different providers.
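To get a rough feel for why sound-alike names slip through, here's a minimal Python sketch. It is not the researchers' method (they reportedly worked from words Alexa was actually observed to mis-transcribe); it simply compares Soundex phonetic codes, a crude spelling-based encoding, for some of the word pairs from the study. The soundex function below is written here for illustration.

# A rough sketch, NOT the method from the study: compare Soundex phonetic
# codes to see which skill-name words could plausibly be confused.
def soundex(word: str) -> str:
    """Return the classic four-character Soundex code for a single word."""
    codes = {}
    for letters, digit in [("BFPV", "1"), ("CGJKQSXZ", "2"), ("DT", "3"),
                           ("L", "4"), ("MN", "5"), ("R", "6")]:
        for letter in letters:
            codes[letter] = digit
    word = word.upper()
    result = [word[0]]
    prev = codes.get(word[0], "")
    for ch in word[1:]:
        if ch in "HW":              # H and W don't separate repeated codes
            continue
        digit = codes.get(ch, "")   # vowels map to "" and reset the run
        if digit and digit != prev:
            result.append(digit)
        prev = digit
    return ("".join(result) + "000")[:4]

# Word pairs the study found Alexa was prone to confuse
for a, b in [("coal", "call"), ("wet", "what"), ("dime", "time")]:
    match = "same code" if soundex(a) == soundex(b) else "different codes"
    print(f"{a} ({soundex(a)}) vs {b} ({soundex(b)}): {match}")

Running this flags 'coal'/'call' and 'wet'/'what' as sharing a code, but not 'dime'/'time' – a reminder that real confusions depend on the recogniser's actual error patterns, not just spelling-based phonetics, which is part of why squatting opportunities are hard to screen out automatically.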

Sounds suspicious

The principle is much like domain squatting (also called cybersquatting). Domain squatters register domain names that are identical or confusingly similar to those used by real companies, then use them to trick people into viewing their own content, or offer to sell the domains back to the business at an inflated price.

The university's researchers used Amazon Alexa, but the same principle could apply to other voice-activated assistants, including Google Assistant, Siri and Cortana. It's a thorny problem, and as voice recognition is integrated into ever more products, it will become increasingly important to solve.

Via Ars Technica

Cat Ellis
