Voice deepfakes are getting easier to spot

New research has shown that voice deepfakes are becoming easier to identify as synthetic recreations of real voices, thanks to the anatomy of the human vocal tract.

Researchers at the University of Florida have devised a method of simulating images of a human vocal tract’s apparent movements while a voice clip - real or fake - is played back. 

Professor of Computer and Information Science and Engineering Patrick Traynor and PhD student Logan Blue wrote that they and their colleagues found that simulations prompted by voice deepfakes weren’t constrained by “the same anatomical limitations humans have”, with some vocal tract measurements having “the same relative diameter and consistency as a drinking straw”.
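The core idea behind the detection method can be sketched in a few lines: estimate the dimensions of the vocal tract that would have been needed to produce a voice clip, then flag any clip whose estimates fall outside what human anatomy can achieve. The sketch below is illustrative only; the anatomical bounds and function names are assumptions for demonstration, not the researchers' actual implementation.

```python
# Illustrative sketch of anatomical-plausibility checking for voice clips.
# The diameter bounds below are assumed values for illustration only.

# Rough plausibility range for human vocal tract segment diameters, in cm.
HUMAN_DIAMETER_RANGE_CM = (0.5, 4.0)

def looks_synthetic(estimated_diameters_cm):
    """Return True if any estimated vocal tract segment diameter is
    anatomically implausible for a human speaker."""
    low, high = HUMAN_DIAMETER_RANGE_CM
    return any(d < low or d > high for d in estimated_diameters_cm)

# A real recording tends to imply plausible, varied segment diameters...
print(looks_synthetic([1.2, 2.5, 3.1]))  # False
# ...while a deepfake can imply narrow, straw-like, uniform diameters,
# as the researchers describe.
print(looks_synthetic([0.3, 0.3, 0.3]))  # True
```

In practice the hard part, which this sketch omits, is the acoustic modelling that recovers those diameter estimates from audio in the first place; the check itself is comparatively simple.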

Deepfakes and the media literacy problem

Though scientists are starting to spot voice deepfakes with simulation and anatomical comparison, the risk of an ordinary person being tricked by any deepfake - which could lead to identity theft - remains a problem. 

Ordinary people don’t yet have access to these tools, and even when they do, they will struggle to interpret the results until intuitive, widely available audio-based detection tools materialize.

Because it’s so hard for ordinary eyes and ears to spot deepfakes, expert advice on doing so isn’t widely known or available yet. People are also less primed to be healthily critical of what they see and hear over the internet, the phone, or any medium that puts a level of disconnect between them and what’s really happening.

“Disbelief by default”, where people become skeptical about everything they see and hear that isn’t from a trusted source, is a useful tactic here. The problem within the problem is that not everyone will adopt that strategy, whether because they don’t understand the threat or because they refuse to engage with it.

Media literacy has been a vital skill for a number of years now, as anyone can come across election disinformation or baseless conspiracy theories, but few schools teach it, and bridging this skill gap in adults remains an unsolved problem.

That skill gap is how fake news proliferated and embedded itself in our societies and in our relationships with loved ones. For this reason, anyone concerned about the media literacy of those close to them should consider investing in identity theft protection for families.

The rise of the convincing audiovisual deepfake has once again raised the need for a structured, widespread program to educate users in media literacy, and the importance of applying critical thinking to anything with only the thinnest veil of authenticity around it.

Via The Conversation

Luke Hughes
Staff Writer

Luke Hughes holds the role of Staff Writer at TechRadar Pro, producing news, features and deals content across topics ranging from computing to cloud services, cybersecurity, data privacy and business software.