This scary AI breakthrough means you can run but not hide – how AI can guess your location from a single image
AI knows where you’ve been
There’s no question that artificial intelligence (AI) is in the process of upending society, with ChatGPT and its rivals already changing the way we live our lives. But a new AI project has just emerged that can pinpoint where almost any photo was taken – and it has the potential to become a privacy nightmare.
The project, dubbed Predicting Image Geolocations (or PIGEON for short), was created by three students at Stanford University and was designed to help find where images from Google Street View were taken. But when fed personal photos it had never seen before, it was still able to find their locations, usually with a high degree of accuracy.
Speaking to NPR, Jay Stanley of the American Civil Liberties Union warned that the technology has serious privacy implications, from government surveillance to corporate tracking and stalking. For instance, a government could use PIGEON to find dissidents or see whether you have visited places it disapproves of. Or a stalker could employ it to work out where a potential victim lives. In the wrong hands, this kind of tech could wreak havoc.
Motivated by those concerns, the student creators have decided against releasing the tech to the wider world. But as Stanley points out, that might not be the end of the matter: “The fact that this was done as a student project makes you wonder what could be done by, for example, Google.”
A double-edged sword
Before we start getting the pitchforks ready, it’s worth remembering that this technology might also have a range of positive uses, if deployed responsibly. For instance, it could be used to identify places in need of roadworks or other maintenance. Or it could help you plan a holiday: where in the world could you go to see landscapes like those in your photos? There are other uses, too, from education to monitoring biodiversity.
Like many recent advances in AI, it’s a double-edged sword. Generative AI can be used to help a programmer debug code to great effect, but could also be used by a hacker to refine their malware. It could help you drum up ideas for a novel, but might assist someone who wants to cheat on their college coursework.
But anything that can identify a person’s location in this way could pose a serious threat to personal privacy – and have big ramifications for social media. As Stanley pointed out, it has long been possible to strip geolocation data from photos before you upload them. Now, that might not matter anymore.
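If you want to scrub that metadata anyway, it’s a simple job. Here’s a minimal sketch using Python and the Pillow imaging library; the file names are hypothetical, and note that re-saving an image this way discards all of its EXIF metadata, GPS tags included.

```python
# A minimal sketch of stripping metadata (including GPS location tags)
# from a photo before sharing it. Assumes the Pillow library is
# installed (pip install Pillow); the file names are hypothetical.
from PIL import Image

def strip_metadata(src_path: str, dst_path: str) -> None:
    """Re-save an image with its pixel data only, leaving EXIF behind."""
    with Image.open(src_path) as img:
        clean = Image.new(img.mode, img.size)
        clean.putdata(list(img.getdata()))  # copies pixels, not metadata
        clean.save(dst_path)

strip_metadata("holiday_photo.jpg", "holiday_photo_clean.jpg")
```

The catch, and the reason Stanley’s point lands, is that PIGEON doesn’t need that metadata at all: it works out a location from the content of the image itself.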
What’s clear is that regulation is needed to prevent wider abuses, and that the companies building AI tech must work to limit the damage their products can cause. Until that happens, it’s likely we’ll continue to see concerns raised over AI and its abilities.