ChatGPT’s free macOS app had a big, worrying security hole
A huge privacy slip-up, but it's been fixed
One of the key concerns raised by the rise of artificial intelligence (AI) is what it all means for user privacy. With data leaks and copyright infringement seemingly rampant, a lot of people are worried that their information might end up in the wrong hands.
Those concerns won’t be eased by the news that the recently released ChatGPT app for macOS was caught storing user conversations in plain text, potentially making them available to any other app (or user) on the Mac (via AppleInsider). The issue was present from the app’s June 25 release until it was patched on June 28, and was first spotted by developer Pereira Vieito, who detailed the vulnerability on Threads.
Apple’s guidelines state that apps should keep data in ‘sandboxes’ to ensure that nothing is available to other apps without explicit user permission (this also covers access to your photos, calendar details, text messages and more). However, it seems that ChatGPT’s developer OpenAI ignored this and simply stored conversations in an unencrypted, freely readable form.
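To make that distinction concrete, here’s a minimal Swift sketch of one common way a macOS process can check whether it’s running inside the App Sandbox. The APP_SANDBOX_CONTAINER_ID environment variable and the Containers path check are heuristics rather than an official API, so treat this as illustrative only.

```swift
import Foundation

// Minimal sketch (not an official API): two heuristics for checking whether
// the current process is running inside the macOS App Sandbox. Sandboxed apps
// are launched with an APP_SANDBOX_CONTAINER_ID environment variable and have
// their home directory redirected into ~/Library/Containers/<bundle id>/.
func appearsSandboxed() -> Bool {
    if ProcessInfo.processInfo.environment["APP_SANDBOX_CONTAINER_ID"] != nil {
        return true
    }
    // Fallback heuristic: a sandboxed app's "home" lives inside its container.
    return NSHomeDirectory().contains("/Library/Containers/")
}

print(appearsSandboxed()
    ? "Sandboxed: files go into a private container other apps can't read."
    : "Not sandboxed: files in ~/Library can be read by other apps you run.")
```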
It’s not only other apps that could potentially have accessed your conversations with ChatGPT – any Mac malware that made it onto your computer would have been able to sweep up everything you had typed into ChatGPT, too. Considering some of the sensitive information people enter into the chatbot, that could have had very serious consequences.
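To illustrate why that matters, here’s a purely hypothetical Swift sketch of how any process running under your user account could hoover up files an app has left unprotected. The folder name below is invented for illustration; it is not the path the ChatGPT app actually used.

```swift
import Foundation

// Purely hypothetical sketch: any process running as the same user can read
// another app's unprotected files. "ExampleChatApp" is a made-up folder name.
let dataDir = FileManager.default
    .homeDirectoryForCurrentUser
    .appendingPathComponent("Library/Application Support/ExampleChatApp")

if let files = try? FileManager.default.contentsOfDirectory(
        at: dataDir, includingPropertiesForKeys: nil) {
    for file in files {
        // Plain-text storage means a simple read exposes the full contents.
        if let text = try? String(contentsOf: file, encoding: .utf8) {
            print("\(file.lastPathComponent): \(text.prefix(80))…")
        }
    }
}
```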
What's notarization?
When a Mac app is submitted to Apple’s App Store, it goes through App Review, which checks it against various criteria, including that its data is properly sandboxed and inaccessible to outside apps. Apps distributed outside the App Store instead go through a lighter, automated process called notarization, in which Apple scans them for malware and checks their code signature, but doesn’t require sandboxing.
That’s the problem here: the ChatGPT Mac app is distributed from OpenAI’s website, not the App Store, so it was never bound by the App Store’s sandboxing requirement, allowing this situation to arise.
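Since Apple isn’t enforcing sandboxing for apps you download directly, you can check for yourself. Here’s a rough Swift sketch that shells out to the built-in codesign tool and looks for the com.apple.security.app-sandbox entitlement in an app bundle; the exact output format varies between macOS versions, so the simple string match is an assumption rather than a robust parser, and the Safari path is just a placeholder for whichever app you want to inspect.

```swift
import Foundation

// Rough sketch: ask the built-in `codesign` tool to dump an app's entitlements
// and look for the App Sandbox key. Output formatting differs across macOS
// versions, so the string search below is a heuristic, not a parser.
func looksSandboxed(appPath: String) throws -> Bool {
    let process = Process()
    process.executableURL = URL(fileURLWithPath: "/usr/bin/codesign")
    process.arguments = ["-d", "--entitlements", "-", appPath]

    let pipe = Pipe()
    process.standardOutput = pipe
    process.standardError = pipe
    try process.run()
    process.waitUntilExit()

    let data = pipe.fileHandleForReading.readDataToEndOfFile()
    let output = String(decoding: data, as: UTF8.self)
    return output.contains("com.apple.security.app-sandbox")
}

// Placeholder path: swap in the app bundle you want to inspect.
print((try? looksSandboxed(appPath: "/Applications/Safari.app")) == true
    ? "App Sandbox entitlement found"
    : "No App Sandbox entitlement found (or the check failed)")
```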
In a statement to The Verge, OpenAI said that “we are aware of this issue and have shipped a new version of the application which encrypts these conversations. We’re committed to providing a helpful user experience while maintaining our high security standards as our technology evolves.”
While ChatGPT’s Mac app is the culprit in this instance, in theory any app distributed outside the App Store, and therefore not bound by its sandboxing rules, could be similarly guilty. It’s a reminder that you should only install apps you trust, and even then, take reasonable precautions not to divulge anything too private, lest a similar situation arise with a different app.