ExpressVPN uncovers 3.7 million items of leaked AI chatbot data. A reminder of how vital encryption is
Leaked data points included voice and text messages, and even private audio recordings lasting up to 4 hours
- ExpressVPN uncovered massive amounts of AI chatbot leaked data
- The databases were not encrypted
- ExpressVPN urges users to be vigilant
If you heard that up to 3.7 million pieces of private user data had been made public, you might well assume it was the report of a major hack. However, a recent investigation published by ExpressVPN proves just how easy it is to lose your privacy when basic security measures such as password protection and encryption aren’t in place.
Conducted by cybersecurity researcher Jeremiah Fowler, the report uncovered a case in which massive amounts of customer data were leaked from AI-powered chatbots used by retailers for customer service.
If you're on this page, chances are that the best VPNs are already protecting your digital privacy while you browse or stream content online, thanks to their top-notch encryption features.
But when a retailer or third-party service hasn’t taken adequate measures to protect your data, even the most tech-savvy users may not realize the enormous risks they are exposed to if leaked information falls into the wrong hands.
The findings
Fowler discovered three separate publicly accessible databases that were neither password-protected nor encrypted and contained 3.7 million records, including personal data such as email and home addresses, and phone numbers.
To illustrate the scale of the exposure, an initial sampling alone included 1,422,577 customer audio recordings. Even at a glance, the data comprised text transcripts totaling 3.9TB, 207,381 Excel files, and audio recordings totaling 415.2GB.
The limited sample contained transcripts and audio files belonging to Sears Home Services, a US retail and repair business that uses AI chatbots in English and Spanish to automate scheduling, phone calls, and online chats.
The files contained 54,359 complete transcripts of the conversations customers had with AI chatbots and their corresponding audio recordings.
Fowler pointed out that the system also continued to record audio files if the customer had not hung up properly, meaning that the audio files contained up to four hours of background conversation and vast amounts of biometric voice data.
The expert provided an overview of the data presented, sharing screenshots of filesystem structures and the types of files they contained. These illustrated how the data could be accessed, including how audio files could be played in any web browser and the convenient user interfaces provided for interacting with the filesystem.
How to stay safe
While Fowler stated that public access to the data was immediately restricted after he sent a responsible disclosure notification to Sears Home Services' parent company, Transformco, he remained concerned.
The investigation highlighted that, with AI-driven automation capable of storing massive amounts of highly sensitive data, there is a significant risk that some companies will act irresponsibly and leave user data exposed. That's a bleak scenario, given that deepfake-enabled fraud losses are forecast to reach US$40 billion by 2027.
This vast amount of data could enable hackers to link identities or replicate users’ digital profiles for criminal purposes; in such cases, virtual private network (VPN) tools prove useless if the weak link is the very company to which you have voluntarily entrusted your data via chatbots or other apps.
ExpressVPN urges users to remain vigilant and offers practical advice, including using strong passwords and taking extra precautions in sensitive situations.
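A password manager is the easiest way to follow the strong-password advice, but for illustration, here is a minimal sketch of how a random, high-entropy password can be generated with Python's standard `secrets` module (the `generate_password` helper name and the 20-character default are arbitrary choices for this example):

```python
import secrets
import string

def generate_password(length=20):
    """Return a random password drawn from letters, digits, and punctuation.

    secrets uses the operating system's cryptographically secure
    random source, unlike the general-purpose random module.
    """
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return ''.join(secrets.choice(alphabet) for _ in range(length))

print(generate_password())
```

Because each password is generated independently and never reused, a leak at one service can't be replayed against your other accounts.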
Also, be cautious when receiving unsolicited emails, text messages, or phone calls that reference information you may have previously shared with a company or service.
And with the rise in voice cloning scams, agree on a password with family and friends to use in the unlikely event that you receive a call from them asking for money or help.

Silvia Iacovcich is a tech journalist with over five years of experience in the field, including AI, cybersecurity, and fintech. She has written for various publications focusing on the evolving regulatory landscape of AI, digital behavior, web3, and blockchain, as well as social media privacy and security regulations.