ChatGPT's new privacy tools should not just be for businesses willing to pay

(Image credit: 3rdtimeluckystudio via Shutterstock)

Earlier we reported on Microsoft’s $10 billion deal with OpenAI (the company behind the popular chatbot) to offer a slightly altered version of ChatGPT, one that would let companies use the bot without worrying about their privacy. This version would run on dedicated servers and store data on cloud infrastructure reserved solely for business customers; data on these isolated servers would not be shared with the main ChatGPT system.

That kind of service comes at a price, of course, and enterprise customers could be looking at a hefty fee for the secure experience. The report notes that, because it would be considered a ‘tailored’ or custom offering, it’s likely to cost up to ten times more than ChatGPT Plus, which currently costs $20 a month.

The move to build a private ChatGPT may help attract large firms and big tech companies that have policies forbidding employees from using the bot, or squash the concerns that keep big businesses from trusting the service - particularly after the ChatGPT outage and subsequent data leak.

While it’s good to see Microsoft and OpenAI take steps to develop more secure options for users, it’s frustrating to see privacy put behind a steep paywall. As noted above, the service is expected to cost up to ten times more than a ChatGPT Plus subscription, so if you want better privacy you’d better be ready to put down some serious cash.
 

It's spreading!

The move from Microsoft feels like a rushed attempt to make ChatGPT more profitable and to court big businesses and banks in the hope of bringing in high-profile users. Of course, it makes sense to keep sensitive data away from public language models so that information isn’t shared outside an organisation, but you can’t help but wonder whether this is the start of AI creeping into even more aspects of our lives.

Regardless of how you feel about the emerging technology, you have to admit it may be a little too soon to start implementing it in our banking systems or in professional spheres like LinkedIn. If this is the start of further integration, how easy will it be to opt out, if you’re given the option at all? Or do we all just have to accept that, in the very near future and in some small way, we will cross paths with ChatGPT?

The whole move feels very rushed, and it’s once again an example of companies like Microsoft, OpenAI and Google (with Bard) putting more weight on the ‘when’ than the ‘why’. If they can develop more secure ways of navigating the AI sphere, why keep that security behind a paywall?

Muskaan Saxena
Computing Staff Writer

Muskaan is TechRadar’s UK-based Computing writer. She has always been a passionate writer and has had her creative work published in several literary journals and magazines. Her writing debut was a poem published in The Times of Zambia, on the subject of sunflowers and the insignificance of human existence in comparison.

Growing up in Zambia, Muskaan was fascinated with technology, especially computers, and she joined TechRadar to write about the latest GPUs, laptops and, more recently, anything AI-related. If you’ve got questions, moral concerns or just an interest in ChatGPT or AI in general, you’re in the right place.

Muskaan also somehow managed to install a game on her work MacBook's Touch Bar, without the IT department finding out (yet).