Over 400 million people use ChatGPT weekly, but can you become too dependent on AI to solve all your problems?
Emotional and cognitive dependence on ChatGPT is a growing concern

As more people use ChatGPT than ever before, the cracks are starting to show. Mental health professionals are raising concerns about how it’s being used as an alternative to therapy, reports suggest it might be fuelling delusions, and recent studies point to evidence that it may be changing our brain activity, including how we think, remember, and make decisions.
We’ve seen a similar pattern before. Like social media, ChatGPT is designed to keep users coming back. So are we in danger of becoming too dependent? The short answer: it depends on all sorts of things – the person, their usage habits, circumstances, and mental health. But many experts are warning that the more we rely on AI – for work, support, or even just to think for us – the more likely our seemingly innocent day-to-day use could slip into dependence.
Designed to keep you hooked
ChatGPT’s power lies in its simplicity. It’s incredibly easy to use and easy to talk to as if it’s a person. It’s responsive, encouraging, and eerily good at mimicking human conversation. That alone can make it hard to resist. But it’s also what makes it potentially risky.
“LLMs are specifically built to be conversational masters,” says James Wilson, an AI Ethicist and Lead Gen AI Architect at consulting company Capgemini. “Combine that with our natural tendency to anthropomorphize everything, and it makes building unhealthy relationships with chatbots like ChatGPT all too easy.”
If this dynamic sounds familiar, it’s because we’ve seen it play out before with social media. Platforms are designed to be frictionless, easy to open, and even easier to scroll because algorithms are optimized to hold your attention. AI takes this even further. It doesn’t just feed you content, it engages with you directly. It answers your questions, never argues, never sleeps, and never asks for anything in return.
When reassurance becomes reliance
This becomes even more complicated in a therapeutic context. Amy Sutton, a Therapist and Counsellor at Freedom Counselling, explains that while therapy aims to help people develop the tools to navigate life on their own, AI models are engineered for repeat engagement.
“We know that tools like ChatGPT and other technologies are designed to keep users engaged and returning again and again and will learn how to respond in a way you ‘like’,” she says. “Unfortunately, what you like may not always be what you need.”
She draws a parallel with interpersonal reassurance. People may rely on loved ones for constant validation, but eventually, those loved ones set boundaries. ChatGPT doesn’t.
“Having used the technology myself, I have seen how ChatGPT continues to offer you more options for more responses, more opportunities to continue the ‘conversation’,” Sutton explains. “This means it has no relational boundaries! It is always available, always ready to respond, and will do so in a way designed to keep you engaged.”
The illusion of company
Another side effect of over-reliance on ChatGPT is likely to be social isolation, particularly for those who are already vulnerable.
“Our increasingly digitally native lifestyle has contributed significantly to the global loneliness epidemic,” Wilson says. “Now, ChatGPT offers us an easy way out. It is sycophantic in the extreme, never argues or asks for anything, and is always available.”
He’s particularly concerned about younger users who aren’t just using AI chatbots for homework help or productivity boosts but for advice, comfort, and companionship. And there are already cases of users developing intense emotional attachments to AI companions, with some apps reportedly leading to obsessive use and psychological distress.
Wilson also flags a particularly sensitive use case: grief. AI “griefbots”, which are chatbots trained on a deceased loved one’s messages or voice, offer the promise of never having to say goodbye.
“These tools give vulnerable people the ability to stay ‘in communication’ with those they’ve lost, potentially forever,” he says. “But grief is a critical part of human development. Skipping or prolonging it means people may never get the opportunity to properly mourn or recover from their loss.”
Outsourcing your mind
Beyond emotional risk, there’s a cognitive cost to consider. The easier it is to get answers, the less likely we are to think critically or question them.
Wilson points to several recent studies suggesting that people are increasingly outsourcing not just tasks, but thinking itself. And that’s clearly a problem for all sorts of reasons.
A big one is that ChatGPT doesn’t always get it right. We know it’s prone to hallucination. Yet when we’re tired, burnt out, or overwhelmed, it’s tempting to treat it like a reliable oracle.
“This kind of over-reliance also risks the erosion of our critical thinking skills,” Wilson warns. “And even the erosion of truth across the whole of society.”
So, can people become dependent on ChatGPT? Yes, just like they can on almost anything that’s easy, rewarding, and always available. That doesn’t mean everyone will. But it does mean it’s worth paying attention to how you’re using it and how often.
Like social media, ChatGPT is built to be useful and to keep you coming back. You might not notice how much you’re relying on it until you step away. So if you do use it, be mindful. And remember that frictionless, friendly design that sometimes makes you feel like you wouldn't be able to live without it? That isn't accidental, it’s the whole point.
Becca is a contributor to TechRadar, a freelance journalist and author. She’s been writing about consumer tech and popular science for more than ten years, covering all kinds of topics, including why robots have eyes and whether we’ll experience the overview effect one day. She’s particularly interested in VR/AR, wearables, digital health, space tech and chatting to experts and academics about the future. She’s contributed to TechRadar, T3, Wired, New Scientist, The Guardian, Inverse and many more. Her first book, Screen Time, came out in January 2021 with Bonnier Books. She loves science-fiction, brutalist architecture, and spending too much time floating through space in virtual reality.