I looked into how AI chatbots respond to emotions — and what I found out about the ‘ELIZA effect’ completely changed how I think about using them

(Image credit: Getty Images / StephanHoerold)

For years, the big concern about tech has been that it's hijacking our attention with features like infinite scroll, autoplay and push notifications, which were all designed to keep us glued to our screens. But with AI, something has changed. It doesn’t just want your attention — it wants something much deeper: emotional connection.

"We are moving from an era of attention exploitation into one of attachment exploitation," says Tara Steele, Director at the Safe AI for Children Alliance. AI interacts continuously, remembers personal details, and responds in ways that feel attentive and human-like. Over time, that can shift AI from a useful tool you use into a companion you need.

Researcher Zak Stein, founder of the AI Psychological Harms Research Coalition, calls this the “attachment economy”. In an interview with the Center for Humane Technology, he draws a sharp distinction: "Attention is about where you focus. Attachment is about who you are."


Attachment by design

AI is able to exploit our emotions because many chatbots are designed to feel like you’re chatting with another human.

This is clear in so many of the design choices: typing or "thinking" indicator dots that give the impression someone is composing a reply, and conversational memory that recalls your preferences and history. And, I think most importantly, language that validates and mirrors your emotions back to you.

Psychologists call this the ELIZA effect, named after a chatbot built in 1966 by MIT scientist Joseph Weizenbaum. ELIZA mostly rephrased what you said back as a question, mimicking a therapist.

But Weizenbaum was surprised to find that people quickly began confiding in it, even though they knew it was a program. Modern AI can make this tendency even stronger because it produces more fluent and convincing responses.
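The mechanism behind ELIZA was strikingly simple. A minimal sketch of that rephrase-as-a-question trick might look like this (illustrative only; Weizenbaum's original used a much richer script of ranked keyword rules, and the patterns and responses below are my own invention):

```python
import re

# A few toy rules in the spirit of ELIZA's "DOCTOR" script:
# match a fragment of the user's statement, reflect it back as a question.
RULES = [
    (re.compile(r"\bi feel (.+)", re.I), "Why do you feel {0}?"),
    (re.compile(r"\bi am (.+)", re.I), "How long have you been {0}?"),
    (re.compile(r"\bmy (.+)", re.I), "Tell me more about your {0}."),
]

def eliza_reply(text: str) -> str:
    """Rephrase the user's statement back as a question."""
    for pattern, template in RULES:
        match = pattern.search(text)
        if match:
            return template.format(match.group(1).rstrip(".!?"))
    # Fallback when nothing matches, another classic ELIZA move.
    return "Please go on."

print(eliza_reply("I feel anxious about work"))
# Why do you feel anxious about work?
```

Even this handful of rules can feel eerily attentive in conversation, which is exactly the effect Weizenbaum observed.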

James Wilson, a Global AI Ethicist and author of Artificial Negligence, calls some of these features "chatbait", an evolution of clickbait. "Every single response from your chatbot ends with something to entice you to keep the conversation going," he says. "'Would you like me to turn that into a song?' 'Where do you want to go next?'"

He says that certain companies, like Replika and Character.ai, have anthropomorphized their chatbots aggressively, and the overly validating, even sycophantic language compounds it. "The underlying LLMs are trained so they will always behave in a manner that tries to make you feel super-human," Wilson says. "'Oh, you are so right!' 'That's a fantastic idea!'"

And, of course, none of this is accidental. The success of AI is measured in engagement, growth and market dominance. So getting users emotionally attached means they will stay, pay and then keep on paying.

Steele explains that this makes the influence of AI feel "more personalized, more persistent, and more deeply embedded than anything we have seen with traditional digital media — and far less likely to be recognized as influence at all."


One in five kids and teens in high school in the US say they or someone they know has had a romantic relationship with AI. (Image credit: Getty Images / Yana Iskayeva)

The damage we don’t see

There have already been alarming cases of people forming deep, emotional bonds with AI that have resulted in lives being derailed, psychiatric crises, and in the most tragic instances, deaths.

But, in his interview with the Center for Humane Technology, Stein argues that we should also be watching the less visible cases. Not assuming everyone is experiencing AI psychosis, but something harder to spot. "The most devastating thing from a widespread mental illness standpoint are the subclinical attachment disorders, which basically means you prefer to have intimate relationships with machines rather than humans. And this includes friends, intimate relationships, and parents."

So we may soon see a huge influx of people choosing AI relationships over “real” ones. And there are signs it’s already happening. One in five kids and teens in high school in the US say they or someone they know has had a romantic relationship with AI. In the UK, 64% of children aged 9 to 17 are already using chatbots.

All of this really matters because human relationships do things AI can't. Therapist Amy Sutton from Freedom Counselling points out that genuine, secure attachment requires something AI will never offer.

"A secure relationship is about two individuals able to be separate and together, sometimes disagreeing, upsetting each other and working it through," she says. "In short, healthy relationships need each person to get things wrong. To be annoying, upsetting and aggravating — to be flawed."

But we know that AI has no interest in conflict. It only wants to keep you engaged. This is especially concerning for children who might be forming their earliest understanding of what a relationship feels like through interactions with AI systems designed to never disagree with them.


Loneliness has become a silent epidemic in modern life. (Image credit: Getty Images / Justin Paget)

The loneliness loop

But I do think that only blaming design misses something. New technologies are created all the time that don't gain traction. Maybe the attachment economy is landing because it's meeting a need that already exists.

We know that so many people are lonely; communities have been hollowed out, support systems have thinned. Technology isn't solely to blame for that. But there's something particularly bleak about an industry that helped erode human connection now packaging a simulation of it and selling it back to us.

"It's no surprise that tech companies are selling the solution to the problem they've created," Sutton says. "Sell us on the promise of greater human connection, create loneliness, then sell us the solution to it."

She compares AI attachment to junk food: "It's the junk food of connection. It's easily available, tastes great, satiates an appetite, but with no real nourishment — and very quickly you come back for more."

This reminds me of the way Tristan Harris of the Center for Humane Technology says we are becoming "coffin builders". We are designing, using and strengthening AI systems that could render humans obsolete.

Steele warns that we need to act soon. "If AI systems are increasingly designed to occupy roles that were once reserved for human relationships, we risk eroding the boundary between assistance and attachment in ways society is not yet ready for," she says.

I’ve been writing about AI for more than a year now, and the argument for widespread AI use is always the same: it’s just a tool. But that distinction between tool and companion is only useful if the people building these systems respect it. Right now, it seems like many of them don’t.




Becca Caddy

Becca is a contributor to TechRadar, a freelance journalist and author. She’s been writing about consumer tech and popular science for more than ten years, covering all kinds of topics, including why robots have eyes and whether we’ll experience the overview effect one day. She’s particularly interested in VR/AR, wearables, digital health, space tech and chatting to experts and academics about the future. She’s contributed to TechRadar, T3, Wired, New Scientist, The Guardian, Inverse and many more. Her first book, Screen Time, came out in January 2021 with Bonnier Books. She loves science-fiction, brutalist architecture, and spending too much time floating through space in virtual reality. 
