‘We have to learn to embrace the imperfect nature of human solutions…’ — what we lose when AI starts doing all our thinking at work
The hidden mental cost of letting AI do too much of our thinking
The AI work dream we were sold went like this: use AI and work gets easier, days feel calmer, and your mind is finally free to focus on what matters. More interesting tasks, more creative thinking, more energy left for life outside work. But that promise is already starting to crack.
A 2025 MIT report suggests that around 95% of generative AI pilots inside companies are failing to deliver on their promises. Other research, including a collaboration between the Stanford Social Media Lab and BetterUp Lab, suggests AI tools can end up creating more work, not less.
Much of the conversation has focused on what this means for businesses and the bottom line. But as evidence also grows linking heavy AI use to weaker critical thinking and learning skills, a more urgent question emerges: what is using AI at work doing to our minds?
To explore that, I spoke to Ellen Scott, journalist, Digital Editor of Stylist and author of Working on Purpose. She’s spent years thinking deeply about modern work, its demands, contradictions, and emotional toll. Scott describes the effect AI is having on our working lives as “smoothout”: the gradual erosion of friction, challenge, and agency in the name of ease. I wanted to understand what smoothout looks like in practice, why it’s so common, and how we might resist it.
What is smoothout?
“Smoothout is a term I coined to describe a specific type of burnout that comes from overusing AI,” Scott explains. While it isn’t a formal medical diagnosis, she uses it to describe a pattern she’s increasingly seeing emerge around AI use at work, particularly as employers push wider adoption of generative tools.
Scott describes smoothout as “a cousin of burnout”, because the symptoms often overlap: tiredness, low mood, stress, fatigue, a loss of motivation. It also shares similarities with 'boreout', the mental health impact of being consistently understimulated at work.
But the key difference is the trigger. With smoothout, the cause isn’t overwork or boredom, but reliance on AI tools in place of challenge. “When we don’t have sufficient challenges, or the opportunity for the mental-health-boosting experience of mastery, our sense of accomplishment drops,” Scott explains. “We become disengaged, and negative stress symptoms are triggered.”
She says that the problem is how quickly we now turn to AI at the first hint of difficulty – especially at work. In doing so, we might lose something important. “We rid ourselves of the opportunity for challenge and the healthy form of stress associated with that, known as eustress,” she explains. “Smoothout happens when work and life become too smooth. Our brains crave friction.”
Why our brains need challenge
There’s a solid body of research suggesting our brains thrive on the right amount of challenge. Psychologist Mihaly Csikszentmihalyi, author of Flow: The Psychology of Optimal Experience, showed that people are most engaged and fulfilled when the challenge of a task matches their skill level. Too easy and we disengage; too hard and we shut down.
Educational theory echoes this. Psychologist Lev Vygotsky’s work on the Zone of Proximal Development shows that learning happens best when we’re stretched just beyond what we can do without help. Cognitive research on “desirable difficulties” reaches a similar conclusion – tasks that require effort lead to deeper learning than frictionless shortcuts.
One way to understand what’s happening with AI is through the idea of cognitive offloading: using tools or external systems to reduce mental effort. We’ve always done this – think writing notes, setting reminders, and using calculators. Offloading can be helpful because it frees up cognitive resources and allows us to focus elsewhere.
But AI isn’t just another notepad or calculator. It can think with us, or even instead of us, across a wide range of tasks. That’s why early research suggests the cognitive offloading that happens when we use AI may be fundamentally different, with potential consequences for learning, memory, and skill development.
How to prevent smoothout
The answer isn’t to stop using AI entirely, especially if it’s part of your job, but to think more carefully about when you reach for it.
“I’m not anti-AI in all cases,” Scott explains. “But it’s the over-reliance on it that isn’t good for our minds.” She says the key is whether AI is replacing the parts of work that challenge us, rather than supporting us through the parts that drain us.
“AI should be used to do the parts of work that aren’t beneficial for mental or physical wellbeing,” she says, like the monotonous, administrative tasks that add little meaning. Not the “meaty” parts of work that create challenge, engagement, and a sense of fulfilment.
Which tasks fall into each category will differ from person to person. As a writer and editor, Scott values the challenge of writing an introduction to an article, but doesn’t need the mental stretch of processing invoices. Using AI for the latter makes sense. Using it for the former does not.
“I care about writing. It’s the part of my job I most value, and my writing skills are something I always want to hone and develop. If I outsource that challenge, I deny myself the opportunity to do that,” Scott says.
Her advice is to ask yourself some simple but uncomfortable questions. Which tasks distract from the point of your job without offering challenge or meaning? You could use AI for those. And which ones actually matter to you? For those, resist the temptation to reach for AI, even when doing the work yourself feels uncomfortable.
Learning to notice, acknowledge, and accept discomfort is a big part of this process. “Even if it’s difficult, we need to push through and do certain things ourselves,” Scott says. “We have to learn to embrace the imperfect nature of human solutions, rather than defaulting to the smoother versions produced by AI.”
If the line isn’t clear, she also suggests paying attention to how you feel. Not just while you’re working, but afterwards. Are you engaged, depleted, restless, detached? Scott says that journaling can help people spot patterns between how they use AI and how their work affects them emotionally.
Underlying all of this is a bigger point: work does give many of us a sense of purpose. While plenty of people feel burned out or disconnected from their jobs, the idea that AI will remove work altogether and make us happier by default is likely misguided. “In my book, Working on Purpose, there's a section about why work is good for us and why I see fulfilling work as a human right,” Scott says. “I believe that bosses have a moral responsibility to ensure that the work they pay people to do is enjoyable, interesting, and fulfilling.”
And sometimes, smoothout might be a bigger signal you need to pay attention to. “If the bulk of your work can be done by AI with no real harm, that may be a sign you need a new challenge,” Scott says. Work shouldn’t be an impossible slog, but it shouldn’t be a frictionless slide either. The task now is to resist defaulting to AI at every point of difficulty and instead find the level of challenge that keeps you engaged, learning, and feeling mentally well.
Becca is a contributor to TechRadar, a freelance journalist and author. She’s been writing about consumer tech and popular science for more than ten years, covering all kinds of topics, including why robots have eyes and whether we’ll experience the overview effect one day. She’s particularly interested in VR/AR, wearables, digital health, space tech and chatting to experts and academics about the future. She’s contributed to TechRadar, T3, Wired, New Scientist, The Guardian, Inverse and many more. Her first book, Screen Time, came out in January 2021 with Bonnier Books. She loves science-fiction, brutalist architecture, and spending too much time floating through space in virtual reality.