Using ChatGPT on your smartwatch could go very wrong, and I'm never doing it

Woman using AI on her smartwatch
(Image credit: TiernyMJ / Shutterstock)

ChatGPT is coming for all aspects of our lives, and now its influence apparently extends to our physical health and wellbeing. And we should be concerned about this, given the AI chatbot’s questionable track record.

Recently, Amazfit announced that ChatGPT will arrive on its GTR 4 smartwatch (it’s unclear at the moment if it’ll land on any of Amazfit’s other best smartwatch models), labeled as ‘ChatGenius’. The demonstration showed the user asking ChatGenius questions on “How to improve running performance” and “how to improve sleep quality”, with the results shown in a readout accessible via the digital crown.

The answers were fairly top-line in the demonstration, with the running performance query answered with “focus on proper nutrition and hydration” and “make sure you’re getting enough sleep and taking regular breaks from running”. While this seems harmless enough, it’s a road that potentially leads to a dangerous destination.

At present, anyone can ask ChatGPT health-related questions on a computer or phone and get a similarly generic auto-generated answer. However, smartwatches can have a big influence on the user’s health and wellbeing, and – just like the user in the demonstration – smartwatch owners are more likely to use the feature as advertised, asking it for health and fitness tips. To me, that is a giant mistake.

Watch Amazfit's ChatGenius demonstration here:

You don’t have to look around the internet for long to find reams of dubious health and fitness advice, often from influencers on Instagram, YouTube, and TikTok. Whether you’ve got someone telling you to eat raw liver and animal organs, do kettlebell swings in an unorthodox way, run barefoot, or take untested supplements for big results, it’s easy to trip up and get hoodwinked.

When you’ve been training the right way for months and seeing frustratingly few results, it’s tempting to take the advice of someone who looks like a Greek god on social media, regardless of their qualifications – or lack thereof.

ChatGPT is no different. An AI gathering data from its users and dispensing fitness advice without oversight from qualified nutritionists and personal trainers ought to be cause for concern.

My colleagues have already tested the AI service to its limit – it failed to help program a game, and it even cheats at Tic-Tac-Toe – and found some gaping flaws in the current iteration of the technology. It should not be dispensing health advice that’s best left to qualified professionals, however innocuous that advice may seem at first.

ChatGPT on Amazfit GTR 4

(Image credit: Amazfit)

ChatGPT is fed information by crawling websites for data, and there is so much fitness noise out there that some misinformation will doubtless have been swept up in the crawl. Given how horrifically inaccurate the service can be, I would not trust ChatGPT to write an 18-week training plan to help a user run their first marathon, for example, or a diet plan to build muscle. Such things need to be properly vetted.

There are physical health considerations, sure, with the largely unregulated service potentially recommending dangerous or unhealthy practices, but mental health considerations come into play too. When ChatGPT begins a conversation with a user based on a “lose weight fast” query, it has no way of knowing whether that particular user suffers from anorexia or bulimia, for example.

If users question the service on the fastest way to build muscle, steroid recommendations are a risk – and how long will it take for the folks at Amazfit and ChatGPT to shut down this particular line of questioning?

At the moment, this video from Amazfit is all we have to go on when it comes to the smartwatch maker’s plans for ChatGenius. Based on the questions asked during the brief demonstration, there are clearly plans for ChatGPT to answer health and fitness-related queries. I hope there are some severe guardrails or limiters in place – otherwise, things are about to go very wrong, very quickly.

I wouldn’t trust a medical diagnosis from a doctor without a medical degree, and neither, I imagine, would you – that doctor would be unqualified. So why would you take fitness advice from something that doesn’t even have a physical body? ChatGPT is not qualified to answer these questions, and throwing up an answer cobbled together from snippets of online health advice, which varies massively in quality, can only lead to disaster.

Matt Evans
Fitness, Wellness, and Wearables Editor

Matt is TechRadar's expert on all things fitness, wellness and wearable tech. A former staffer at Men's Health, he holds a Master's Degree in journalism from Cardiff and has written for brands like Runner's World, Women's Health, Men's Fitness, LiveScience and Fit&Well on everything fitness tech, exercise, nutrition and mental wellbeing.

Matt's a keen runner, ex-kickboxer, not averse to the odd yoga flow, and insists everyone should stretch every morning. When he’s not training or writing about health and fitness, he can be found reading doorstop-thick fantasy books with lots of fictional maps in them.
