'Copilot is for entertainment purposes only': Even Microsoft's official terms and conditions say you really shouldn't be using its AI at work

Copilot keyboard button
(Image credit: Microsoft)

  • Microsoft has clarified some of the terms and conditions associated with Copilot
  • Responsibilities have been shifted onto the users for the AI tool
  • Despite being for "entertainment purposes," it's still heavily marketed toward workers

In a major twist, Microsoft has reaffirmed that Copilot is for "entertainment purposes only" and that, if used for work, it should serve as the first of multiple stages of fact-checking rather than being relied upon outright.

"It can make mistakes, and it may not work as intended," the company wrote. "Don’t rely on Copilot for important advice. Use Copilot at your own risk."

Though the company very much wants businesses and employees to keep using Copilot for work, there's a clear shift of responsibility onto the user here, shielding Microsoft from accusations of providing false information.

Microsoft says "use Copilot at your own risk"

In a roundabout way, Microsoft is effectively admitting to the risk of AI hallucination amid ongoing concerns about copyrighted content, IP ambiguity and output legitimacy.

With this in mind, the company clearly wants us to think of Copilot as a tool, not a decision-maker, and for users to independently fact-check outputs and be cautious with any sensitive, protected data.

"You agree to indemnify us and hold us harmless... from and against any claims, losses, and expenses... arising from or relating to your use of Copilot," Microsoft added in another paragraph.

More broadly, the company also notes that prompts and responses may be used to improve Copilot, though enterprise versions carry additional protections to safeguard sensitive information. In other words, users retain the rights to their inputs, but Microsoft still reserves the right to use that data to improve the service.

However, while Microsoft's efforts to push some responsibility onto users' shoulders have hit the spotlight, it's not the only company with such terms. OpenAI, Google, and Anthropic all include similar advisories in their terms, covering user responsibility and offering no guarantee of accuracy.

The shift in responsibility from AI vendor to user is an ongoing change that companies are asserting while the industry works out what the legal risks could be. But with Microsoft still selling Copilot tools to business users and consumers alike, this looks more like a rewording exercise than a real shift in behavior.



With several years’ experience freelancing in tech and automotive circles, Craig’s specific interests lie in technology that is designed to better our lives, including AI and ML, productivity aids, and smart fitness. He is also passionate about cars and the decarbonisation of personal transportation. As an avid bargain-hunter, you can be sure that any deal Craig finds is top value!
