'That language is no longer reflective of how Copilot is used today': Microsoft says Copilot isn't just for 'entertainment purposes only'

Sharp-eyed users spotted Microsoft describing its Copilot AI as "for entertainment purposes only"

Microsoft Copilot logo and branding pictured on a smartphone screen, with smartphone placed on top of a laptop keyboard.
(Image credit: Getty Images)

Microsoft has confirmed that terms describing Copilot as for “entertainment purposes only" need to be updated.

The terms of use for Copilot have raised eyebrows, sparking bemusement online and coming amid wider concerns that the use of AI in businesses is leading to "workslop" that employees spend half their day fixing.

The terms state: "Copilot is for entertainment purposes only. It can make mistakes, and it may not work as intended. Don’t rely on Copilot for important advice. Use Copilot at your own risk."

The document was last updated in October, when the terms were "rewritten and reorganized."

Previous versions of the terms stretching back to 2023 didn't include that exact line, but did note: "The Online Services are for entertainment purposes; the Online Services are not error-free, may not work as expected and may generate incorrect information."

Microsoft confirms T&Cs changes

Though the language isn't entirely new, it was spotted by users and posted to social media at a time when Microsoft is facing backlash over how heavily it's embedding Copilot into Windows.

Last month Microsoft announced plans to dial back some AI features in Windows amid a growing backlash against Copilot, and in January revealed plans to allow admins to remove the Copilot app completely if it wasn't being used.

The tech giant has also recorded its worst quarter since the 2008 financial crisis, with one analyst suggesting the firm is "in a pickle" over issues with Copilot, particularly adoption rates.

Microsoft said it would update the disclaimer as the terms are now out of date.

"The ‘entertainment purposes’ phrasing is legacy language from when Copilot originally launched as a search companion service in Bing," a Microsoft spokesperson told PCMag.

"As the product has evolved, that language is no longer reflective of how Copilot is used today and will be altered with our next update."

ITPro approached Microsoft for comment, but did not receive a response by time of publication.

AI warnings

While reports noted social media users laughing that Microsoft's language echoes the legalese used by psychics and ghost hunters, such disclaimers are also common among AI developers — though they warn about potential risks rather than declaring their products "entertainment".

The terms for xAI state that its service "may contain errors, defects, bugs or inaccuracies that could fail or cause corruption or loss of data and information”.

Those terms also ask users to agree not to "rely on output as the sole source of truth or factual information, or as professional advice."

OpenAI uses similar language, also warning that its service may not be "error free", while Anthropic warns that Claude outputs "may not always be accurate" and "actions may not be error free".

Microsoft also adds in its terms: "always use your judgment and check the information you get from Copilot before you make decisions or act."


Freelance journalist Nicole Kobie first started writing for ITPro in 2007, with bylines in New Scientist, Wired, PC Pro and many more.

Nicole is the author of a book about the history of technology, The Long History of the Future.