'That language is no longer reflective of how Copilot is used today': Microsoft says Copilot isn't just for 'entertainment purposes only'
Sharp-eyed users spotted Microsoft describing its Copilot AI as "for entertainment purposes only"
Microsoft has confirmed that terms describing Copilot as for "entertainment purposes only" need to be updated.
The terms of use for Copilot have raised eyebrows, sparking bemusement online and coming amid wider concerns that the use of AI in businesses is leading to "workslop" that employees spend half their day fixing.
The terms state: "Copilot is for entertainment purposes only. It can make mistakes, and it may not work as intended. Don’t rely on Copilot for important advice. Use Copilot at your own risk."
The document was last updated in October, when the terms were "rewritten and reorganized."
Previous versions of the terms, stretching back to 2023, didn't include that exact line, but did note: "The Online Services are for entertainment purposes; the Online Services are not error-free, may not work as expected and may generate incorrect information."
Microsoft confirms T&Cs changes
Though the language isn't entirely new, it was spotted by users and posted to social media at a time when Microsoft is facing backlash over how heavily it's embedding Copilot into Windows.
Last month Microsoft announced plans to dial back some AI features in Windows amid a growing backlash against Copilot, and in January revealed plans to allow admins to remove the Copilot app completely if it wasn't being used.
The tech giant has also recorded its worst quarter since the 2008 financial crisis, with one analyst suggesting the firm is "in a pickle" over issues with Copilot, particularly adoption rates.
Microsoft said it would update the disclaimer as the terms are now out of date.
"The ‘entertainment purposes’ phrasing is legacy language from when Copilot originally launched as a search companion service in Bing," a Microsoft spokesperson told PCMag.
"As the product has evolved, that language is no longer reflective of how Copilot is used today and will be altered with our next update."
ITPro approached Microsoft for comment but did not receive a response by the time of publication.
AI warnings
While reports noted social media users laughing that Microsoft's language echoes the legalese used by psychics and ghost hunters, such disclaimers are also common among AI developers — though they warn about potential risks rather than declaring their products "entertainment".
The terms for xAI state that its service "may contain errors, defects, bugs or inaccuracies that could fail or cause corruption or loss of data and information".
These particular terms request users agree not to "rely on output as the sole source of truth or factual information, or as professional advice."
OpenAI features the same language, also warning the service may not be "error free", while Anthropic warns that Claude outputs "may not always be accurate" and "actions may not be error free".
Microsoft also adds in its terms: "always use your judgment and check the information you get from Copilot before you make decisions or act."
Freelance journalist Nicole Kobie first started writing for ITPro in 2007, with bylines in New Scientist, Wired, PC Pro and many more.
Nicole is the author of a book about the history of technology, The Long History of the Future.
