Content creators wary of UK's AI copyright consultation
Balancing the interests of content creators and AI firms is a tricky business


Content creators are concerned about the use of their materials for AI training, following the launch of a government consultation that some say appears to favor AI firms.
The government says the consultation aims to improve legal clarity for the creative and AI sectors over how copyright protected materials are used in model training.
The proposals, lawmakers said, are intended to be fair: giving creators greater control over how their material is used by AI developers and enhancing their ability to be paid for its use, while giving AI developers wide access to material for training their models.
Similarly, the plans aim to provide greater transparency from AI firms over the data used to train AI models, as well as how AI-generated content is labeled.
"We are setting out a balanced package of proposals to address uncertainty about how copyright law applies to AI so we can drive continued growth in the AI sector and creative industries, which will help deliver on our mission of the highest sustained growth in the G7 as part of our Plan for Change," said Secretary of State for Science, Innovation and Technology Peter Kyle.
"This is all about partnership: balancing strong protections for creators while removing barriers to AI innovation; and working together across government and industry sectors to deliver this."
But balancing the needs of creators and AI firms is a tricky tightrope to walk, and many industry observers and creators are skeptical.
"The outcome will likely depend on the details, and whether ministers can get the balance right," said Matthew Sinclair, senior director of the Computer & Communications Industry Association (CCIA).
"In particular, it will be critical to ensure that the transparency requirements are realistic and do not ask AI developers to compromise their work by giving away trade secrets and highly sensitive information that could jeopardize the safety and security of their models."
Given long-running issues relating to the use of copyrighted materials in AI training, creators are concerned. Earlier this week, a series of publishers and author groups launched the Creative Rights In AI Coalition, calling for royalties to be paid to the creators of text, audio, or video used to train AI models.
"We are eager to see the development of a vibrant licensing market and support the sectors which rely on us for their future prosperity, but we can only do so with a robust copyright framework which preserves our exclusive rights to control our works and thereby act as a safeguard against misuse," the coalition said.
Amanda Brock, CEO at OpenUK, points out that the 10-week consultation and implementation period kicks any decision well into next year. Other countries, she added, have had clarity on fair-use exceptions for AI training for some time, and seen growth in their AI industries as a result.
Brock further warned that balancing the interests of creators and the AI industry is easier said than done.
"Any decision to require royalty or license payment for data usage as an outcome of the consultation inevitably disadvantages UK AI innovation," she said.
"The consultation outcome is predictable and the facts don’t change. There is direct conflict between the interests of these competing sectors."
Brock noted that licensing or royalty models are never going to be a long-term answer and merely represent a stopgap solution.
"In the same way as the linking agreements to connect web sites used in the early 2000s were soon made redundant by technology, content licensing is always going to be inhibitive of innovation in an AI world," she said.
"At best it will enable a few large companies with the wherewithal to enter licensing agreements to work with it."
Emma Woollacott is a freelance journalist writing for publications including the BBC, Private Eye, Forbes, Raconteur and specialist technology titles.