The House of Lords Communications and Digital Committee has urged the UK government to give serious consideration to the impact of artificial intelligence (AI) on original content.
The Communications and Digital Committee’s AI, copyright and the creative industries – 4th report of session 2024-26, published on 6 March 2026, warned that it would be a very “poor bet” for the government to allow changes to copyright that could undermine the UK’s creative industries.
The report’s authors questioned tech industry claims that introducing a new commercial text and data mining (TDM) exception for AI training would significantly expand the AI sector. They warned that weakening the UK’s copyright law in this way would exacerbate existing harms to rightsholders and stall the emerging licensing market.
Some of the witnesses who gave evidence told the committee that technical measures are essential in supporting attribution, transparency, licensing and remuneration in a licensing-first system.
John Collomosse, director of DECaDE, the UKRI Next Stage Centre for the Decentralised Digital Economy, told the committee that, at each stage of the AI lifecycle, the challenge facing rightsholders is how to meaningfully control whether their work is used and under what terms, and how they can gain an opportunity to share in the value created by the model outputs.
Another witness, Eugene Huang, senior strategy advisor and co-founder of ProRata.ai, said there should be mechanisms in place that can support remuneration linked to both inputs and outputs of AI models.
One proposed way to protect copyrighted content is open provenance standards, such as the Coalition for Content Provenance and Authenticity (C2PA) specification, which, combined with watermarking and fingerprinting, would attach durable, machine-readable signals to individual assets.
The committee was told that this would need to be created “at source” by creators and rightsholders to enable stakeholders along sector-specific supply chains to understand this data and manage it effectively.
Collomosse said that once provenance is established in this way, it becomes easier to confirm authenticity and “assign ownership”. He said rightsholders could then “set up their own preferences and licensing schemes on top of that using other standards”, enabling remuneration models to be layered onto individual assets rather than relying solely on blanket licensing.
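The layering Collomosse describes can be sketched in simplified form: a content fingerprint binds a machine-readable record to an asset, and licensing preferences sit on top of that record. The snippet below is an illustrative assumption only, not the actual C2PA format (real C2PA manifests are cryptographically signed structures embedded in the asset file), and the field names and licence terms are hypothetical.

```python
import hashlib


def fingerprint(asset_bytes: bytes) -> str:
    """A content fingerprint: the same bytes always yield the same hash."""
    return hashlib.sha256(asset_bytes).hexdigest()


def make_provenance_record(asset_bytes: bytes, creator: str, license_terms: dict) -> dict:
    """Attach a machine-readable provenance and licensing record to an asset.

    Illustrative only -- a real C2PA manifest is a signed structure embedded
    in the asset, not a loose dictionary like this.
    """
    return {
        "asset_sha256": fingerprint(asset_bytes),
        "creator": creator,
        # The creator's own preferences layered on top of provenance,
        # e.g. whether AI training requires a licence.
        "license": license_terms,
    }


def record_matches(asset_bytes: bytes, record: dict) -> bool:
    """Check that a record still describes the asset it claims to describe."""
    return fingerprint(asset_bytes) == record["asset_sha256"]


photo = b"...image bytes..."
record = make_provenance_record(
    photo,
    creator="Example Photographer",
    license_terms={"ai_training": "licence-required"},
)
print(record_matches(photo, record))        # matches the original asset
print(record_matches(photo + b"x", record)) # fails once the asset is altered
```

Because the record is keyed to the asset's fingerprint rather than to a filename, it can travel with the asset through a supply chain, which is what allows per-asset remuneration preferences rather than blanket licensing alone.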
The committee is calling on the government to develop a licensing-first regime, underpinned by robust transparency, that safeguards creators’ livelihoods while supporting sustainable AI growth.
Committee chair Barbara Keeley warned: “Our creative industries face a clear and present danger from uncredited and unremunerated use of copyrighted material to train AI models. Photographers, musicians, authors and publishers are seeing their work fed into AI models, which then produce imitations that take employment and earning opportunities from the original creators.”
The tech industry's responses suggested there is little appetite for such controls.
In the report, Google argued that, while governments may be tempted to promote a “one-size-fits-all” standard, it is important that industry retains “flexibility and room for the evolution of standards and best practices”.
Meta said prescribing specific approaches “would risk freezing in time the current state of development while new standards and new tools are still emerging, locking the UK out as the rest of the world continues to innovate”.
Microsoft said any legislative approach should recognise that provenance tools do not work equally well for all types of content, remain “technology agnostic”, and avoid mandating requirements in areas “where they are not yet feasible or effective”.
Keeley noted that while AI may contribute to future economic growth, the UK creative industries create jobs and economic value now. “Watering down the protections in our existing copyright regime to lure the biggest US tech companies is a race to the bottom that does not serve UK interests. We should not sacrifice our creative industries for AI jam tomorrow,” she said.
“The government should now make clear it will not pursue a new text and data mining exception with an opt-out mechanism for training commercial AI models. Instead, it should focus on strengthening UK protections for creators, including against unauthorised digital replicas and ‘in the style of’ uses of creators’ work and identity,” Keeley added.
“The government’s task should be to create the conditions that will allow a licensing-first approach to AI training to flourish, backed by effective transparency requirements and technical standards for data provenance and labelling, so that rightsholders and developers can participate confidently in this emerging market,” she continued.
Keeley called for the relationship between the government and the UK AI industry to be based on transparent and responsible use of training data. “We are calling on the government to embrace the opportunities this presents, and to demonstrate its commitment to the UK’s gold-standard copyright regime and our outstanding creative industries in its forthcoming economic assessment and update on AI and copyright,” she said.