
AI and copyright: House of Lords Committee sets out 'licensing-first' approach

Posted on 9 March 2026

Reading time 7 minutes

In brief 

  • The House of Lords Communications and Digital Committee has published a report on AI, copyright and the creative industries, unequivocally recommending that the Government rule out a commercial text and data mining exception with rightsholder opt-out, and instead adopt a 'licensing-first' approach that provides a framework for both the creative industries and the AI sector to thrive. 
  • The report calls for stronger protections for rightsholders, including introducing sufficiently granular transparency obligations around AI training data (going beyond the high-level summaries required under the EU AI Act), new protections for identity and style imitation, and robust labelling requirements for AI-generated content. 
  • With the Government's own report on copyright and AI due by 18 March 2026, press reports have suggested it plans to consult further before it publishes its final position. The Committee urges the Government to publish its evidence-based final decision within the next 12 months, warning that continued uncertainty is hampering the development of a functioning licensing market.

Just a few weeks before the Government must publish a report on the use of copyright works in the development of AI systems, as required under the Data (Use and Access) Act 2025, the House of Lords Communications and Digital Committee has published its own report on "AI, copyright and the creative industries". The report is unequivocal in its recommendation that the Government should rule out a commercial text and data mining exception with an opt-out model. Noting that, in 2023, the UK's creative industries contributed £124 billion in GVA and employed 2.4 million people, whilst in 2024 the AI sector as a whole contributed £11.8 billion in GVA and employed around 86,000 people, the report concludes that both should be provided with a framework to thrive, incorporating a 'licensing-first ecosystem'.

In the meantime, press reports suggest that the Government will not publish its final position in the forthcoming report (due by 18 March), but will instead consult further on how to proceed. Any further delay will be disappointing for rightsholders, given that the ongoing uncertainty is having a significant impact on the development of a licensing framework. The Committee expresses concern that the Government is still some way from setting out a clear policy direction, and calls upon it to publish an evidence-based final decision on its approach to AI and copyright within the next 12 months.

The Committee's recommendations

The report makes the following recommendations:

  • Pending publication of its proposed policy direction, the Government should rule out a commercial text and data mining exception with an opt-out model (as has been the case in Australia), and provide clear messaging that its expectation is that commercial AI developers operating in the UK should obtain appropriate licences. It should also rule out any reform of copyright legislation that would remove the incentive to license copyright works for AI training, and should instead focus on strengthening licensing, transparency and enforcement under the UK's existing 'gold standard' copyright framework.
     
  • The Committee welcomes the Government's 'reset' from its initial position (that a text and data mining exception with an opt-out option was its preferred model), but is critical of the delay caused by the ongoing consultation, which has itself contributed to uncertainty about the UK's position and prevented a fully formed licensing market from emerging. As the Committee notes, the main uncertainty for large AI developers appears to be whether their current and proposed training practices would withstand legal challenge in any proceedings.
     
  • Protections should be introduced for identity, style imitation (i.e., against 'in the style of' uses) and digital replicas, with particular concern expressed around voice actors being replaced by synthetic voices. This could be achieved through the introduction of a UK 'personality right' (which might sit alongside trade marks and passing off as opposed to copyright). It would require careful consideration to ensure that freedom of expression and other legitimate uses are not adversely affected.
     
  • Sufficiently granular transparency around AI training data should be introduced as a statutory obligation. The report identifies increased transparency as the 'key battleground' in developing an effective licensing-first ecosystem, with wider policy implications. The Committee heard evidence that the approach under the EU AI Act of requiring aggregate high-level summaries is inadequate, as it does not provide sufficient granularity to allow rightsholders to determine if and how their copyright works have been used. Instead, rightsholders want disclosure at the individual work level: what content has been accessed and how; what specific works have been used in AI training and fine-tuning, and for what purpose; and details of how works are stored, processed, and retained in AI models. Further, metadata attached to any works used should be recorded.

    The Committee recognises the challenges for AI developers (for example, around administrative burden and trade secrets) but is persuaded that a feasible solution meeting both sides' needs can be achieved. It suggests, for example, that a useful compromise may be to require commercial AI providers to make more granular confidential disclosures about their training data and methods to an identified regulator; rightsholders could then query via that regulator whether their content has been used for AI training, without requiring public disclosure of the whole training set.
     
  • Issues around territorial scope are identified as a particularly complex issue to resolve. The report notes that large-scale generative AI model training does not typically take place in the UK. The EU's approach is to require compliance, including with transparency obligations, for all models that are placed on the market in the EU, even where the relevant training takes place outside the EU.

    Whilst the Committee heard compelling evidence that any UK transparency regime should, in principle, apply to all models available on the UK market, regardless of where they are trained, it underscores the importance of recognising the territorial nature of copyright. It therefore puts forward a relatively limited proposal: that the Government design transparency requirements so as to minimise incentives for UK-based developers to relocate training abroad, or for model providers to delay release of new models in the UK. It will be particularly interesting to see how the Government addresses this specific aspect in its report, as it is a matter upon which it is required to set out its position.
     
  • Technical tools should be promoted to support a licensing-first approach, including in relation to standards for rights reservation, data provenance, and labelling of AI-generated content. The report recognises the challenges involved with site-level rights reservation frameworks and the multiple issues that have arisen in relation to the EU's opt-out regime. The Secretary of State gave evidence to the Committee that there is currently no workable opt-out proposal on the table – the Committee's firm view is that an opt-out model should be abandoned, and the starting point should be a 'licensing-first' approach. It notes as encouraging that industry is bringing forward new technical solutions around robust provenance, consent and licensing signals.

    The Committee also considers that robust, visible labelling of AI-generated content must form a core part of the Government's approach to AI and copyright, and recommends that it consider bringing forward legislation to place labelling duties on AI developers, service providers and online platforms.
     
  • Whilst a market for licensing content for AI use is emerging, more can be done to create the necessary conditions for a fair and inclusive licensing market. A number of licensing arrangements are being entered into between rightsholders and AI companies, though these have tended to be between US firms and between large individual developers and publishers, raising concerns about scalability. The Committee also notes the key role that collective management organisations (CMOs) can play in establishing an accessible UK licensing market, given the existing strong CMO infrastructure, as well as the concern that individual creators should receive their fair share of value generated by AI uses. The report therefore recommends that the Government consider options such as equitable remuneration for AI uses of works and performances, and that it support creator-first remuneration models.
     
  • The Government should prioritise development and adoption of sovereign AI models, in the form of a strategy to build responsibly trained domestic AI capability and reduce reliance on imported models (pointing to the Swiss AI Initiative's Apertus model as an example).