The Copyright and AI consultation – how to respond as an illustrator
The UK Government has launched a consultation on ‘Copyright and Artificial Intelligence’ that proposes significant changes to existing copyright law. With the 25th February deadline approaching, the AOI urges all illustrators to respond to the consultation directly and share their views. This isn’t just another policy paper – it’s a pivotal moment that could reshape how creators protect and control their work.

The Copyright and Artificial Intelligence consultation proposes an exception to your copyright that would allow commercial text and data mining – the ‘scraping’ of your work from the internet by AI developers to train generative AI – without payment. There are four options in the consultation, but the Government have declared a preferred option (option 3), under which they are offering a ‘rights reservation’ (also known as an ‘opt out’) to creatives whose work could be scraped, which if activated would – in theory – stop developers from using your artwork. However, there has been no proposal for how a technical opt out solution would work, and even the EU have yet to produce one after adopting an opt out position several years ago.
To make sure the government understand the impact on you as a creator, please write in your own words, and share your personal experiences, insights and opinions.
This can cover as many questions as you wish to – there is certainly no need to complete the entire consultation.
We have highlighted the main points in the consultation and suggest you respond to these questions.
The consultation closes on Tuesday 25th February, 2025. You can access the consultation document here and submit your responses via Citizen Space here. Alternatively, you can send your responses by email to [email protected].
While the AOI will be submitting a comprehensive consultation response that represents members and the wider illustration community – including crucial data from our recent AI survey, which you’re welcome to reference – it’s essential for individual creators to make their voices heard too. Your personal stories and experiences will illustrate to the government exactly how these changes could impact real livelihoods and careers. Your views matter, so please consider taking the time to share your perspective on this important consultation.
Options
Consultation Section B, paragraphs 67-73
Question 4: Do you agree that option 3 – a data mining exception which allows right holders to reserve their rights, supported by transparency measures – is most likely to meet the objectives set out above?
Answer: No
Option 3 (the government’s preferred option) would allow AI companies to scrape and use your artworks to generate potentially competing AI-generated works, without permission or a licence – directly threatening your career and livelihood.
For a response, in your own words, you could highlight that an ‘opt-out’ (or ‘rights reservation’, as the consultation terms it) places an unfair burden on creators. As a creator, you’d first need to become aware that opting out is necessary, then navigate a complex and unclear process to prevent your work from being used. Most freelancers lack the time, resources, or expertise to track and protect their works effectively.
Question 5: Which option do you prefer and why?
Answer: Option 1: Strengthen copyright requiring licensing in all cases
AI developers are claiming there is ‘uncertainty’ around copyright in relation to scraping and training AI on that scraped work. There is no uncertainty around copyright – AI developers need to license and pay for any use of artworks in AI training.
A main point in your response could focus on how generative AI trained on your illustrations would produce synthetic images that then compete with your work in the illustration marketplace.
The strengthening of copyright protection is essential to ensure that generative AI does not devalue or replace human creativity. Copyright exists to protect your work, and the solution is to require AI developers to seek permission via a licensing system to use illustrations in AI training. This would allow you, if you wished, to negotiate licensing deals and be paid appropriately.
Government’s Proposed Approach – Exception with Rights Reservation
Question 6: Do you support the introduction of an exception along the lines outlined above?
Answer: No
You can say in your own words how a copyright exception with an opt out would not work for illustrators, and that AI developers need to obtain explicit permission from illustrators before they use images that are in copyright.
You could comment that for an AI system to work in an ethical and legal way, it is essential that it operates with transparency. Currently, opting works out via Have I Been Trained (haveibeentrained.com), for example, offers no verification from the generative AI platforms themselves that the opt out has been respected.
Question 8: What other approach do you propose and how would that achieve the intended balance of objectives?
Answer: Copyright law and licensing are already effective.
You could say in your own words, referring to your own work, how your income relies on control of your artwork and on being able to license it, and how transparency in licensing is important so that you are aware of all relevant details.
You could mention that as generative AI competes with the artwork it is trained on, any proposal that allows this without licensing would not be acceptable to you.
Question 9: What influence, positive or negative, would the introduction of an exception along these lines have on you or your organisation? Please provide quantitative information where possible.
In your own words, you can say what might happen if this proposal goes ahead: that you would find an opt out system a huge burden as a freelancer (and that you may not trust AI developers to respect the opt out), the time and investment it takes to create your illustrations, and your concerns about competing with AI-generated works in the marketplace and the reduced demand that would follow.
You may wish to comment on a requirement for retrospective compensation for work that has already been scraped and used in image datasets such as LAION-5B.
You can note that you are aware that there is no way to remove your images from an existing dataset which includes previous scrapes of the internet.
If you try out Have I Been Trained and discover your work is included there, you can talk about the number of works you have found and how your work is already being used for training without compensation.
Transparency
Consultation Section C.4, paragraphs 103-108
Question 22: Do you agree that AI developers should disclose the sources of their training material?
Answer: Yes
You may want to mention that, if a developer has sourced data/images legally, there should be no valid reason for not revealing what their AI has been trained on.
If a developer has not disclosed their training data, you don’t know if your images have been used in a dataset without your permission, and so it would not be possible to enforce your rights. You can emphasise that transparency is essential for enforcement of copyright, fair licensing, accountability, and trust between AI companies and creators.
AI Output Labelling
Consultation Section D.5, paragraphs 163-168
Question 45: Do you agree that generative AI outputs should be labelled as AI generated? If so, what is a proportionate approach, and is regulation required?
Answer: Yes
All outputs from a generative AI must be labelled to ensure that there is transparency over what is AI generated and what is created by a human. Labelling must not be voluntary.
In your own words, you can explain how clear, consistent, mandatory labelling would protect your work by preventing misattribution, unfair competition and unauthorised use.
You can mention that, as most generative AI developers have so far been secretive about the datasets they use, you believe it should be a legal requirement that labelling is added to AI output images. This might be the only way you could trust that it is actually being done.