For many artists, AI is no longer a distant development—it is already shaping how their work is used, shared, and understood.
The question is not simply whether artistic work is protected, but whether artists have visibility over how it is being used and whether that use has been agreed to.
Copyright has long provided artists with an essential framework. It defines ownership, regulates use, and supports the ability to build a sustainable practice. It is not only legal infrastructure, but cultural structure—a way of recognising that artistic labour has boundaries, and that those boundaries matter.
Artificial intelligence does not remove that structure, but it does place it under new pressure.
Across the UK, the European Union, and the United States, governments and regulators are working through how existing copyright frameworks apply to AI systems trained on large-scale datasets. The approaches differ—opt-out models, transparency requirements, and emerging licensing structures—but the direction is broadly the same: adaptation rather than replacement.
What is changing is not the existence of copyright itself, but the conditions in which it operates.
What artists are actually worried about
Alongside policy debate, there is a more immediate reality for artists.
Across studios, galleries, and professional networks, similar questions are being asked: Can my work be reproduced without me? Has my work already been used to train systems without my knowledge? What happens to authorship when images can be generated instantly?
These concerns are widely shared—and for many artists, they come with a mix of uncertainty, frustration, and, at times, anger. The idea that years of work can be absorbed into systems without consent, attribution, or compensation raises difficult questions about fairness and respect for creative labour.
For many, this does not feel like a grey area. It can feel uncomfortably close to appropriation, regardless of whether existing legal frameworks are yet able to define it fully in those terms.
This is not simply a technological development. It is a structural shift in how artistic work is used, circulated, and understood.
At the same time, something else is becoming clearer. As synthetic imagery increases in volume, the role of process, intention, and sustained practice becomes more visible. In that sense, value is not disappearing; it is being clarified.
From copying to systems of synthesis
AI systems are affecting artists in more than one way.
They are trained on vast datasets drawn from existing images and texts, but they are also used directly by individuals who upload artworks, feed them into AI tools, and ask for new images in the style of a named artist. In practice, that means both single works and broader visual languages can be taken, repurposed, and imitated without consent.
For artists, this is where the issue becomes especially acute. It is not only about large-scale systems operating in the background. It is also about the visible, everyday use of AI to copy, mimic, and recirculate artistic work in ways that existing protections struggle to address.
These developments raise serious questions about authorship, control, and the limits of current frameworks.
A global system still taking shape
Regulatory responses across jurisdictions remain in development.
In the UK, discussions have explored frameworks for AI training, including opt-out mechanisms intended to balance innovation with rights protection. In the European Union, regulatory focus has centred on transparency, particularly around training data. In the United States, legal cases are beginning to test how copyright law applies to AI training, but no unified framework has yet emerged.
Across all regions, copyright is being reinterpreted in real time.
A system under pressure
Alongside these developments, there is a growing perception among artists that regulation has not kept pace with technological change.
Governments are balancing competing priorities: supporting innovation, maintaining global competitiveness, and responding to concerns around intellectual property. There is also an awareness that over-regulation may risk disadvantaging domestic technology sectors in an increasingly competitive international landscape.
For many artists, however, this balance does not feel neutral. There is a growing concern that current approaches may place greater weight on the needs of technology development than on the rights of individual creators.
In conversations across the sector, including those within the Visual Artists Association and wider policy discussions, a clear pattern is emerging. Artists are not only concerned about how their work is used—they are increasingly concerned about how it enters these systems in the first place.
Many artists are asking for consent-based models, where artistic work is explicitly licensed or agreed upon, rather than retrospectively restricted.
The issue is not only how work is removed. It is how it is taken in.
Particular concern centres on images used without permission — scraped, shared, or reproduced — and then incorporated into training datasets. This raises a fundamental question for policy: if a work enters a system without consent, what protections meaningfully remain once it is inside it?
At this point, the distinction between opt-out and consent is no longer just technical. It becomes a question of principle.
Comparisons with other regions are helping to shape this discussion. Within the European Union, regulatory approaches have placed greater emphasis on transparency, seeking clearer accountability around the use of creative work. In the UK, the approach has remained more open, with ongoing discussions around flexible frameworks.
This raises an important question for UK policy: whether a more structured, transparent model could offer artists greater clarity and confidence.
Recent consultations led by the UK Intellectual Property Office have explored how copyright might adapt to AI systems, particularly in relation to text and data mining. Proposals such as opt-out models have been presented as a way of balancing innovation and rights. However, many artists and organisations have made clear that opt-out mechanisms place too much responsibility on individual creators, rather than establishing clear conditions of consent from the outset.
Organisations such as DACS continue to play an important role in supporting artists’ rights through licensing and royalties, but the scale and speed of AI introduce conditions that extend beyond existing frameworks.
The contrast is particularly visible when compared with other creative industries. In music, licensing frameworks, royalty systems, and collective management organisations provide established mechanisms for attribution and remuneration. While not without challenges, these systems offer a level of infrastructure that visual artists have historically lacked.
As AI expands, that gap is becoming harder to ignore. This is not simply a legal issue. It is also a question of how consistently creative labour is recognised, respected, and remunerated across the cultural sector.
Where this is already visible
These shifts are already playing out in practice. Artists are encountering AI-generated works that closely resemble their own. Institutions are adapting, redefining how authorship and originality are assessed. Competitions are refining criteria, and professional conversations are increasingly centred on attribution and distinction.
There is also growing discussion around clearer signals of authorship—such as “human-created” identifiers—not as restriction, but as a form of clarity within increasingly hybrid visual environments.
These are not signs of collapse. They are signs that the system is adjusting, even if not yet evenly.
Why this moment is recalibrating value
Context is increasingly part of the work. Clear framing—titles, statements, and presentation—can help maintain authorship and intent as images move further from their original setting.
Process is becoming a visible form of value. Development, iteration, and material engagement all contribute to how practice is understood and recognised.
Intent is more durable than style. What distinguishes practice is not appearance alone, but the conceptual direction and continuity behind it.
A moment of recalibration, not loss
There are historical parallels, but they are not exact. The rise of photography disrupted painting’s traditional role and forced a rethinking of what painting could be. That process was uneven and, at times, deeply challenging for artists working at the time.
A similar period of adjustment now appears to be underway in visual art, although the scale and speed of AI make this moment distinct.
As synthetic production expands, the qualities that define human-made practice—process, intention, material engagement, and provenance—become more visible by contrast.
AI does not erase value. It shifts attention towards what has always made artistic practice meaningful in the first place.
Three considerations for artists working now
It would be misleading to frame this moment as one of collapse.
Copyright remains active. Legal protections remain in place, even as their application is being tested in new ways.
Artificial intelligence introduces uncertainty, but it also brings into sharper focus what has always defined artistic value: intention, process, and recognition.
For many artists, this moment does not feel neutral. It raises legitimate concerns about how work is used, how value is attributed, and how creative labour is respected.
And yet, there are reasons for confidence.
Artistic practice has always adapted in response to change. What defines it—thought, process, and intent—cannot be replicated at scale, and continues to hold its value within an evolving landscape.
As these systems develop, there is increasing attention on how artists’ rights are understood and protected. Conversations around consent, transparency, and fair use are evolving—and increasingly include how artists are credited and remunerated when their work informs or is used within AI systems.
Because while images can now be generated more easily, meaningful artistic work still depends on human decision-making, experience, and perspective.
The continuity of ideas, the development of practice, and the intent behind the work remain distinct—and they remain central to how artistic value is recognised.
The question is not whether artistic value will endure.
It is how it continues to be understood—and how artists are supported, recognised, and fairly remunerated in sustaining it.
Laura O’Hare is co-founder of the VAA, host of its quarterly Legal Hour, and a former solicitor of 15 years. She works at the intersection of artists, institutions, and policymakers on the future of creative rights.