Art vs AI: artists’ uprising takes on the bots

Artificial intelligence (AI) performs impressively, but much of it is built on human work taken without payment. The government thinks this is fine. Copyright holders beg to differ

Paul McCartney performs "Medley" (Image credit: Todd Owyoung/NBC via Getty Images)

Generative artificial intelligence (AI) is already capable of impressive feats, but it requires a constant diet of human-generated content to grow its capabilities. Much of that content has been scraped from the web without the consent (or indeed knowledge) of its original creators.

Developers typically argue that “fair use” exemptions (which allow limited use of copyrighted material, typically excerpts, under specific conditions) should apply to what they do. Publishers, music companies and authors bitterly disagree – while legal systems and governments are racing to catch up. For the UK government, the dilemma is that it desperately wants this country to attract AI companies to scale up and drive economic growth. But it also needs to protect Britain’s world-class and tax-generating creative industries.

Nothing is certain yet, but a consultation on its draft proposals closed on 25 February, amid a chorus of disapproval from many of the UK’s leading creative artists. Until now, the UK has had one of the strongest copyright regimes in the world.

Essentially, the government is proposing exemptions from existing copyright law for AI web-crawlers: the onus would fall on rights holders to opt out of having their content taken free of charge, and to trace how it is being used. That would be a radical change, and it has whole industries in the creative sector worried.

Would that break international law?

It’s not clear. Some lawyers argue that the government’s proposals would very likely breach an international treaty to which the UK is a party, namely the Berne Convention. However, Peter Kyle, the technology minister overseeing the planned legislation, insists it will meet all international obligations. According to Kyle, the government “won’t legislate until the tech companies can prove that the technology can deliver the transparency that they have said that they can, and that we will find ways for the creative arts industry to make money in the digital age”.

Artists are unconvinced. Last month more than 1,000 artists, including Kate Bush, Annie Lennox and Cat Stevens, backed the release of a silent album, titled Is This What We Want?, containing nothing but background noise.

Separately, in a letter to The Times, three musical knights of the realm – Andrew Lloyd Webber, Elton John and Paul McCartney – warned against the proposals, arguing that the current copyright system “is one of the main reasons why rights holders work in Britain”.

What’s their case?

The creative artists say the government’s proposals will “smash a hole in the moral right of creators to present their work” and jeopardise a £126 billion industry that employs 2.4 million people in the UK.

They say that Britain’s creative industries “want to play their part in the AI revolution”, but they need to do so from a firm intellectual-property base. If not, Britain will lose out on its best growth opportunity. There is “no moral or economic argument for stealing our copyright”, the artists say. “Taking it away will devastate the industry and steal the future of the next generation.” In the US, the Authors Guild and 17 individual authors, including Jodi Picoult and Jonathan Franzen, are suing OpenAI and Microsoft for copyright infringement, alleging “systematic theft on a mass scale”.

What are other states doing?

In the US, the Trump administration has repealed Biden-era AI regulation, adopting a lighter-touch approach to foster growth and innovation. Indeed, part of what is driving the British government’s line is its wish to position the UK as an AI-friendly powerhouse, aligned with the US on tech policy.

In the EU, an Artificial Intelligence Act passed last year obliges tech firms to comply with the EU’s strict 2019 copyright law. That law includes an exemption for “text and data mining”, but it was framed before the emergence of mass-market generative AI, and was not intended to let the world’s largest companies harvest vast amounts of intellectual property. As a result, there are legal disputes between creators, publishers and AI firms in many European countries. Everywhere, the lack of regulatory clarity – and the obvious potential for commercial disputes in a fast-moving sector – means that courts are being kept busy. In the US, several high-profile cases filed by publishers against AI companies are working their way through the legal system. In the UK, Getty Images is pursuing a closely watched case against the AI image-generation company Stability AI.

How will all this play out?

“These disputes are a classic example of what happens when new technologies outpace laws written for an earlier era,” says John Thornhill in the Financial Times. But the overriding principle – that no one should profit from another’s intellectual property without consent – should remain “inviolable”.

Legislation will be part of this emerging landscape, but so will market-based solutions and innovations aimed at facilitating licensing deals that compensate content creators for letting AI companies scrape their property. Already, high-profile publishers have struck ad hoc deals with AI companies. Axel Springer, News Corp and the FT have signed agreements with OpenAI, while Agence France-Presse (AFP) has partnered with Mistral.

Meanwhile, several start-ups are experimenting with new economic models. Human Native, for example, is creating a “two-sided marketplace allowing AI creators to license data from content creators”. TollBit enables AI bots and data scrapers to pay websites directly for their content. And ProRata is developing an “answer engine” that would pay a share of an AI company’s revenue to content creators whenever their work appeared in its responses. Nascent market mechanisms are developing that “could enable mutually beneficial solutions”.


This article was first published in MoneyWeek's magazine.

Simon Wilson’s first career was in book publishing, as an economics editor at Routledge, and as a publisher of non-fiction at Random House, specialising in popular business and management books. While there, he published Customers.com, a bestselling classic of the early days of e-commerce, and The Money or Your Life: Reuniting Work and Joy, an inspirational book that helped steer its publisher towards a post-corporate, portfolio life.

Since 2001, he has been a writer for MoneyWeek, a financial copywriter, and a long-time contributing editor at The Week. Simon also works as an actor and corporate trainer; current and past clients include investment banks, the Bank of England, the UK government, several Magic Circle law firms and all of the Big Four accountancy firms. He has a degree in languages (German and Spanish) and social and political sciences from the University of Cambridge.