The heat is being turned up on AI. More than 1,000 musicians, including Kate Bush, Damon Albarn and Annie Lennox, have released a silent album – yes, you read that right – protesting against plans to let artificial intelligence companies use their copyright-protected music without permission.
The recordings of empty studios and venues that make up Is This What We Want? come as celebrities warn of the dangers of allowing AI to take their work for nothing. Paul McCartney, Elton John, Björn Ulvaeus from Abba, authors Richard Osman and Val McDermid, and actress Julianne Moore are among those calling for greater protection against AI model developers scraping their material.
In the UK, things came to a head after the government opened a consultation on its forthcoming AI Bill. The window for responses closed on Tuesday night.
Currently, the proposal is to permit AI companies to train their algorithms for no payment. There is a plan for an exemption, enabling creatives, individuals and companies to opt out of the trawling, but they must police it themselves, and that is time-consuming and onerous. The creative industries, which include music, theatre, film, design, architecture, publishing, writing, advertising and journalism, are pressing for AI developers to automatically cough up.
AI’s generative models – the large language models behind the likes of ChatGPT for words, and systems such as Stable Diffusion for images and Suno for music – harness vast amounts of data from the internet. Their software is programmed to spot patterns in that information, enabling it to predict the next word in a sentence, produce realistic images or generate original-sounding audio.
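For readers curious what “spotting patterns to predict the next word” means in practice, here is a toy sketch. It uses a simple bigram frequency count – a drastic simplification of the neural networks real systems train on billions of documents – and the corpus and function names are invented for illustration:

```python
# Toy next-word prediction: count which word follows which in a tiny
# sample corpus, then predict the most frequent follower. Real large
# language models learn far richer patterns, but the core task is the same.
from collections import Counter, defaultdict

corpus = "the cat sat on the mat the cat ate the fish".split()

# For each word, tally the words that follow it.
follows = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    follows[current][nxt] += 1

def predict_next(word):
    """Return the most frequent follower of `word`, or None if unseen."""
    counts = follows[word]
    return counts.most_common(1)[0][0] if counts else None

print(predict_next("the"))  # "cat" — it follows "the" most often here
```

The model can only echo the material it was fed, which is precisely why the supply of human-made content matters so much to the argument below.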
Use of this content – from books, music albums, newspapers and magazines, photographs, pictures, drawings, designs and other work protected by copyright – has led to a surge of lawsuits from authors, book and news publishers, music companies and artists.

Newspapers, via the News Media Association, are presenting a united front, launching the Make It Fair campaign to highlight how their expensively obtained and carefully crafted content is at risk of being exploited because of a suggested weakening of the copyright laws. They maintain the tech giants must be forbidden from simply using their words and pictures to feed the building of AI.
This week I was part of a panel at a packed London Press Club event to discuss the threat posed to journalism by unbridled AI development. Facing me across the room in the grand, historic Stationers’ Hall near St Paul’s Cathedral was a glorious stained-glass window depicting William Caxton proudly presenting his nascent printing press to King Edward IV.
From Caxton onwards, publishing and technology have moved together, hand in hand, feeding off and benefiting each other. That remains the case today. AI is a valuable tool, astonishingly quick and reasonably reliable. Indeed, a show of hands indicated that many of the journalists in the hall used AI as a matter of course in their daily work. I can see why. In that sense I’m glass half-full: I’m not opposed to AI per se, and I see its professional uses.
Some media companies, rather than oppose it, have signed licensing agreements with AI firms to receive compensation for their journalism.
They are the minority. But being permitted to take content without reimbursement also turns AI into a weapon, and that makes me glass half-empty. The content being harvested by AI has been produced by trained, experienced professionals who know the difference between truth and a lie, who can spot a story of public interest, who report on wars and disasters and put their own lives on the line, who provide expert analysis, who interview and pursue those in power, in public life and business, and hold them to account.
Journalism is under enormous financial pressure. Jobs are disappearing, and sometimes with them entire publications and platforms. Local newspapers in particular are vanishing, in print and digitally. Every day they reported details of court cases, local authority goings-on, planning applications, and events that affect people’s lives. Now they’ve gone, people have nowhere to turn for reliable, trusted information. They simply do not know what is occurring in their neighbourhoods.
The threat is much broader than jobs and titles. All told, the creative industries generate more than £120 billion a year for the UK alone. We allow them to be exploited and lost at our collective economic peril.
It’s much wider than that, however. I am old enough to have become a journalist when its exponents used typewriters. I was the first on my first trade magazine never to use one: I could deploy Alan Sugar’s just-launched Amstrad word processor. I remember, too, what it was like to be a journalist without a mobile phone. I can recall when Google became a thing.
It’s easy to say that AI is just another step along that path, that it’s merely continuing this evolution. It’s not. Unlike those earlier advances, AI threatens wholesale job cuts and closures, at the national and local level. Sure, some posts went because of previous advances, but they were nothing like the numbers we will see in the future if no monetary checks are imposed, if AI continues on its relentless, profiteering march.
Into that vacuum will step other voices that care little for accuracy and, worse, peddle untruth and distortion. Society cannot allow that to happen.
There could be another ultimate loser, which is AI itself. For it to function successfully, AI must be constantly fed; it relies on a non-stop diet of existing material. Then it can collate, digest and dissect, working its technical wizardry at incredible speed to generate usable answers and solutions. But without that supply, it withers and becomes nothing.
In the longer term, if it’s not careful, AI might well be the monster that eats itself. Smarter, more enlightened, possibly less greedy AI exponents should realise this. If they pay for content, just as they pay their software technicians, developers and processor manufacturers, then AI and all those who create content for a living can prosper.