Nearly 10,000 authors have released an "empty" book to protest AI companies using their work without permission.
This is one of the biggest joint actions by the creative community against Big Tech since the rise of generative AI.
The book, titled "Don't Steal This Book," contains no text other than a list of the authors' names. About 1,000 free copies are being handed out at the London Book Fair, which runs through Wednesday at London Olympia in Hammersmith.
The book comes out just a week before the U.K. government is set to release a report on the economic impact of proposed copyright law changes.
Some of the authors who signed include Nobel Prize winner Kazuo Ishiguro, "Thursday Murder Club" author Richard Osman, historical novelist Philippa Gregory, "Slow Horses" writer Mick Herron, "Noughts and Crosses" author Malorie Blackman, and bestselling Irish writer Marian Keyes.
Ed Newton-Rex, a composer and advocate for artists' copyright who led the campaign, said the AI industry was "built on stolen work, taken without permission or payment." He also said, "This is not a victimless crime. Generative AI competes with the people whose work it is trained on, robbing them of their livelihoods. The government must protect the U.K.'s creatives and refuse to legalize the theft of creative work by AI companies."
The book's back cover carries a direct message for lawmakers: "The U.K. government must not legalize book theft to benefit AI companies."
The protest is aimed at a government proposal that would allow AI companies to use copyrighted works without the owner's permission, unless the owner actively opts out. Critics say this opt-out system violates the fundamentals of copyright law, is difficult to enforce, and places too much responsibility on individual creators.
The proposal drew further criticism after a public consultation last February, in which only 3% of respondents supported the opt-out approach. The government later said this was no longer its "preferred option," but campaigners warn that ministers are now considering a "commercial research exception," which would allow AI companies to train commercial models on authors' work without consent or payment.
The Financial Times reported that a person familiar with the government's discussions said a final decision would be "kicked down the road," suggesting that pressure from the creative sector is making a difference.
Blackman said the authors' position is simple. "It is not in any way unreasonable to expect AI companies to pay for the use of authors' books," she said.
The campaign highlights broader frustration across the global creative industries over how AI developers obtain their training data. Last year, Anthropic, the company behind the Claude AI chatbot and a major player in the field, agreed to pay $1.5 billion to settle a class-action lawsuit in which book authors alleged the company had used pirated copies of their works to train its flagship product.
Publishers are also responding through official channels. Publishers' Licensing Services, a nonprofit, is launching a collective licensing program at the London Book Fair. This program will allow AI developers to obtain legal, paid access to published works.
A report from the Publishers Association, released with the initiative, found that the AI licensing market already exists and is growing. This challenges the tech industry's claim that copyright exceptions are needed for innovation.
A government spokesperson said, "The government wants a copyright regime that values and protects human creativity, can be trusted, and unlocks innovation. We will continue to engage closely with the creative sector on this issue, and we will meet our commitment to update Parliament by March 18."