LONDON — When the European Parliament passed sweeping new laws governing the use of artificial intelligence (AI) last March, the “world first” legislation was hailed as an important victory by music executives and rights holders. Just over one year later — and with less than three months until the European Union’s Artificial Intelligence Act is due to come fully into force — those same execs say they now have “serious concerns” about how the laws are being implemented amid a fierce lobbying battle between creator groups and big tech.
“[Tech companies] are really aggressively lobbying the [European] Commission and the [European] Council to try and water down these provisions wherever they can,” John Phelan, director general of international music publishing trade association ICMP, tells Billboard. “The EU is at a junction and what we’re trying to do is try to push as many people [as possible] in the direction of: ‘The law is the law’. The copyright standards in there are high. Do not be afraid to robustly defend what you’ve got in the AI Act.”
One current source of tension between creator groups, tech lobbyists and policymakers is the generative AI “Code of Practice” being developed by the EU’s newly formed AI Office in consultation with almost 1,000 stakeholders, including music trade groups, tech companies, academics and independent experts. The code, which is currently on its third draft, is intended to set clear, but not legally binding, guidelines for generative AI models such as OpenAI’s ChatGPT to follow to ensure they are complying with the terms of the AI Act.
Those obligations include the requirement for generative AI developers to provide a “sufficiently detailed summary” of all copyright-protected works, including music, that they have used to train their systems. Under the AI Act, tech companies are also required to watermark training data sets used in generative AI music or audio-visual works, so there is a traceable path for rights holders to track the use of their catalog. Significantly, the laws apply to any generative AI company operating within the 27-member EU, regardless of where it is based, acquired its data or trained its systems.
“The obligations of the AI Act are clear: you need to respect copyright, and you need to be transparent about the data you have trained on,” says Matthieu Philibert, public affairs director at European independent labels trade body IMPALA.
Putting those provisions into practice is proving less straightforward, however, with the latest version of the code, published in March, provoking a strong backlash from music execs who say that the draft text risks undermining the very laws it is designed to support.
“Rather than providing a robust framework for compliance, [the code] sets the bar so low as to provide no meaningful assistance for authors, performers, and other right holders to exercise or enforce their rights,” said a coalition of creators and music associations, including ICMP, IMPALA, international labels trade body IFPI and Paris-based collecting societies trade organization CISAC, in a joint statement published March 28.
Causing the biggest worry for rights holders is the text’s instruction that generative AI providers need only make “reasonable efforts” to comply with European copyright law, including the weakened requirement that signatories undertake “reasonable efforts to not crawl from piracy domains.”
There’s also strong opposition over a lack of meaningful guidance on what AI companies must do to comply with a label, artist or publisher’s right to reserve (block) their rights, including the code’s insistence that robots.txt is the “only” method generative AI models must use to identify rights holders’ opt-out reservations. Creator groups say that robots.txt — a root directory file that tells search engine crawlers which URLs they can access on a website — works for only a fraction of rights holders and is unfit for purpose, as it takes effect at the point of web crawling, not at scraping, training or other downstream uses of their work.
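For readers unfamiliar with the mechanism, a robots.txt opt-out is a short plain-text file placed at a site’s root. A typical reservation aimed at AI training crawlers might look like the following sketch (the site address is hypothetical; GPTBot, Google-Extended and CCBot are real crawler tokens used by OpenAI, Google and Common Crawl respectively):

```text
# robots.txt served at the site root, e.g. https://example-label.com/robots.txt

# Block known AI-training crawlers from the entire site
User-agent: GPTBot
Disallow: /

User-agent: Google-Extended
Disallow: /

User-agent: CCBot
Disallow: /

# All other crawlers (e.g. ordinary search indexing) remain unaffected
User-agent: *
Allow: /
```

This illustrates the limitation creator groups describe: the file only asks well-behaved crawlers not to fetch pages, rights holders must know and list each crawler token individually, and it places no constraint on what happens to material that has already been scraped or obtained from another source.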
“Every draft we see coming out is basically worse than the previous one,” Philibert tells Billboard. “As it stands, the code of practice leaves a lot to be desired.”
Caught Between Creators, Big Tech and U.S. Pressure
The general view within the music business is that the concessions introduced in the third draft are a response to pressure from tech lobbyists and from the Trump administration, which is pursuing a wider deregulation agenda both at home and abroad. In April, the U.S. government’s Mission to the EU (USEU) sent a letter to the European Commission pushing back against the code, which it said contained “flaws.” The Trump administration is also demanding changes to the EU’s Digital Services Act, which governs digital services such as X and Facebook, and the EU’s Digital Markets Act, which looks to curb the power of large digital platforms.
The perception that the draft code favors Big Tech is not shared by their lobby group representatives, however.
“The code of practice for general-purpose AI is a vital step in implementing the EU’s AI Act, offering much-needed guidance [to tech providers] … However, the drafting process has been troubled from the very outset,” says Boniface de Champris, senior policy manager at the European arm of the Computer and Communications Industry Association (CCIA), which counts Google, Amazon, Meta and Apple among its members.
De Champris says that generative AI developers accounted for around 50 of the nearly 1,000 stakeholders that the EU consulted with on the drafting of the code, allowing the process “to veer off course, with months lost to debates that went beyond the AI Act’s agreed scope, including proposals explicitly rejected by EU legislators.” He calls a successful implementation of the code “a make-or-break moment for AI innovation in Europe.”
In response to the backlash from creator groups and the tech sector, the EU’s AI Office recently postponed publishing the final code of practice from May 2 to an unspecified date later this summer to allow for changes to be made.
The AI Act’s key provisions for generative AI models come into force Aug. 2, after which all of its regulations will be legally enforceable, with fines of up to 35 million euros ($38 million, per current exchange rates), or up to 7% of global annual turnover, for large companies that breach the rules. Start-ups and smaller tech operations will face proportionately smaller financial penalties.
Creators Demand Stronger Rules
Meanwhile, work continues behind the scenes on what many music executives consider to be the key part of the legislation: the so-called “training template” being developed by the AI Office in parallel with the code of practice. The template, which is also overdue and causing equal concern among rights holders, will set the minimum requirements for the training data that AI developers must publicly disclose, including copyright-protected songs they have used, in the form of a “sufficiently detailed summary.”
According to preliminary proposals published in January, the training summary will not require tech companies to specify each work or song they have used to train AI systems, or be “technically detailed,” but will instead be a “generally comprehensive” list of the data sets used and sources.
“For us, the [transparency] template is the most important thing and what we have seen so far, which had been presented in the context of the code, is absolutely not meeting the required threshold,” says Lodovico Benvenuti, managing director of IFPI’s European office. “The act’s obligations on transparency are not only possible but they are needed in order to build a fair and competitive licensing market.”
“Unless we get detailed transparency, we won’t know what works have been used and if that happens most of this obligation will become an empty promise,” agrees IMPALA’s Philibert. “We hear claims [from the European Commission] that the training data is protected as a trade secret. But it’s not a trade secret to say: ‘This is what I trained on.’ The trade secret is how they put together their models, not the ingredients.”
“The big tech companies do not want to disclose [training data] because if they disclose, you will be able to understand if copyrighted material [has been used]. This is why they are trying to dilute this [requirement],” Brando Benifei, a Member of the European Parliament (MEP) and co-rapporteur of the AI Act, tells Billboard. Benifei is co-chair of a working group focused on the implementation of the AI Act and says that he and colleagues are urging policymakers to make sure that the final legislation achieves its overarching aim of defending creators’ rights.
“We think it is very important in this moment to protect human creativity, including the music sector,” warns Benifei, who this week co-hosted a forum in Brussels that brought together voices from music and other media to warn that current AI policies could erode copyright protections and compromise cultural integrity. Speakers, including ABBA member and CISAC president Björn Ulvaeus and Universal Music France CEO Olivier Nusse, stressed that AI must support — and not replace — human creativity, and criticized the lack of strong transparency requirements in AI development. They emphasized that AI-generated content should not be granted the same legal standing as human-created works. The event aligned with the “Stay True to the Act, Stay True to Culture” campaign, which advocates for equitable treatment and fair compensation for creators.
“A lot is happening, almost around the clock, in front of and behind the scenes,” ICMP’s Phelan tells Billboard. He says he and other creator groups are “contesting hard” with the EU’s executive branch, the European Commission, to achieve the transparency standards required by the music business.
“The implementation process doesn’t redefine the law or reduce what was achieved within the AI Act,” says Phelan. “But it does help to create the enforcement tools and it’s those tools which we are concerned about.”