12 Silent Tracks, One Message: Musicians Call Out AI Policies

Published: February 27th, 2025.
Over 1,000 musicians, including Kate Bush and Cat Stevens, released an album without sound. Yes, that’s right—12 tracks of pure silence. But there’s a reason behind it.
The album “Is This What We Want?” protests the UK’s proposed changes to copyright law. These changes could let AI companies train their models on any music they can legally access, without needing permission from the artists. The track titles themselves spell out the message: "The British government must not legalize music theft to benefit AI companies."
Many artists are furious, saying these changes could undo decades of copyright protections. Paul McCartney, Elton John, and Dua Lipa are among those who signed an open letter warning that this policy would let AI developers use musicians' work for free while profiting off their creations.
The UK government argues that copyright laws hold back AI innovation, but musicians see it differently. They fear AI companies will flood the market with music trained on real artists’ work—without paying them a dime.
Across the Atlantic, the U.S. entertainment industry is no stranger to AI-related labor disputes. In 2023, the SAG-AFTRA strike brought Hollywood to a standstill for 118 days, with actors fighting for stronger AI protections. Concerns ranged from AI-generated scripts replacing writers to digital replicas of actors being used without consent.
A similar battle erupted in 2024 when video game voice actors and motion capture performers went on strike, citing AI as an existential threat to their careers.
The Writers Guild of America (WGA) secured crucial AI protections in their deal, ensuring that AI-generated content wouldn’t be classified as original material. The agreement stopped studios from using AI to replace writers while allowing creatives to use AI as a tool if they chose.
And musicians? They, too, have made their voices heard. In April 2024, Billie Eilish, Nicki Minaj, Stevie Wonder, and over 200 other artists signed an open letter demanding that tech companies stop using AI to mimic artists' voices without consent. Tennessee even passed the ELVIS Act, protecting musicians from unauthorized AI-generated voice replication.
But these efforts, while important, haven’t gone as far as the UK’s silent-album protest. Should American musicians follow suit? There’s a strong case for it. AI-generated music is a growing concern, with deepfake songs mimicking famous artists and AI-generated tracks racking up millions of streams.
Without robust protections, musicians could see their voices, melodies, and styles repurposed without consent—leaving tech firms to profit from their artistry while they receive nothing in return.
To be fair, AI isn’t inherently bad. It can help musicians generate ideas, refine compositions, and break creative barriers. AI-powered tools can assist with mastering tracks, writing lyrics, or composing background music, offering independent artists resources they might not otherwise afford.
But the problem isn’t AI itself—it’s how it's used. If AI can be trained on artists’ work without permission or payment, it sets a dangerous precedent. The fear isn’t just about losing money; it’s about losing creative control. When machines can replicate an artist’s style or voice with near perfection, what happens to the human behind the music?
As AI continues to disrupt creative industries, musicians in the U.S. will likely face the same decision: protest now or risk being left behind. If Hollywood writers and actors could push back and win protections, why shouldn’t musicians do the same? The question isn’t whether AI belongs in music, but whether musicians will have a say in how it’s used.
For now, all eyes are on the UK. Will the silence be loud enough to spark global change? Or will AI-driven policies move forward, shaping the future of music without the voices of those who create it?