DailyGlimpse

Spotify Won't Let You Filter Out AI Music—Here's Why

AI
April 28, 2026 · 1:00 AM

In mid-2025, frustration boiled over for Cedrik Sixtus. Finding his Spotify playlists increasingly sprinkled with tracks he suspected were AI-generated, the Leipzig-based software developer built a tool to automatically label such tracks and block them from his listening. He uploaded his Spotify AI Blocker to a couple of code-sharing websites, where hundreds have downloaded it. It filters out tracks from a growing list of more than 4,700 suspected AI artists, drawing on community tracking efforts and signals such as unusually high release volumes and AI-style cover art. "It is about choice—if you want to hear AI music or if you don't," says Sixtus, who would prefer that Spotify label AI-generated content and enable filtering itself. The tool is initially installed via the web-browser version of Spotify, and Sixtus warns that using it "may violate Spotify's terms of service."
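The blocklist approach described here can be illustrated with a short sketch. To be clear, this is not Sixtus's actual code: the artist names are drawn from elsewhere in this article, and the release-volume threshold, field names and function names are all hypothetical.

```python
# Hypothetical sketch of a blocklist-style filter in the spirit of tools
# like Spotify AI Blocker. Names and thresholds are invented for illustration.

SUSPECTED_AI_ARTISTS = {"sienna rose", "breaking rust", "the velvet sundown"}

# A heuristic signal mentioned by community trackers: unusually high
# release volume. The cutoff below is an arbitrary placeholder.
RELEASES_PER_MONTH_THRESHOLD = 20

def is_suspected_ai(artist: str, releases_per_month: int = 0,
                    ai_style_cover: bool = False) -> bool:
    """Flag a track if its artist is on the blocklist or trips a heuristic."""
    return (artist.lower() in SUSPECTED_AI_ARTISTS
            or releases_per_month > RELEASES_PER_MONTH_THRESHOLD
            or ai_style_cover)

def filter_playlist(tracks: list[dict]) -> list[dict]:
    """Keep only tracks whose artist is not flagged."""
    return [t for t in tracks
            if not is_suspected_ai(t["artist"],
                                   t.get("releases_per_month", 0),
                                   t.get("ai_cover", False))]
```

Real-world filters are messier: they have to sit between the listener and the streaming client, and, as the article notes, blocklists rely on community judgment rather than ground truth.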

He isn't alone: feelings run deep on Spotify's community forum. While for Sixtus the issue is that AI music doesn't sound right, others simply don't want to listen to music made by a bot. Spotify has made some concessions to address such concerns. In April it launched a test feature that shows, in a song's credits, how an artist used AI. But it's a voluntary system based on what an artist tells their record label or distributor. "We know this isn't a complete solution on its own. Building a truly comprehensive system is a challenge that requires industry-wide alignment," Spotify said in April.

Spotify's position is certainly a long way from actively identifying AI-generated music and giving users an option to filter it out. "It is a difficult—borderline existential—balancing act for Spotify," says Robert Prey, who studies streaming platforms at Oxford University's Internet Institute. Spotify is trying to avoid value judgments about how music is created, but risks eroding trust among listeners, artists and the wider industry if it fails to offer enough transparency, he explains. "It has to figure out what listeners want and how artists feel—all while AI is improving, being used more widely and becoming harder to detect," he adds.

The arrival of AI tools for music is both seducing and unsettling the music world. Generative AI music services like Suno and Udio now produce increasingly polished, fully realized songs from simple text prompts in seconds, complete with lyrics, vocals and instrumentation. In one recent controlled test, part of a Deezer–Ipsos poll, 97% of listeners failed to correctly distinguish between AI-generated and human-made tracks. And tens of thousands of AI tracks appear to be uploaded to streaming platforms daily, where they could dilute revenue pools for human artists, even if most currently attract few listens. Spotify, along with YouTube Music and Amazon Music, has so far avoided any clear user-facing labels or filters for AI-generated music, neither openly using detection tools nor requiring systematic self-disclosure—though that may change as industry standards develop.

Widely suspected AI acts like Sienna Rose, Breaking Rust and The Velvet Sundown are essentially treated like any other artists by Spotify, even as the platform removes what it considers AI-related spam such as mass uploads and short tracks designed to game the system. "Our priority is addressing harmful uses [of AI] like spam and impersonation, rather than trying to filter music based on how it was made," a Spotify spokesperson said, adding that AI in music also isn't a binary category but exists on a spectrum.

Deezer—a smaller competitor to Spotify—has taken a stronger approach. Last year it began both tagging albums that contain AI-generated tracks produced by Suno, Udio and similar services, and excluding those tracks from algorithmic recommendations and human-curated playlists. It uses in-house detection technology, built by training AI models to spot statistical patterns in the sound itself, and recently began offering it for sale across the industry. "We're the only music streaming platform that has that in place," notes Jesper Wendel, its head of global communications. In March, Apple Music said it was introducing "transparency tags" and would eventually require music labels and distributors to self-disclose when new songs or related content involve AI. But, as with Spotify's song-credit feature, critics point out that such tags are unlikely to be reliable, since artists may prefer not to disclose AI use for fear of stigma—and how visible Apple's tags will be to listeners remains unclear.
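Deezer's description of its detector—training AI models to spot statistical patterns in the sound itself—can be illustrated with a toy example. The sketch below trains a from-scratch logistic regression on synthetic "spectral feature" vectors; everything here (the data, the features, the model size) is invented for illustration, and real systems use proprietary features and far more capable models.

```python
# Toy illustration of learning to separate two classes of audio by
# statistical patterns in their features. Synthetic data stands in for
# real spectral features; class 0 = human-made, class 1 = AI-generated.
import numpy as np

rng = np.random.default_rng(0)

# 100 examples per class, 4 made-up features each, with shifted means
# so the classes are statistically distinguishable.
X = np.vstack([rng.normal(0.0, 1.0, (100, 4)),
               rng.normal(1.5, 1.0, (100, 4))])
y = np.array([0] * 100 + [1] * 100)

# Plain gradient descent on the logistic loss.
w, b = np.zeros(4), 0.0
for _ in range(500):
    p = 1 / (1 + np.exp(-(X @ w + b)))   # predicted probability of class 1
    grad = p - y
    w -= 0.1 * (X.T @ grad) / len(y)
    b -= 0.1 * grad.mean()

def predict_ai(features: np.ndarray) -> bool:
    """Classify a feature vector as suspected AI-generated."""
    return 1 / (1 + np.exp(-(features @ w + b))) > 0.5
```

The hard part in practice is exactly what the article flags later: a detector like this produces false positives, and a human musician wrongly tagged as AI bears the cost of the error.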

That AI music exists on a continuum does make labeling difficult, says Maya Ackerman, an expert in AI and computational creativity at Santa Clara University and co-founder and CEO of WaveAI. While some tools are "prompt in, song out"—where AI labels would be straightforward—others are designed for co-creation, assisting with specific parts of the music-making process. If a musician uses those tools, at what point does that warrant a label? And, Ackerman adds, even with tools like Suno and Udio, users can put a lot of their creative selves into the outputs—feeding in their own lyrics or spending many hours iterating on the song's sound. "From a distance it looks like such an obvious 'yes, label AI music' but, once you zoom in, you realize it is a very complicated thing," she says. There is also the technical challenge of accurately detecting AI-generated tracks, with potentially serious consequences if human musicians are falsely labeled as AI.