In 2023, the music industry had its wake-up call — and it sounded a lot like a song by Drake.
A viral track titled “Heart on My Sleeve” imitated Drake and The Weeknd so convincingly that it racked up millions of streams before anyone could confirm who made it. It wasn’t just the sound that alarmed people — it was the realisation that no one was truly in control.
In response, the music world is quietly building new technology not to stop AI-generated songs altogether but to make them traceable. Developers are embedding detection systems into every stage of the music pipeline — from model training and upload platforms to licensing databases and recommendation algorithms.
The aim isn’t to ban synthetic music but to label and track it. “If you don’t build this stuff into the infrastructure, you’re just going to be chasing your tail,” explains Matt Adell, cofounder of Musical AI. “You can’t keep reacting to every new track or model — that doesn’t scale. You need infrastructure that works from training through distribution.”
Detection is becoming part of music’s core systems
Startups and major platforms are racing to include AI-detection tools within licensing and publishing workflows. Companies like YouTube and Deezer have begun flagging AI-generated content as soon as it’s uploaded, influencing how these songs appear in search results and recommendation feeds. Others like SoundCloud, Audible Magic, Pex, and Rightsify are expanding moderation and identification tools throughout their music ecosystems.
This rapid development is creating a patchwork of detection systems, each aiming to make AI music traceable from the moment it is created. Companies like Vermillio and Musical AI have developed software that automatically tags songs as synthetic, embedding this data into the track’s metadata.
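In very simplified form, tagging a track as synthetic and writing that verdict into its metadata might look like the sketch below. Everything here is hypothetical: the `Track` structure, the `detect_synthetic_ratio` stand-in detector, and the threshold are illustrative, not Vermillio's or Musical AI's actual API (real systems write provenance into audio-file metadata formats rather than a plain dictionary).

```python
# Hypothetical sketch: flagging a track as AI-generated in its metadata.
# All names and thresholds are illustrative, not any vendor's real system.
from dataclasses import dataclass, field

@dataclass
class Track:
    title: str
    metadata: dict = field(default_factory=dict)

def detect_synthetic_ratio(track: Track) -> float:
    """Stand-in for a real detector; returns the estimated share of
    AI-generated content in the track (0.0 = fully human-made)."""
    return 0.85  # fixed placeholder score for demonstration

def tag_if_synthetic(track: Track, threshold: float = 0.5) -> Track:
    score = detect_synthetic_ratio(track)
    track.metadata["ai_generated"] = score >= threshold
    track.metadata["ai_score"] = score
    return track

song = tag_if_synthetic(Track(title="Example Upload"))
print(song.metadata["ai_generated"])  # True, given the placeholder score
```

The key design point the article describes is that the flag travels with the track: once embedded in metadata, downstream systems (search, licensing, recommendations) can act on it without re-running detection.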
Vermillio’s TraceID, for example, breaks down songs into individual parts — such as vocal tone or lyrics — and identifies the segments generated by AI. It can spot mimicry even when only certain features of an original song are used. This is especially useful for rights holders who want to be notified and offered licensing options before a track is released.
Unlike YouTube’s Content ID, which often misses subtle imitations, TraceID is designed to offer proactive, verified licensing. Vermillio predicts this authenticated licensing could grow from US$75 million in 2023 to US$10 billion by 2025. The goal is not to catch copies but to measure creative influence and strike fair deals from the start.
AI training data is under the spotlight
Some companies are going further and examining the data used to train music AI models. By analysing training inputs, they aim to determine how much a generated song borrows from specific artists or styles. This approach could allow licensing based on influence — before a song is even released.
Sean Power, cofounder of Musical AI, describes their system as a full-cycle tool. “Attribution shouldn’t start when the song is done — it should start when the model starts learning,” he says. “We’re trying to quantify creative influence, not just catch copies.”
Meanwhile, Deezer already uses in-house tech to identify and limit the reach of fully AI-generated songs. As of April, around 20% of daily uploads were flagged as AI-made — twice as many as in January. These tracks stay on the platform but are not pushed in algorithmic or editorial recommendations. Deezer also plans to add clear labels soon so users can identify AI-generated content.
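The policy Deezer describes — keep flagged tracks in the catalogue but exclude them from algorithmic recommendations — can be sketched as a simple filter ahead of ranking. This is an assumption-laden illustration, not Deezer's implementation; the field names and the play-count ranker are invented stand-ins.

```python
# Hypothetical sketch: flagged AI tracks remain available but are
# filtered out before the recommendation ranker runs.
def recommend(candidates: list[dict], limit: int = 3) -> list[str]:
    # Exclude anything flagged as AI-generated from recommendations.
    eligible = [t for t in candidates if not t.get("ai_generated", False)]
    # Rank remaining tracks by play count (stand-in for a real ranker).
    eligible.sort(key=lambda t: t["plays"], reverse=True)
    return [t["title"] for t in eligible[:limit]]

catalogue = [
    {"title": "Human Song", "plays": 900, "ai_generated": False},
    {"title": "Synthetic Hit", "plays": 5000, "ai_generated": True},
    {"title": "Indie Cut", "plays": 300},
]
print(recommend(catalogue))  # ['Human Song', 'Indie Cut']
```

Note that the flagged track is demoted, not deleted — it would still appear in direct search or a user's own library, matching the article's description.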
“We’re not against AI at all,” says Aurélien Hérault, Deezer’s Chief Innovation Officer. “But a lot of this content is being used in bad faith — not for creation, but to exploit the platform. That’s why we’re paying so much attention.”
Spawning AI is pushing detection further upstream with its Do Not Train Protocol (DNTP), which allows musicians to opt out of having their work used in AI training. While visual artists already have similar tools, the audio industry has been slower to catch up. There’s still no standardised approach to consent and transparency at scale.
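An opt-out protocol like DNTP boils down to a consent check at dataset-assembly time: before a work enters a training set, look up whether its rights holder has registered an opt-out. DNTP is a real initiative, but the registry format and lookup below are purely illustrative assumptions.

```python
# Hypothetical sketch of a Do Not Train-style consent check applied
# when assembling an AI training dataset. Registry format is invented.
OPT_OUT_REGISTRY = {"artist:jane-doe", "label:example-records"}

def filter_training_set(items: list[dict]) -> list[dict]:
    """Drop any work whose rights holder has registered an opt-out."""
    return [it for it in items if it["rights_holder"] not in OPT_OUT_REGISTRY]

dataset = [
    {"track": "Song A", "rights_holder": "artist:jane-doe"},
    {"track": "Song B", "rights_holder": "artist:john-roe"},
]
print([d["track"] for d in filter_training_set(dataset)])  # ['Song B']
```

As Dryhurst's comment suggests, the hard part is not this lookup but who operates the registry: a single opaque company controlling `OPT_OUT_REGISTRY` would concentrate exactly the power the protocol is meant to distribute.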
Some experts argue that DNTP must be run independently and supported by various stakeholders to gain trust. “Nobody should trust the future of consent to an opaque central company,” says technologist Mat Dryhurst. “It needs to be nonprofit and collaborative to truly protect creators.”
Music’s future is being shaped by this behind-the-scenes race to build detection into the foundation of how music is made, shared, and discovered. And it’s only just beginning.