Deezer's CEO Alexis Lanternier published data this week that reframes the scale of what streaming platforms are now dealing with: 44 percent of the songs uploaded to his platform every day are AI-generated, amounting to roughly 75,000 tracks. Over two million AI tracks arrive monthly. Despite that volume, they account for between one and three percent of total streams. The music is there; listeners are not finding it, or not choosing it, or not knowing it exists. That gap tells a more nuanced story than the headline number alone.
The numbers have been accelerating sharply. In January 2025, Deezer was receiving around 10,000 AI-generated uploads per day. By January 2026, that figure had risen to 60,000, and the jump from 60,000 to 75,000 has come in the months since. Whatever is driving these uploads, the trajectory is not leveling off. Deezer now flags 85 percent of its AI-generated uploads as fraudulent, meaning the dominant use case for bulk AI music generation on the platform is not artistic expression but royalty extraction.
The fraud mechanics have been detailed in recent US court proceedings. In March, a North Carolina man pleaded guilty after generating hundreds of thousands of AI songs and using automated bots to stream them billions of times, pocketing over eight million dollars in royalty payments that would otherwise have gone to human artists. Separate reporting identified commercial services openly advertising their ability to spoof listener identities and bypass anti-fraud systems, with AI music as the product. Bots listening to bot music to redirect a royalty pool designed for human creators: this is not a fringe problem. It is a large-scale, structured extraction operation.
Sony Music has requested the removal of more than 135,000 AI tracks it says are impersonating its artists. That is a separate concern from fraud: it involves AI-generated songs engineered to mimic the sound of specific musicians closely enough to capture their listeners. The combination of impersonation and bot-driven streaming creates a two-pronged squeeze on human musicians: their share of the royalty pool shrinks, and their algorithmic visibility degrades as recommendation systems are polluted by artificially inflated play counts.
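The arithmetic behind that squeeze can be made concrete. Under the pro-rata model most streaming services use, every stream claims an equal slice of a fixed monthly royalty pool, so bot streams dilute every other payout even when a human artist's own play count never changes. A minimal sketch, with invented numbers; `POOL`, `HUMAN`, `ARTIST`, and `BOTS` are hypothetical illustrations, not Deezer figures:

```python
def pro_rata_payout(pool: float, artist_streams: int, total_streams: int) -> float:
    """An artist's share of a fixed royalty pool under pro-rata division."""
    return pool * artist_streams / total_streams

POOL = 1_000_000.0   # hypothetical monthly royalty pool, in dollars
HUMAN = 50_000_000   # hypothetical legitimate streams platform-wide
ARTIST = 100_000     # one artist's legitimate streams

before = pro_rata_payout(POOL, ARTIST, HUMAN)

# A fraud operation adds 5 million bot streams of its own AI tracks.
BOTS = 5_000_000
after = pro_rata_payout(POOL, ARTIST, HUMAN + BOTS)

print(f"before bots: ${before:,.2f}")   # $2,000.00
print(f"after bots:  ${after:,.2f}")    # $1,818.18
```

The artist did nothing differently, yet roughly nine percent of their payout has been redirected to the bot-streamed catalogue, purely because the denominator grew.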
Deezer has responded with several measures: AI-tagged tracks are excluded from editorial playlists and algorithmic recommendations, and the platform has stopped storing high-resolution versions of AI content. Lanternier called on the broader music ecosystem to act collectively, noting that Deezer alone cannot hold this back. Spotify and Apple Music have both acknowledged similar problems. The challenge for all of them is that detection is reactive: fraud operations adapt, and no single platform has the authority or the reach to shut down the upstream generation pipelines.
The streaming royalty model was designed around scarcity of supply and genuine listener demand. Both assumptions have now been broken simultaneously. An artist who releases a carefully produced album competes for algorithmic placement against millions of auto-generated tracks, many of them backed by bot streams, in a system that was never built to handle the volume or the adversarial intent. What Deezer's numbers describe is not a music industry disrupted by AI creativity. It is a royalty system being strip-mined by automated fraud at a scale that human musicians, without access to equivalent automation, cannot counter on equal terms.