Spotify Introduces Artist Profile Protection to Combat AI Deepfakes

Spotify has begun testing a new feature called Artist Profile Protection, designed to give artists control over what releases appear on their profiles.

The opt-in tool allows artists to review and approve songs before they go live under their name, addressing a growing problem of misattributed and AI-generated content, as first reported by Fulltimemusician.

The feature targets a gap in the current distribution system. Open distribution platforms have made it possible for almost anyone to upload music globally with minimal friction, but this accessibility has created challenges around identity verification.

Songs are frequently misattributed to the wrong artists, duplicate profiles get mixed up, and with the rise of AI tools, individuals can generate music at scale and attach it to existing artist profiles without permission.

Recent industry actions illustrate the scope of the issue. Sony Music Entertainment has asked platforms to remove more than 135,000 tracks it claims were created using AI to impersonate artists on its roster.

Spotify has reported removing over 75 million spammy tracks in a single year. Deezer has stated it now sees approximately 60,000 fully AI-generated songs uploaded daily, accounting for nearly 40 percent of all new music hitting the platform.

Until now, most platforms have operated reactively. When a problem arises, an artist flags the content, the platform investigates, and eventually it gets removed.

By the time that process completes, however, streams have already been counted, recommendations have been affected, and in some cases, royalty payments have already been distributed.

Artist Profile Protection flips this model. Artists who opt in receive notifications when content is delivered to their page and can decide whether it goes live. The feature effectively creates a pre-approval layer, preventing unauthorized releases from ever reaching the profile.

Spotify has acknowledged that adding friction to the release process could delay legitimate releases if artists do not approve content in time.

To balance this, the company is introducing an artist key system that allows trusted distributors to bypass the approval step for verified artists, maintaining speed for regular releases while still protecting against unauthorized content.
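The flow described above — hold deliveries for artist approval unless a trusted distributor presents a valid key — can be sketched as a small state machine. This is purely illustrative: all names (`ReleaseGate`, `register_key`, `deliver`, and so on) are hypothetical, not Spotify's actual API, and the real system would involve far more around identity verification and distributor trust.

```python
class ReleaseGate:
    """Hypothetical sketch of a pre-approval gate for profile releases.

    Releases are held as 'pending' until the artist approves them,
    unless the delivering distributor presents a registered artist key,
    in which case the release publishes immediately.
    """

    def __init__(self):
        self.artist_keys = {}  # artist_id -> set of trusted distributor keys
        self.pending = {}      # artist_id -> list of releases awaiting approval
        self.live = []         # (artist_id, release) pairs visible on profiles

    def register_key(self, artist_id, distributor_key):
        # An artist grants a key to a trusted distributor.
        self.artist_keys.setdefault(artist_id, set()).add(distributor_key)

    def deliver(self, artist_id, release, distributor_key=None):
        # Trusted distributors bypass the approval step for verified artists.
        if distributor_key in self.artist_keys.get(artist_id, set()):
            self.live.append((artist_id, release))
            return "published"
        # Otherwise hold the release; the artist would be notified here.
        self.pending.setdefault(artist_id, []).append(release)
        return "pending_approval"

    def approve(self, artist_id, release):
        # Artist signs off: the release moves from pending to live.
        self.pending[artist_id].remove(release)
        self.live.append((artist_id, release))

    def reject(self, artist_id, release):
        # Rejected content never reaches the profile.
        self.pending[artist_id].remove(release)
```

The key design point the sketch captures is that the default path adds friction (everything waits on the artist), while the artist key restores speed only for deliveries the artist has already chosen to trust.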

Streaming platforms were built to handle large volumes of content. The emergence of AI tools has added velocity to that volume, enabling music creation and upload at rates faster than verification systems were designed to manage.

When artist identity becomes unclear, downstream functions including discovery algorithms, royalty payouts, and artist reputation are all affected.

For much of the streaming era, the focus has been on expanding access. Lowering barriers to distribution enabled more artists to reach audiences. But that open architecture also created gaps that are now being exploited at scale with AI-generated content.

Artist Profile Protection represents a shift toward rebuilding control into a system originally designed for openness. Rather than locking down distribution entirely, Spotify is providing artists with a layer of ownership over how their identity is used on the platform.

While the feature does not solve every issue related to AI-generated music and misattribution, it signals a broader industry trend. As streaming platforms contend with increasing volumes of content and new forms of unauthorized use, the next phase of development may focus less on getting music onto platforms and more on verifying that what appears there is legitimate.