
AI Slop

Why Spotify’s AI Music Problem Is a Bigger Issue Than You Think

A screenshot of the fake Annie album, taken October 12th.

Some kind of AI slop had been uploaded to HEALTH’s artist page on Spotify, one of three fake albums that would appear under their name that weekend. The band’s X account made some jokes about it, the albums were eventually removed, and I went back to minding my own business. Then, the next weekend, I saw a new Annie album had dropped.

The Problem

That album was more plausible — Annie had just released a new single, "The Sky Is Blue" — but when I clicked in, I couldn't find the single in the track list. Confused, I played the album and heard birdsong and a vaguely New Age-y instrumental. That… did not sound like Annie.

"That was upsetting to me, because if you have ears, you can definitely hear it’s not our music," said Marcos Mena, Standards’ lead songwriter and guitarist. "It’s definitely a bummer because we did have a new album come out this year, and I feel like it’s detracting from that."

What’s Going On

To understand how this works, you need a sense of the mechanics. Streaming platforms like Spotify don't work like your Facebook page: Mena and other artists aren't logging in and adding albums to their accounts directly. Instead, they go through a distributor that handles licensing, metadata, and royalty payments, and the distributors send songs and metadata to the streaming services in bulk. The metadata matters: it includes the song title and artist name, but also the songwriter, record label, and other details that are crucial for artists (and others) to get paid.
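To make the failure mode concrete, here is a minimal sketch of the kind of delivery described above. The field names, values, and matching logic are all hypothetical — real deliveries typically follow industry schemas like DDEX, and no platform's actual code is shown here. The point is that if the artist is identified by a bare name string rather than a unique ID, a new upload can collide with an existing artist's page.

```python
# Hypothetical delivery metadata from a distributor to a streaming
# service. Field names are illustrative, not any real schema.
delivery = {
    "album_title": "Some Album",
    "artist_name": "Annie",        # a bare string, not a unique ID
    "label": "Example Records",    # hypothetical label
    "songwriters": ["J. Doe"],     # needed for royalty routing
}

def naive_artist_match(delivery, catalog):
    """Match a delivery to existing artist pages by name alone."""
    return [a for a in catalog if a["name"] == delivery["artist_name"]]

# An existing, unrelated artist who happens to share the name.
catalog = [{"name": "Annie", "id": "existing-artist-123"}]

# The fake release lands on the real Annie's page.
print(naive_artist_match(delivery, catalog))
```

Matching on the name string alone is exactly what lets a junk upload attach itself to the wrong artist.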

Why It’s a Problem

The way it should have worked, with the plumbing getting it right, is that each of those albums would have been flagged as a new artist and given its own page; then the mix-up wouldn't have mattered. But a distributor that lets too many junk bands through creates problems with the streaming services.

As for the distributors, the thing to keep an eye on is UMG’s lawsuit. A pretrial conference is scheduled for January. The outcome of the suit could potentially change how distributors filter the music people try to upload through their platforms — because if lawsuits are more expensive than content moderation, there’s likely to be more content moderation. That could improve things for Spotify, which is downstream of them.

Conclusion

AI music poses the same threat to Spotify, McDonald says. He points out that I had been waiting for the Annie album, excited for it, even. And then instead, I got duped into garbage. "There’s all these mechanisms around assuming this stuff is correct," he says. But right now, those mechanisms are broken — and people who truly care, like artists themselves, don’t have their hands on the controls.

FAQs

Q: Why is this a problem?
A: Fake albums are appearing under real artists' names, which confuses fans and detracts from the artists' actual releases.

Q: Who is responsible for this?
A: The distributors and Spotify share responsibility: distributors accept the uploads, and Spotify publishes them. Both need to do a better job of filtering out fake albums and ensuring that only legitimate music reaches artists' pages.

Q: What can be done to fix this?
A: UMG’s lawsuit could potentially change how distributors filter music. Additionally, Spotify and distributors need to do a better job of content moderation to prevent fake albums from being uploaded in the first place.

Q: Is this a new problem?
A: Not exactly. Fraudulent uploads have plagued streaming services for years; what's changed is that generative AI makes convincing fake albums far cheaper and faster to produce.
