Apple Tags, Puth, and Why Flaws Are the Future
As Apple Music rolls out AI transparency tags and Moises hires Charlie Puth, the message is clear: AI is the baseline. The human element is the premium.
Plenty has happened in the past 48 hours.
First, Charlie Puth joined Moises, a leading AI music platform, as Chief Music Officer. Puth openly admitted to using AI "for years," but his mandate isn't about replacing the studio; it's about accelerating the workflow. His stance is definitive: "AI, when done right, isn't here to replace musicians."
Second, and perhaps more structurally significant: Apple Music is preparing to introduce Transparency Tags to identify AI-generated tracks and visuals.
The Algorithmic Baseline
For the last three years, the anxiety was replacement. What happens when the machine can write, mix, and master a track in 30 seconds?
We are now seeing the answer: AI becomes the baseline. It is the new floor for fidelity. It is a utility, like a high-end EQ or a perfectly treated room. It solves the technical friction, but it does not solve the taste equation.
When Apple tags a song as "AI-generated," they aren't just categorizing it; they are commoditizing it. They are telling the listener: this was generated at scale.
The New Premium
If perfection and infinite generation are cheap, what is expensive? What commands attention?
Friction. Taste. Error. Intent.
A recent survey of over 1,100 music producers echoed this exact sentiment: AI can't carry amps on stage. The physical, messy, deeply intentional act of making music is becoming the differentiator.
If you want to survive the next decade of music production, stop trying to compete with the algorithm on speed, volume, or technical perfection.
Double down on your humanity. Your bizarre arrangement choices. The slight timing variations in your drum loops. The specific, idiosyncratic way you chain your plugins.
The machine can make it perfect. But only you can make it matter.