Just after Sony announces an AI detection tool, dance music platform Traxsource puts forth a policy that seems to call such tools into question.
Artificial intelligence in music continues to be a hot watercooler topic. That’s especially true as major companies continue playing whack-a-mole against—and then getting into bed with—AI companies that may or may not have ethically sourced their training materials. Various companies are taking nuanced approaches to AI audio on their platforms and within their midst, with dance music platform Traxsource the latest to take a stand.
But the timing of Traxsource’s policy, which outright rejects “fully AI-generated tracks” while embracing AI as a production aid within human creativity, is notable. The company outlined its position in a statement that closely follows Sony’s announcement of the development of an AI detection tool that’s raised more questions than it answers.
Traxsource makes clear that music created entirely from prompts, without any meaningful human input, will not be allowed on its platform. But the company doesn’t denounce AI tools altogether. Instead, it recognizes their growing presence in studio workflows and argues that such tools do not and cannot replace human authorship, because AI can only imitate existing work rather than create something genuinely new.
To that end, Traxsource questions the reliability of so-called AI detection tools, without calling out Sony or any other company by name. According to Traxsource, reliably identifying AI-generated content isn’t merely expensive and limited in scope; at present, it’s impossible.
“Many platforms are reluctant to acknowledge that accurate detection of AI-generated music is not yet possible. Detection tools are improving, but still face significant limitations and remain extremely cost-prohibitive at scale,” write Traxsource’s Brian Tappert, Marc Pomeroy, and Sheldon Prince. “Add in today’s hybrid workflow, which often blends human creativity with AI tools, and accurate detection is nearly impossible. Even the researchers building these systems acknowledge there is no 100% solution.”
“Any platform claiming foolproof AI detection is overstating what the technology can currently deliver. We choose transparency over false promises, both to avoid enforcement we cannot accurately execute and to protect human artists from being falsely accused.”
Traxsource’s stance on AI comes down to five core principles:
- Human artistry first: the platform remains dedicated to music created through human talent, experience, and identity.
- No fully AI-generated music: tracks created solely through prompts, without meaningful human contribution, are not permitted.
- AI is a tool, not a creator: work created by artists using AI to assist production while retaining creative control is valid.
- Customer-facing transparency: the company is developing a system to scan uploads and disclose information provided by artists, so that AI-assisted tracks can be labeled accordingly.
- Adapting to an evolving landscape: the company acknowledges that industry norms, legal frameworks, and detection capabilities are rapidly evolving, and vows to evolve alongside them.
While nuanced, the stance is an interesting one. It follows French streamer Deezer’s announcement that its AI-detection technology is now available for other companies to license. The news also comes on the heels of Bandcamp’s sharp line in the sand drawn against AI tracks on its platform.
Ultimately, it’s unclear which company’s approach to AI will prove the better one, especially when it’s not even clear how accurate these AI-detection tools can be. After all, such tools are only as good as their training datasets—many of which were scraped from unauthorized content in the first place.
Whether Traxsource’s position proves sustainable in an ever-changing and increasingly automated industry remains to be seen. Regardless, it’s well-timed amid the broader conversation surrounding artificial intelligence in the entertainment sector.