SoundCloud appears to have quietly changed its terms of service to allow the company to train AI on audio that users upload to its platform.
As flagged by tech ethicist Ed Newton-Rex, a clause in the latest version of SoundCloud's terms gives the platform permission to use uploaded content to "inform, train, [or] develop" AI.
The terms, which were last updated on February 7, include the provision: "You explicitly agree that your Content may be used to inform, train, develop, or serve as input to artificial intelligence or machine intelligence technologies or services as part of and for providing the services."
SoundCloud seems to claim the right to train on people's uploaded music in their terms. I think they have some questions to answer.
According to the Wayback Machine, this was added to their terms on February 12, 2024. I'm a SoundCloud user but I don't see any image.twitter.com/NIk7TP7K3C

— Ed Newton-Rex (@ednewtonrex) May 9, 2025
The terms explicitly exclude content covered by "separate agreements" with third-party rightsholders, such as record labels. SoundCloud has licensing deals with a number of independent labels, as well as major music publishers including Universal Music and Warner Music Group.
Like many other creator platforms, SoundCloud is increasingly embracing AI.
Last year, SoundCloud partnered with nearly a dozen companies to bring AI-powered tools for remixing, generating vocals, and creating custom samples to its platform. In a blog post last fall, SoundCloud said these partners would get access to content ID solutions to "ensure rights holders [sic] receive proper credit and compensation," and pledged to "uphold ethical and transparent AI practices that respect creators' rights."
In recent months, a number of social media and content hosting platforms have updated their policies to allow for first- and third-party AI training. In October, Elon Musk's X updated its privacy policy to let outside companies train AI on user posts. Last September, LinkedIn changed its terms to allow it to collect user data for training purposes. And in December, YouTube began letting third parties train AI on user clips.
Several of these moves have drawn backlash from users who argue that AI training policies should be opt-in rather than opt-out, and that creators should be credited and compensated for their contributions to AI training datasets.
Updated, 2:22 p.m. Pacific: A SoundCloud spokesperson sent a statement via email; a portion follows:
"SoundCloud has never used artist content to train AI models. We do not develop AI tools, nor do we allow third parties to scrape or use SoundCloud content from our platform for AI training. In fact, we implemented technical safeguards, including a 'no AI' tag on our site, to explicitly prohibit unauthorized use.
"The February 2024 update to our terms of service was intended to clarify how content may interact with AI technologies within SoundCloud's own platform. Use cases include personalized recommendations, content organization, fraud detection, and improved content identification.
"Any future AI applications at SoundCloud will be designed to support human artists, enhancing the tools, capabilities, reach, and opportunities available to them on our platform. Examples include improved music recommendations, playlist generation, content organization, and fraud detection. These efforts are aligned with existing licensing agreements and ethical standards. Tools like [those from our partner] Musiio are used strictly to power artist discovery and content organization, not to train generative AI models.
"We understand the concerns raised and remain committed to open dialogue. Artists will continue to have control over their work, and we'll keep our community informed every step of the way as we explore innovation and apply AI technologies responsibly, especially as legal and commercial frameworks evolve."