The AI Drake track that mysteriously went viral over the weekend is the start of a problem that’s set to turn Google on its head one way or another, and it’s really not clear where it’s headed.
Here are the basics: There’s a new track called “Heart on My Sleeve” by a TikTok user named @ghostwriter977 with AI-generated vocals that sound like Drake and The Weeknd. The song exploded out of nowhere over the weekend, which is fishy for a variety of reasons.
After the song went viral on TikTok, a full version was released on music streaming services like Apple Music and Spotify, as well as YouTube. This prompted Drake and The Weeknd’s label, Universal Music Group, to issue a strongly worded statement on the dangers of AI, specifically stating that the use of generative AI violates its copyrights. Here is the statement from UMG Senior Vice President of Communications James Murtagh-Hopkins:
Part of the success of UMG is that we have embraced new technologies and put them to work for our artists, just as we have done for some time with our own innovation around AI. With this in mind, however, the training of generative AI on our artists’ music (which is both a violation of our agreements and a breach of copyright), as well as the availability of content created with generative AI on DSPs, raises the question of which side of history everyone involved in the music ecosystem wants to be on: the side of artists, fans, and human creative expression, or the side of deepfakes, fraud, and the denial of artists’ due compensation.
These cases show why platforms have a fundamental legal and ethical responsibility to prevent their services from being used in ways that harm artists. We are encouraged by the engagement of our platform partners on these issues, as they recognize that they must be part of the solution.
What happened next is somewhat mysterious. The track came down from streamers like Apple Music and Spotify, which tightly control their libraries and can pull tracks for whatever reason, but it remained available on YouTube and TikTok, which are user-generated content platforms with established DMCA takedown processes. I was told by a source familiar with the situation that UMG did not issue any takedowns to the music streamers, and the streaming services have not said otherwise to industry trade publications. Neither have Drake or The Weeknd. It’s all very strange; it seems like Ghostwriter977 pulled the track themselves to build hype, especially since the song stayed up on YouTube and TikTok.
But then the track came down from TikTok and YouTube as well, and YouTube in particular said it was removed due to a copyright notice from UMG. And this is where things get intriguingly sloppy and probably existentially difficult for Google: in order to issue a copyright takedown on YouTube, you have to… have a copyright on something. Since “Heart on My Sleeve” is an original song, UMG doesn’t own it; it’s not a copy of any song in the label’s catalogue.
So what did UMG claim? I was told that the label considers the Metro Boomin producer tag at the beginning of the song to be an unauthorized sample and that the DMCA takedown notice was issued specifically for that sample and only that sample. It’s not clear whether the tag is actually a sample or whether it was generated by the AI itself, but YouTube, for its part, doesn’t seem to want to push the issue much further.
“We removed the video after receiving a valid copyright notice for a sample included in the video,” YouTube spokesperson Jack Malon said of the situation. “Whether or not the video was created using artificial intelligence does not affect our legal responsibility to provide rightsholders with a way to remove content that allegedly infringes their copyrighted expression.”
UMG then issued individual URL-by-URL takedowns to YouTube as copies of the song surfaced, all based on the Metro Boomin tag. I was told by another music industry source that the company can’t actually use YouTube’s automated Content ID system because, again, it doesn’t own the song and can’t claim it in that system to start matching. (Oddly enough, Ghostwriter977 re-uploaded the track to their YouTube page after the first takedown, and it’s… still there. There’s a lot of fishy stuff going on here, too.)
If Ghostwriter977 uploads “Heart on My Sleeve” without that Metro Boomin tag, they will start a copyright war pitting the future of Google against the future of YouTube
Got all that? Okay, here’s the problem: if Ghostwriter977 simply re-uploads “Heart on My Sleeve” without that Metro Boomin tag, they’ll kick off a copyright war that pits the future of Google against the future of YouTube in potentially zero-sum ways. Google will either have to kneecap all of its generative AI projects, including Bard and the future of search, or anger major YouTube partners like Universal Music, Drake, and The Weeknd. Let’s walk through it.
The first legal problem with using AI to create a song with vocals that sound like Drake’s is that the end product isn’t a copy of anything. Copyright is very much based on the idea of copying: a sample is a copy, as is an interpolation of a melody. Music copyright in particular has become aggressive in the streaming age, but it’s still all based on copies of actual songs. Fake Drake isn’t a copy of any song in the Drake catalogue, so there’s no obvious copyright claim to be made. There is no copy.
Instead, UMG and Getty Images, along with publishers around the world, claim that collecting the training data for the AI is itself copyright infringement: that ingesting Drake’s entire catalog, or every Getty photo, or the contents of every Wall Street Journal article (or whatever) in order to train an AI to make more photos or Drake songs or news articles is unauthorized copying. That would make the fake Drake songs created by this AI unauthorized “derivative works” and, phew, we’re still squarely in the realm of copyright law that everyone understands. (Or, well, pretends to understand.)
The problem is that Google, Microsoft, StabilityAI, and every other AI company claim that making these training copies is fair use, and by “fair” they don’t mean “fair, as determined by an argument in an internet comment section,” but “fair, as determined by a court in a case-by-case application of 17 United States Code §107, which sets forth a four-factor test of fair use that is as controversial and unpredictable as anything else in American political life.”
I asked Microsoft CEO Satya Nadella about it when I spoke to him about the new ChatGPT-powered Bing, and he wasn’t shy. “See, at the end of the day, search is about fair use,” he said. “In other places, you have to think carefully about what fair use is. And then at some point I think there will be some legal cases that will also need to set a precedent.”
That’s because there’s no real precedent for scraping data to train an AI being fair use; all of these companies rely on decades-old internet law cases that allowed search engines and social media platforms to exist in the first place. It’s messy, and it feels like all of these decisions are headed for a decade of litigation.
So now imagine you’re Google, which on the one hand runs YouTube and on the other hand is racing to develop generative AI products like Bard, which is… trained by scraping tons of data from around the web under a permissive interpretation of fair use that will definitely be challenged in a wave of lawsuits. Along comes AI Drake, and Universal Music Group, one of the biggest labels in the world, releases a strongly worded statement about generative AI and how its streaming partners must respect its copyrights and artists. What do you do?
- If Google agrees with Universal that AI-generated music is an illegal derivative work based on the unauthorized copying of training data, and that YouTube should pull down songs that get flagged because they sound like the label’s artists, it undercuts its own fair use case for Bard and every other generative AI product it makes, undermining the future of the company itself.
- If Google disagrees with Universal and says that AI-generated music should stay up because merely training an AI with existing works is fair use, it protects its own AI efforts and the company’s future, but it likely invites a series of future lawsuits from Universal and possibly other labels, and it certainly risks losing access to Universal’s music on YouTube, putting YouTube at risk.
I asked YouTube’s Malon about this dilemma, and he said, “It’s not up to YouTube to determine who owns the rights to content. That happens between the parties involved, which is why we give copyright owners tools to make copyright claims and uploaders tools to dispute claims they believe are invalid. Matters that cannot be resolved through our dispute process may ultimately need to be decided by a court.”
YouTube only exists because of a tricky dance that keeps rightsholders happy, but Google’s future is a bet on an expansive interpretation of copyright law
That’s the idea, but copyright claims on YouTube were already messy and contentious before the AI explosion, and now there’s basically no way for Google to avoid some major lawsuits here.
YouTube still exists because of a tricky dance that keeps rightsholders happy and pays the music industry, but the future of Google itself is a bet on an expansive interpretation of copyright law that every creative industry from music to movies to news hates and will fight to the death. Because it is death: generative AI tools promise to completely upend the market for almost all commodity creative work, and these companies are not about to sit back and let it happen.
The difference here is that the last time something like this happened, Google and YouTube were disruptive newbies with killer products and little to lose, and they accepted Viacom’s lawsuit, and everyone else’s, as the price of victory. Now, they’re… well, YouTube is literally a cable company. And navigating the tricky world of content partnerships while chasing AI startups that are much freer to break things will force Google to make near-impossible decisions at every turn.