Fake Drake, The Weeknd Song Raises Questions on AI Copyright

Facts

  • A viral artificial intelligence (AI) song that replicates the voices of Drake and The Weeknd, called "Heart on My Sleeve," has been pulled from streaming platforms such as YouTube, Amazon, SoundCloud, Tidal, Deezer, and TikTok following a complaint from music label Universal Music Group (UMG).1
  • The song, created and released by TikTok user Ghostwriter977, was viewed 8M times on TikTok, 18.9M times on Twitter, and streamed hundreds of thousands of times on the music streaming apps Spotify and Apple Music.2
  • Ghostwriter took to TikTok to explain the reason behind the AI-generated song, saying, "i [sic] was a ghostwriter for years and got paid close to nothing just for major labels to profit."3
  • Though Ghostwriter didn't reveal which AI platform was used, they said a person could make an AI song like this by writing and recording a track, then using an AI model to replace their vocals with a popular artist’s voice.4
  • UMG argued that "the training of generative AI using our artists' music" represented "both a breach of our agreements and a violation of copyright law," adding that platforms have a "legal and ethical responsibility to prevent the use of their services in ways that harm artists."1
  • Meanwhile, the legal battle over AI-generated music is only beginning, and people like Ghostwriter argue the technology could democratize music.2

Sources: 1Axios, 2ZDNET, 3AllHipHop, and 4NBC.

Narratives

  • Narrative A, as provided by Forbes. The rapid onset of AI has led to technological progress far outpacing the implementation of legal and moral codes. While the corporate music industry has historically lagged behind technological advancement, there are strong arguments for protecting artists' rights to their own voices. Questions surrounding who can copyright novel, AI-generated music should also be answered to protect artists who wish to mix human creativity with computer-generated songs.
  • Narrative B, as provided by Popular Science. Though no one knows exactly which laws should be implemented, AI-generated voices are far more dangerous than fake songs gaining traction online. For example, as recently seen in Arizona, scammers can mimic ordinary people's voices to make it appear they've been kidnapped and then demand ransom payments. If this is where AI technology is heading, the world needs to seriously consider the worst possible outcomes of its advancement when writing legislation.