


Monetising mimicking music: AI and deepfakes

With the increasing popularity of “deepfakes” created by AI and the potential revenue this new technology could generate, it was only a matter of time before the industry heavyweights looked to monetise these tools.

A “deepfake”, in a nutshell, is media created using AI that convincingly mimics a person’s voice, image and/or likeness. An example is the TV programme “Deep Fake Neighbour Wars”, which created deepfakes of various celebrities, including “showing” Kim Kardashian and Idris Elba arguing over a shared garden. Whilst obviously untrue in this scenario, the AI-created show is nevertheless extremely convincing. As you can imagine, deepfakes therefore raise several legal issues such as defamation, data protection and, of course, intellectual property – particularly because the content is convincing enough that it is feasible it did in fact originate with the personality in question.

Focusing on the intellectual property aspect, which is primarily where the commercialisation, and therefore the revenue, comes from, there are some interesting considerations. Without going into the intricacies of ownership of AI-created works here, it is worth keeping in mind that ownership is not particularly straightforward and remains an ever-evolving area of law.

There are always reasons to protect your intellectual property, whether financial (for example, so you can license the IP for a fee) or artistic (to avoid “rip-offs” which may damage your brand), to name a couple of motivations. This new wave of AI could help to create new revenue streams; however, it could also compromise artists’ identities or integrity, thereby seemingly putting two of the key motivations for protecting IP at odds with each other.

One of the first (or perhaps most newsworthy) musicians to monetise deepfakes in this way was the Canadian electronic artist Grimes. Grimes has allowed people to use her voice in AI-created songs in exchange for 50% of the royalties and a credit on any music produced. It seems to be a win-win for Grimes, who gains a new revenue stream without having to create any new works herself. However, it raises the question – what happens if a song created using these AI tools is one she does not like and does not want attributed to or associated with her?

The question of artistic identity and integrity will inevitably be an issue for the individual artist in question, and perhaps less of a concern to the labels, who would be monetising these AI-generated works. The early-stage discussions between Google and Universal Music indicate that artists would have the choice to opt in, but this does not address the issue of music created “legitimately” on these platforms that is at odds with an artist’s individual tastes. It may be that certain artists will still require some control over what is ultimately released (which could be addressed in the licensing agreement), but this inevitably raises challenges in defining the boundaries of the freedom granted.

It will be interesting to see how these talks progress and how many artists sign up to these AI monetisation initiatives.

Google and Universal Music are in talks to license artists’ melodies and voices for songs generated by artificial intelligence as the music business tries to monetise one of its biggest threats.

Tags

intellectual property, technology, artificial intelligence