Note to radio air talent and production staff: using AI to impersonate celebrity voices, especially those of your station's core artists, may add a layer of creativity to promos and ads, but it can also come with a cost, namely a lawsuit.
“The advent of artificial intelligence poses interesting and often challenging legal issues because the law is still 'catching up' with the technology,” Belinda Scrimenti, partner at Wilkinson Barker Knauer LLP, says in David Oxenford's weekly Broadcast Law Blog. “Many existing legal concepts applied with traditional celebrity impersonation claims are already applicable to this kind of synthesized celebrity impersonation. Thus, if the use by a broadcaster of Taylor Swift’s voice (either taped and edited or impersonated by a human) would violate the right of publicity that is already found in the law of most states, the use of her AI voice would also violate these same rights.”
While the legality of an on-air celebrity impersonation, such as making Swift's voice say whatever you want it to, is therefore subject to state laws – meaning a legal claim would be based on where your station's signal is heard – defining any possible infringement depends on listeners being able to identify that celebrity. “In the case of an AI Taylor Swift voice, if the voice is identifiable as Ms. Swift, the use may constitute a violation, regardless of whether the broadcaster explicitly identifies the 'voice' as that of Taylor Swift,” Scrimenti says.
As it is, many celebrities already have legal protections in place. “Ms. Swift might have claims for copyright violations – for example, if a series of words or phrases were directly used and are protected by her copyrights – such as a significant portion of her song lyrics or a frequently used catchphrase,” Scrimenti says. “She could also allege trademark or unfair competition violations, for false endorsement or false association with her, [not to mention] many celebrities also have trademarks registered on or associated with their names.”
This raises the question: are there examples of “fair use” of a celebrity's voice, even if an air personality is controlling what he or she says? “What is found to be 'fair' can vary from court to court,” Scrimenti says. “In copyright, just because the bit in which the voice is used is funny does not mean that it necessarily is fair use, as that concept usually requires that the bit be making fun of the otherwise protected work itself, not that the copyrighted material is just used for the sake of comedy. While a broadcaster never wants to hear this, if you are considering such a use, consult your attorneys.”
One clear risk when it comes to celebrity voices is using them in local commercials to endorse a product or service. “Even if the celebrity is not explicitly identified in the spot, the use of their recognizable voice without permission may well give rise to a legal claim,” Scrimenti says.
Even as AI makes it easier to make the voices of recording acts and other celebrities popular with your audience a larger part of your station's programming, Scrimenti warns that “you should not treat AI-created celebrity 'voices' any differently than you would if a human impersonator had been doing the speaking, and in fact in some cases you may need to be more careful. Just because AI gives you the ability to make use of synthesized celebrities to enhance your programming does not mean that you should do so.”