An AI ethics row that is really about IP

Future News 109

In a classic case of journalese, a row has been ‘sparked’ and in some media outlets it has also ‘erupted’ because of director Morgan Neville’s new documentary about the late travel and food broadcaster Anthony Bourdain. 

Neville is apparently facing a ‘backlash’ because people are pissed off that he hired a machine learning company to produce a synthetic version of the celebrity chef’s voice to narrate parts of his Roadrunner film. 

Having taken his own life in France around three years ago, Bourdain wasn’t around to give Neville permission to make such a move. But in Neville’s defence, he did allegedly run the idea past Bourdain’s ex-literary agency and widow, who has since denied giving her blessing. He also used the technique in a limited way (the deepfake voice narrates three quotes) that is arguably elegant: artistically, the lines delivered by AI Bourdain fit the film better than a random actor would. 

Some people feel misled and uneasy about the technique. This criticism is based more on naivety than on a grasp of ethics: documentaries are still a form of film, which relies heavily on narrative and therefore often omits large amounts of information to fit into a digestible audio-visual package. 

The interesting questions about the AI audio technique are around intellectual property. Helpfully, Neville revealed in a promo interview with GQ how he built the model: 

“We fed more than ten hours of Tony’s voice into an AI model. The bigger the quantity, the better the result. We worked with four companies before settling on the best. We also had to figure out the best tone of Tony’s voice: His speaking voice versus his “narrator” voice, which itself changed dramatically over the years. The narrator voice got very performative and sing-songy in the No Reservations years.” 

These snippets were from audiobooks, podcasts, radio, and TV. It would presumably be impossible for the publishers of such media to tell whether their IP was used, but it does raise the question of whether they hold a licensing right over the AI model that was built from their material. 

Equally, Bourdain’s estate surely has a personality right over an AI model of his likeness, including his voice, which is being used in a commercial endeavour? It all depends on what jurisdiction you are operating in. Kelsey Farish, a solicitor specialising in technology at DAC Beachcroft, has offered some musings on the situation in the UK:

“...Copyright protections may likewise be difficult to assert, given the multitude of potential rights holders concerned and potential for fair dealing exemptions. It is also important to note that complications will likely arise as these rights and protections must be balanced against freedoms of expression…”

And in the US, the Brookings Institution published a paper on the issue in 2019:

“The legal landscape related to deepfakes is complex. Frameworks that can potentially be asserted to combat deepfakes include copyright, the right of publicity, section 43(a) of the Lanham Act, and the torts of defamation, false light, and intentional infliction of emotional distress. On the other side of the ledger are the protections conferred by the First Amendment and the “fair use” doctrine in copyright law, as well as (for social networking services and other web sites that host third-party content) section 230 of the Communications Decency Act (CDA). 

“It won’t be easy for courts to find the right balance. Rulings that confer overly broad protection to people targeted by deepfakes risk running afoul of the First Amendment and being struck down on appeal. Rulings that are insufficiently protective of deepfake targets could leave people without a mechanism to combat deepfakes that could be extraordinarily harmful. And attempts to weaken section 230 of the CDA in the name of addressing the threat posed by deepfakes would create a whole cascade of unintended and damaging consequences to the online ecosystem.”

It all seems a bit murky then, at least for now. As the technique becomes more popular, especially in legitimate yet contentious projects like Neville’s, the IP arguments will be formulated and tested. The ethics, perversely, seem more clear-cut: journalists and documentarians shouldn’t take the mickey. 

💼 Jobs and business

📧 Contact

For high praise, tips or gripes, please contact the editor at or via @ianjsilvera. Follow on LinkedIn here. 

Image: Future News, “TV Error” by Sibe Kokke is licensed under CC BY 2.0