Industrial news

Scotrail voice clone is “exploitative” behaviour, says Equity

Scotrail is called on to remove its AI announcement voice after the artist whose voice it is based on described its use as a "violation".

Photo credit: jax10289 / Shutterstock.com

Scotrail is being called on to remove the voice of its controversial new AI train announcement system following reports that the artist whose voice it is based on did not give permission for its use.

Equity member and voiceover artist Gayanne Potter, whose voice provided the clone for Scotrail’s announcement system, dubbed ‘Iona’, has described the situation as “a violation” and has called for stronger legislation in the sector.

Gayanne signed an agreement to provide voiceover work for ReadSpeaker, a Swedish company that specialises in text-to-speech technology, in 2021. Gayanne says she was advised at the time of engagement that her voice would be used by ReadSpeaker for non-commercial purposes to help visually impaired readers access educational content, and would not be forward sold without prior written consent. Since 2023, Gayanne has requested multiple times that ReadSpeaker remove her voice from its platform, which now provides AI clone voices to third-party companies. Scotrail is among the latest companies to have purchased Gayanne’s voice.


Equity has called ReadSpeaker’s behaviour “exploitative” and says that without explicit consent, the practice “represents an infringement of our members’ data protection and other rights.” 

A growing number of artists who delivered voiceover work in the past have subsequently found AI-generated versions of their voice listed on platforms without their full and informed consent.

Gayanne Potter said:

“When I engaged with ReadSpeaker, both I and my agent were assured by them that it was not for commercial content and would not be forward sold without prior written consent. Now to find it has been bought by a third party to be used on a national rail network without my knowledge or my consent is a violation.

“In 2021 AI was not what it is now. You cannot consent to something that doesn't exist. 

“I’ve been in dispute with ReadSpeaker since 2023, and have requested several times that they remove the AI voice model 'Iona' from their website along with my voice data that it is using. They refuse.  

“Legislation must be brought in to tighten controls. We should all be protected, not just as performers but as human beings.” 

Liam Budd, Equity Industrial Official for Recorded Media, said:   

“It is extremely exploitative for companies to use voice recordings to create commercial digital replicas of artists from contracts that pre-date the development of generative AI or were not drafted explicitly for this purpose.

“Gayanne is directly competing in a marketplace with a low-quality clone of her own voice that she claims was developed without her informed and explicit consent. Not only is this distressing for her, but it would represent an infringement of our members’ data protection and other rights.  

“Sadly, we have heard from numerous performers who have lost control over their voice or likeness, and had their privacy and likeness hijacked through the misuse of AI. Such misuse is an attack on our members’ fundamental rights. It is hardly surprising that most performers are currently pessimistic about the growing impact of AI on their performing work.  

“The union continues to call on the Government for legal certainty around the use of historic contracts for AI purposes and greater enforcement of existing GDPR laws, which give our members much needed protections but are currently being ignored. We also urge the entertainment industry to work constructively with the union to establish ethical practices for AI and ensure that proper remuneration, consent, and transparency are the norm in performer contracts.”

Earlier this year, Equity published an Open Letter calling on entertainment industry bosses to comply with performers’ property rights and applicable data protection laws in relation to historic contracts.

Equity has consistently invited engagers across the audio and audio-visual industries to work constructively with the union to establish ethical practices in performer contracts. Equity’s AI Vision Statement outlines eight principles for the industry to adopt when engaging artists for the purpose of performance cloning. These include consent (and the right to withhold consent) for past, current and future performances.

Are you an audio artist who believes your rights have been infringed? Please contact Shannon Sailing, Industrial Official, Audio and Video Games, at audio@equity.org.uk

 

