SAG-AFTRA, OpenAI, actor Bryan Cranston, United Talent Agency, Creative Artists Agency, and the Association of Talent Agents have issued a joint statement announcing their collaboration to strengthen voice and likeness protections in OpenAI’s Sora 2 product.
The issue arose when Sora 2 generated Cranston’s voice and likeness without his consent or compensation during an invite-only release two weeks ago. Although Sora 2 launched with a policy requiring opt-in for such uses, OpenAI acknowledged that unintentional generations occurred and expressed regret. In response, the company has strengthened its safeguards against replicating individuals’ voices and likenesses without explicit permission.
Cranston commented on the situation: “I was deeply concerned not just for myself, but for all performers whose work and identity can be misused in this way. I am grateful to OpenAI for its policy and for improving its guardrails, and hope that they and all of the companies involved in this work, respect our personal and professional right to manage replication of our voice and likeness.”
OpenAI reiterated its opt-in policy for using individuals’ voices or likenesses in Sora 2, stating that artists, performers, and others will control how they are simulated by generative technology. This approach is intended to support artists’ rights and ethical practices, and the company committed to responding quickly to any related complaints.
The updated framework aligns with principles outlined in the NO FAKES Act, proposed federal legislation designed to protect performers from unauthorized digital replication. All parties involved—OpenAI, SAG-AFTRA, Bryan Cranston with his representatives at United Talent Agency, the Association of Talent Agents, and Creative Artists Agency—have voiced strong support for this bill. They believe establishing a national standard requiring consent and compensation is necessary for a sustainable creative ecosystem.
SAG-AFTRA President Sean Astin stated: “Bryan Cranston is one of countless performers whose voice and likeness are in danger of massive misappropriation by replication technology. Bryan did the right thing by communicating with his union and his professional representatives to have the matter addressed. This particular case has a positive resolution. I’m glad that OpenAI has committed to using an opt-in protocol, where all artists have the ability to choose whether they wish to participate in the exploitation of their voice and likeness using A.I. This policy must be durable and I thank all of the stakeholders, including OpenAI for working together to have the appropriate protections enshrined in law. Simply put, opt-in protocols are the only way to do business and the NO FAKES Act will make us safer.”
Sam Altman, CEO of OpenAI added: “OpenAI is deeply committed to protecting performers from the misappropriation of their voice and likeness. We were an early supporter of the NO FAKES Act when it was introduced last year, and will always stand behind the rights of performers.”
SAG-AFTRA represents around 160,000 entertainment professionals across various roles including actors, announcers, broadcast journalists, dancers, DJs, writers, editors, hosts, puppeteers, recording artists, singers, stunt performers and voiceover artists nationwide.