- California’s new laws protect actors from unauthorised AI use of their voice and likeness, and require contract transparency.
- A second law prohibits using digital replicas of deceased performers without consent from their estates, addressing AI-driven ethical concerns.
OUR TAKE
California’s new laws protect performers by requiring contracts for AI-generated replicas and banning the use of deceased actors’ likenesses without estate consent. This move addresses growing concerns over AI misuse, including deepfakes, fraud, and democratic disruption. While federal regulation lags, state-level protections are crucial.
–Jasmine Zhang, BTW reporter
What happened
California Governor Gavin Newsom has signed two new laws designed to safeguard actors and performers from the unauthorised use of their digital replicas through AI.
One bill mandates that contracts specify the use of AI-generated replicas of a performer’s voice or likeness, ensuring performers are professionally represented during negotiations. The second bill prohibits the commercial use of digital replicas of deceased performers without consent from their estates.
These measures come amidst growing concerns over the implications of AI, including potential job loss and its role in fraud and democratic disruption. The Biden administration has been advocating for AI regulation, but progress has been slow due to a divided Congress.
Similarly, Tennessee Governor Bill Lee signed a bill earlier this year aimed at protecting artists from unauthorised AI use.
Also read: California approves legislation to tighten AI regulation
Also read: Elon Musk backs California bill to regulate AI
Why it’s important
These new laws are a significant step toward protecting actors and performers from the unauthorised use of their digital replicas. As AI technologies have advanced, concerns about their misuse have grown.
A well-known example is a 2019 video of Nancy Pelosi that was digitally slowed and altered to make her appear to slur her words and speak incoherently; the manipulated clip spread widely on social media before being debunked. AI-generated impersonations have since been used in scams targeting public figures and their business dealings.
These new laws address such issues by requiring contracts to specify the use of AI-generated replicas and banning the commercial use of deceased performers’ likenesses without estate consent. This not only protects performers’ careers but also safeguards their legacy.
However, AI’s threats extend beyond the entertainment industry, with risks including fraud, misleading voters, and disrupting democracy. Despite the Biden administration’s push for AI regulation, progress has been slow, making state-level protections crucial.