In an effort to address growing concerns about artificial intelligence (AI), lawmakers have introduced new legislation aimed at curbing election misinformation and safeguarding Hollywood actors.
On Tuesday, bipartisan lawmakers came together to propose a bill that would prohibit political campaigns from using AI in ways that could mislead voters about their opponents' positions.
This move reflects an increasing urgency to address the potential for AI-generated misinformation, particularly concerning "deepfakes"—highly realistic but fabricated videos and images.
As technology evolves rapidly, Congress is grappling with how to regulate these advances effectively.
Experts are sounding alarms about the risk of AI tools flooding the electorate with false information, making it challenging for voters to distinguish fact from fiction.
The proposed legislation would grant the Federal Election Commission (FEC) authority to oversee the use of AI in elections, extending the role it has played for many years in policing political misrepresentation.
The FEC has already begun considering potential regulations in this area.
"At present, the FEC lacks the necessary authority to safeguard the integrity of our elections," noted Republican Representative Brian Fitzpatrick.
Fitzpatrick, alongside Representatives Adam Schiff, Derek Kilmer, and Lori Chavez-DeRemer, is backing the bill.
Although they acknowledge that the bill might face challenges passing this year, they remain hopeful it could be included in a must-pass bill during the final days of the congressional session.
Schiff views the bill as a crucial first step in addressing the threats posed by deepfakes and other forms of AI-generated misinformation.
In a related move, California Governor Gavin Newsom has signed new legislation designed to prevent the unauthorized use of AI in the entertainment industry, specifically to protect Hollywood actors.
"We're navigating new territory with AI and digital media transforming the entertainment sector, but our focus remains on safeguarding workers," Newsom stated.
"This law will help the industry thrive while enhancing protections for workers, including how their likenesses can be used."
This legislation follows last year’s Hollywood actors' strike, which highlighted concerns about AI potentially replacing human performers.
Starting in 2025, the new law will allow actors to exit contracts if the terms are vague about how studios can use AI to digitally clone their voices or images.
Supported by the California Labor Federation and the Screen Actors Guild-American Federation of Television and Radio Artists (SAG-AFTRA), this law is seen as a significant victory.
"Today is a landmark moment for SAG-AFTRA members and others," said SAG-AFTRA President Fran Drescher.
"The protections we fought for last year are now bolstered by California law, and this often sets a precedent for the rest of the nation!"