SACRAMENTO, Calif. (CN) – With the 2020 election cycle picking up steam, a California lawmaker is seeking to protect candidates from fake social media videos he believes have the potential to change election outcomes.
Known as deepfakes, the videos and images are edited to pass off fictitious events or scenes as real ones. The deceptive technology came to the forefront during the 2016 presidential election, and more recently a doctored video that made House Speaker Nancy Pelosi appear drunk during a speech was shared millions of times on social media.
In a move tinged with free speech implications, Assemblyman Marc Berman wants to give candidates the ability to sue individuals and organizations that share deepfakes without warning labels near Election Day.
“As more and more bad actors try to influence our elections with misinformation campaigns that sow confusion and doubt throughout the electorate, I think we can all agree with the premise that voters have a right to know when video, audio and images that they are being shown have been manipulated,” Berman says of Assembly Bill 730.
Berman, D-Palo Alto, is hoping to quash political deepfakes by barring a person or entity from “knowingly or recklessly” distributing deceptive media of a candidate within 60 days of an election, and by creating a civil cause of action for affected candidates.
Under AB 730, courts could order violators to pay candidates damages and attorney’s fees.
On Tuesday, the Senate Elections and Constitutional Amendments Committee cleared the bill during its first legislative hearing, despite claims by media groups that the measure is unconstitutional in its current form.
“By passing this bill, you put your broadcasters in extreme jeopardy,” warned Mark Powers, vice president of the California Broadcasters Association.
Powers and other critics, including the California News Publishers Association, argued AB 730 is not narrowly tailored and would be impossible for radio and television broadcasters to comply with. They claim AB 730 could have the unintended effect of chilling free speech by scaring the media and individual voters from sharing even legitimate political ads.
“If there is no 100% accurate way to detect deepfakes and broadcasters are unable legally to refuse the ones that they suspect might be, your [local] stations may be forced to suspend taking any local or state candidate ads,” Powers told the five-member committee.
The publishers association, which lobbies on behalf of over 500 newspapers, called AB 730 well-intentioned but vague and unworkable. Staff attorney Whitney Prout noted there are already well-established defamation laws intended to ward against fake political advertisements.
“It’s ultimately an ineffective and frankly unconstitutional solution that causes more problems than it solves,” Prout testified. “This bill is a content-based regulation of speech, and as such it must be narrowly tailored to serve a compelling governmental interest.”
The publishers association contends AB 730 is also flawed because it doesn’t guarantee that a candidate’s claims will even be settled before Election Day.
“Allowing a candidate to recover a large amount of damages after an election does not make elections fairer,” the group said in an opposition letter.
Erinn Ryberg, legislative counsel for the California Judges Association, said her group hasn’t taken a position on AB 730 but hinted judges are worried about having to fit deepfake lawsuits into court calendars around election dates.
Berman’s bill contains language giving deepfake matters priority over other civil matters.
The American Civil Liberties Union of Northern California also testified against AB 730, while no groups appeared in support.
Berman introduced the bill on June 24 by amending a placeholder bill in a process known as “gut and amend.” The old version had already passed the Assembly, but the amended bill faces an uphill battle, with a tougher test next in the Senate Judiciary Committee.
To address the media groups’ concerns, Berman has built in a clause that would still allow them to air deepfakes if they include a disclosure that the content was manipulated. He said he’s open to further amendments but was adamant about the bill’s main purpose: curbing the spread of high-quality deepfakes.
“I understand that there are significant First Amendment concerns with the bill as it is currently drafted, and I’m committed to working through these issues,” Berman said. “I would note however, that I haven’t seen a court determine that the First Amendment grants someone the right to literally put their words into my mouth, which is what this technology does.”