
Assemblyman Herb Conaway (D-Moorestown).

Assembly panel advances deepfake disclosure bills

One measure would bar manipulated video, audio of candidates within 60 days of an election

By Nikita Biryukov, January 25, 2021, 4:42 pm

An Assembly panel on Monday advanced two bills that would require disclaimers on manipulated video and audio recordings and restrict their use against candidates within 60 days of an election.

“Deepfake videos can be used to influence voters to believe in untruths without them even knowing the content was manipulated,” said Assemblyman Andrew Zwicker (D-South Brunswick), who sponsored one of the measures. “These deceptive machine-learning, computer-generated videos, images or audio have no place in our democracy and that is why we are requiring that their use be disclosed.”

The bills are an effort to head off a rapidly evolving technology with the capacity to upend elections in the United States and across the globe.

Deepfakes, manipulated media created by deep machine learning, can show candidates and other notable individuals making statements they never really made. Put another way, they can put words in someone else’s mouth.

“Fake videos and other manipulated media content introduced during an election aims to erode public trust and faith in democracy,” said Assemblyman Herb Conaway (D-Moorestown). “Deepfakes have rapidly become a popular tool employed to undermine political candidates, and campaigns. Any altered content of a video, photo, or audio should always include disclosure of its manipulation, especially if there is a chance it could negatively influence an election.”

Conaway is also sponsoring Zwicker’s measure, which would allow candidates to seek injunctions against deepfakes appropriating their likeness.

The other bill, sponsored by Assemblywoman Pamela Lampitt (D-Cherry Hill), would require all deepfakes to be presented with disclaimers — as a printed message at the bottom of the screen for videos and as an oral warning on audio recordings.

“We’ve seen ‘deepfakes’ created to attack celebrities, they’ve misappropriated the images of women without their consent and, more recently, have been used in political campaigns to tarnish a candidate’s image,” she said. “These manipulated recordings can disrupt our election process with disinformation if not appropriately checked.”
