Oregon Lawmakers Consider Regulating Use of AI in Campaign Ads
In anticipation of this year’s elections, Oregon lawmakers are considering a bipartisan bill that would require campaigns to disclose the use of artificial intelligence (AI) in their materials. Senate Bill 1571 aims to address the growing concern around deepfake technology, which can manipulate images and voices to create false and misleading content. If passed, the bill would require campaign materials, including physical flyers and online videos, to state clearly whether AI was used to depict a person’s voice or image.
State Senator Aaron Woods, the bill’s sponsor, emphasizes the importance of keeping pace with technological advancements. He believes the measure will build awareness and promote transparency among political campaigns. Although Oregon has not seen any high-profile instances of AI-generated content in political communications, the increasing accuracy and accessibility of the technology have raised concerns nationwide. Recent examples include robocalls featuring a faked voice of President Joe Biden and a television ad using AI to mimic the voice of former President Donald Trump. In response, the Federal Communications Commission has banned the use of AI-generated voices in robocalls.
The proposed bill in Oregon has garnered support from legislators across party lines. It defines synthetic media as images, audio recordings, or videos that have been intentionally manipulated using AI techniques or similar digital technology to create a false impression of events. Campaigns using such material would be obligated to disclose its use, with potential violations resulting in a lawsuit from the Oregon Secretary of State and a maximum fine of $10,000. Media organizations reporting on campaign ads featuring AI would be exempt from these regulations.
Unlike other states with more stringent regulations, the Oregon bill does not specify the exact format or size of the disclosure. Senator Woods suggests those details can be addressed in future legislative sessions, and he notes that the proposal has bipartisan and bicameral support. Secretary of State LaVonne Griffin-Valade, Oregon’s top election official, has also voiced support for the bill.
Public Citizen, a national advocacy group, has been pushing for stricter AI-disclosure rules in states across the country. The group argues for requirements such as printing disclosures in a font as large as the largest text in a print communication and displaying them for the entire duration of a video ad. More than three dozen states have introduced or passed bills aimed at banning deepfake videos in politics or mandating disclosure when AI is used.
As AI technology advances and becomes more prevalent in political campaigns, Oregon lawmakers are taking proactive steps to regulate its use and promote transparency. The proposed bill seeks to ensure that voters know when AI has been deployed to depict a person’s voice or image. While it does not prescribe specific disclosure formats, it has garnered bipartisan support and may serve as a model for other states grappling with deepfake technology in elections.