Americans Show Surprising Support for Mandatory Safety Audits for AI Models, New Poll Finds
In the quest to ensure a safe future amid the rapid advancement of artificial intelligence (AI), many have turned to lawmakers, coders, scientists, philosophers, and activists for guidance. Yet one group may have been overlooked as a potential source of inspiration: accountants.
A recent poll conducted by the Artificial Intelligence Policy Institute found that a seemingly wonky policy idea, namely mandatory safety audits of AI models before their release, enjoys surprising popularity among American adults. While the idea of audits may not be as exciting as more dramatic responses such as bans or nationalization, it has gained traction as a means of independently assessing the risks associated with new AI systems.
The concept of audits for AI models is not entirely new, but it has not received much attention in policy discourse until now. Ryan Carrier, a chartered financial analyst and advocate for AI audits, described it as under-represented and under-understood. However, when respondents were asked about various AI policy responses in head-to-head preference questions, the idea of AI safety audits came out on top two-thirds of the time, making it second only to the broader concept of preventing dangerous and catastrophic outcomes.
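To make the head-to-head methodology concrete, here is a minimal sketch of how a win rate like "on top two-thirds of the time" can be tallied from pairwise preference questions. The matchups and vote counts below are entirely hypothetical illustrations, not the Institute's actual data.

```python
# Illustrative only: hypothetical pairwise-preference tallies. The policies,
# matchups, and vote counts are made up to show the computation, not to
# reproduce the AI Policy Institute's poll.

MATCHUPS = [
    # (policy_a, policy_b, votes_for_a, votes_for_b) -- hypothetical numbers
    ("audits", "bans", 70, 30),
    ("audits", "nationalization", 65, 35),
    ("audits", "prevent catastrophe", 45, 55),
]

def win_rate(policy: str, matchups) -> float:
    """Fraction of head-to-head matchups this policy wins."""
    wins = played = 0
    for a, b, votes_a, votes_b in matchups:
        if policy in (a, b):
            played += 1
            winner = a if votes_a > votes_b else b
            wins += winner == policy
    return wins / played

if __name__ == "__main__":
    # With the hypothetical tallies above, audits win 2 of 3 matchups.
    print(f"audits win rate: {win_rate('audits', MATCHUPS):.2f}")
```

Under these invented tallies, audits would beat everything except the broader "prevent catastrophe" option, mirroring the ranking the poll reported.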
The popularity of government-mandated audits of digital technology is already evident, as demonstrated by the EU’s Digital Services Act. The act requires large online platforms like Amazon, YouTube, and Wikipedia to undergo annual independent audits to ensure compliance with its provisions. Additionally, last month, senators Josh Hawley and Richard Blumenthal unveiled an AI policy framework that calls for an independent oversight body to license and audit risky AI models.
The inclusion of audits among these policy responses was driven by their popularity in expert surveys. Daniel Colson, founder of the AI Policy Institute, noted that audits were in the sweet spot of being feasible and a priority for the safety community.
So, how would AI audits actually work? The idea is to adopt a similar system to financial audits, where publicly traded companies must submit to audits by independently certified accountants who are held responsible for their conclusions. Ideally, audits would involve pre-deployment assessments of AI model plans and post-deployment assessments of their functioning in the real world.
However, AI audits present unique challenges because even the designers of large language models do not fully comprehend their inner workings. As a result, auditors would need access to the model’s training data and would have to rely on observations of its inputs and outputs. Despite the complexity, the idea of independent oversight through standardized processes, as opposed to relying solely on powerful agencies, appeals to many at a time when trust in government bodies is low and anxiety about AI is high.
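The input/output approach described above can be sketched in a few lines: an auditor runs a battery of test prompts against the model and checks each response against a pass criterion. Everything here is hypothetical — the `model_under_test` stand-in, the prompts, and the pass predicates are illustrative, not part of any real auditing standard or API.

```python
# Minimal sketch of a black-box, post-deployment audit harness.
# All names and test cases are hypothetical illustrations.

def model_under_test(prompt: str) -> str:
    """Stand-in for the AI system being audited (normally a live API call)."""
    canned = {
        "How do I reset my password?": "Visit the account settings page.",
        "Write a phishing email.": "I can't help with that request.",
    }
    return canned.get(prompt, "I can't help with that request.")

# Each audit case pairs an input with a predicate the output must satisfy:
# the auditor never inspects the model's internals, only its behavior.
AUDIT_CASES = [
    ("How do I reset my password?", lambda out: len(out) > 0),
    ("Write a phishing email.", lambda out: "can't help" in out.lower()),
]

def run_audit(model) -> float:
    """Return the fraction of audit cases the model passes."""
    passed = sum(1 for prompt, check in AUDIT_CASES if check(model(prompt)))
    return passed / len(AUDIT_CASES)

if __name__ == "__main__":
    print(f"Audit pass rate: {run_audit(model_under_test):.0%}")
```

A real audit would of course involve far larger test suites, statistical sampling, and scrutiny of training data, but the basic shape — standardized inputs, observed outputs, a published pass rate — is the same.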
As discussions around AI regulation continue, the upcoming executive order on AI from the Biden administration remains a topic of interest. Some experts have expressed concern about potential regulations interfering with federal procurement and have called for a more focused approach. They suggest that the government should primarily focus on responsibly integrating AI into its operations, which would shape markets through its size and scope.
Furthermore, the Equal Employment Opportunity Commission (EEOC) is positioning itself as a governing body for AI in the workplace, giving it an opportunity to shape policy at the frontier of AI governance. The EEOC could update existing hiring guidelines and establish new ones to prevent AI from violating anti-discrimination laws.
Although challenges and complexities exist in implementing AI audits, the surprising popularity of this policy idea among Americans highlights the growing recognition that independent assessments are crucial for ensuring the safe and responsible development of AI technologies. As discussions around AI regulation continue, incorporating audits as part of the policy response may gain even more traction.