Key Similarities and Differences: EU AI Act and US Executive Order Shape Global AI Governance


In October 2023, the White House released an Executive Order (EO) outlining a comprehensive strategy to support the development and deployment of safe and secure AI technologies, just days before the international AI Safety Summit in the UK. The European Commission, for its part, proposed the Regulation Laying Down Harmonised Rules on Artificial Intelligence (the EU AI Act) in April 2021, and the text is currently under negotiation. Both the EO and the AI Act play significant roles in shaping global AI governance and regulation. Let's examine the key similarities and differences between the two approaches.

Comparison in Approach:

The EU’s AI Act aims to establish a new regulation modeled on EU product-safety legislation, imposing detailed technical and organizational requirements on AI system providers and users. Providers of high-risk AI systems would bear the most obligations, covering areas such as data governance, training, testing and validation, conformity assessments, risk management systems, and post-market monitoring. The Act also prohibits certain uses of AI systems and enforces transparency obligations.

In contrast, the EO does not introduce new legislative obligations. Instead, it provides directions to government agencies, including instructing the Department of Commerce to develop rules requiring disclosures from companies involved in AI model development or infrastructure under specific circumstances. The EO has a broader scope, encompassing social issues like equity, civil rights, and worker protection. Additionally, it directs the State Department to lead international efforts in establishing AI governance frameworks.

Another noteworthy difference lies in enforcement. The proposed AI Act incorporates a complex oversight and enforcement regime, with fines of up to EUR 30 million or up to 6% of global annual turnover, whichever is higher, depending on the violation. The EO, by contrast, does not include enforcement provisions.


Areas of Common Ground:

Both the AI Act and the EO focus on high-risk AI systems. The AI Act categorizes such systems based on risk and imposes significant compliance requirements. These requirements entail designing AI systems to enable record-keeping, facilitating human oversight, and ensuring an appropriate level of accuracy, robustness, and cybersecurity. The EU Parliament’s version of the AI Act proposes additional obligations for foundation models, defined as AI models trained on broad data for general output adaptable to various tasks.

Similarly, the EO concentrates on high-risk AI systems, requiring developers to share safety test results and other critical information related to dual-use foundation models that pose serious security risks. The red-teaming and reporting requirements apply to models meeting certain technical thresholds outlined in the EO.

Both the AI Act and the EO address transparency requirements. The AI Act requires that AI systems designed to interact with individuals be identifiable as such, and that users of AI systems involved in emotion recognition, biometric categorization, or generating manipulated content like deepfakes inform the people exposed to those systems. The EO tackles transparency by requiring a report identifying standards, tools, methods, and practices for authenticating, labeling, detecting, and preventing synthetic content. Guidance on labeling and authenticating synthetic content will then be issued by the Director of the Office of Management and Budget (OMB) following the report.

Moreover, both the AI Act and the EO emphasize the importance of standards. The AI Act promotes the development of harmonized technical standards for AI systems, along with the creation of AI regulatory sandboxes to encourage compliance within a controlled environment. The EO tasks the U.S. National Institute of Standards and Technology (NIST) with issuing guidelines for AI development, with an aim of achieving consensus with industry standards. The EO also directs the Secretary of State to devise a plan for global engagement in developing AI standards.


In conclusion, while there are notable differences between the EU AI Act and the US Executive Order, particularly in approach and enforcement mechanisms, they share common ground on high-risk AI systems and transparency requirements. Both recognize the need for standards and emphasize their importance in shaping the future of AI governance. Collaboration between the US and EU in this domain is further facilitated by the U.S.-EU Trade and Technology Council's joint Roadmap for Trustworthy AI and Risk Management, which aims to advance collaborative approaches in international AI standards bodies.

Sources:

– EU AI Act: [Link to the blog post]
– US Executive Order: [Link to the blog post]
– U.S.-EU Trade and Technology Council’s joint Roadmap for Trustworthy AI and Risk Management of December 2022: [Link to the blog post]

Frequently Asked Questions (FAQs)

What is the EU AI Act?

The EU AI Act is a proposed regulation by the European Commission that aims to establish harmonized rules on artificial intelligence. It imposes obligations on AI system providers and users, especially those dealing with high-risk AI systems, covering areas such as data governance, training, testing, and more.

What is the US Executive Order mentioned in the article?

The US Executive Order is a directive released by the White House that outlines a strategy for supporting the development and deployment of safe and secure AI technologies. It provides directions to government agencies, including instructing the Department of Commerce to develop disclosure rules for companies involved in AI model development or infrastructure.

How do the EU AI Act and US Executive Order approach AI governance differently?

The EU AI Act introduces new legislative obligations, while the US Executive Order does not. The AI Act imposes detailed technical and organizational requirements on AI system providers, whereas the EO provides directions to government agencies. The EO also has a broader scope, including social issues such as equity and civil rights.

How is enforcement handled in the EU AI Act and US Executive Order?

The proposed AI Act incorporates a complex oversight and enforcement regime, with potential penalties for violations. On the other hand, the EO does not include specific enforcement provisions.

What common ground is there between the EU AI Act and US Executive Order?

Both the AI Act and the EO focus on high-risk AI systems and impose compliance requirements on developers. They also address transparency requirements related to AI systems. Additionally, both initiatives emphasize the importance of standards in AI development and governance.

How do the EU and US initiatives plan to collaborate on AI governance?

Collaboration is facilitated through efforts such as the U.S.-EU Trade and Technology Council's joint Roadmap for Trustworthy AI and Risk Management. This roadmap aims to advance collaborative approaches in international AI standards bodies and promote cooperation between the US and EU in shaping the future of AI governance.

Please note that the FAQs provided on this page are based on the news article published. While we strive to provide accurate and up-to-date information, it is always recommended to consult relevant authorities or professionals before making any decisions or taking action based on the FAQs or the news article.
