This week at the Build 2023 conference, Microsoft CEO Satya Nadella announced Windows Copilot, an AI-powered feature. The tool draws on technology from OpenAI, including advancements such as ChatGPT, to improve Bing search and enhance the Windows experience. In response to the proliferation of advanced artificial intelligence (AI) tools, Microsoft published a 40-page report titled “Governing AI: A Blueprint for the Future.” The report outlines Microsoft’s approach to developing AI responsibly and advocates for government-backed regulations that protect public safety while limiting fraudulent uses of AI.
In a blog post on Monday, OpenAI CEO Sam Altman applauded Microsoft’s approach and discussed the need for community-driven international standards to regulate AI advancements. Microsoft responded to Altman’s statement with its report, which called for the government to spearhead an AI safety framework. The document urged society to “acknowledge the simple truth that not all actors are well intentioned or well-equipped to address the challenges that highly capable models present.”
Prior to this call for responsibility, the Future of Life Institute, a non-profit organization, had issued a cautionary open letter signed by prominent figures such as Tesla CEO Elon Musk and Apple co-founder Steve Wozniak. The signers demanded a pause in the development of AI technologies more advanced than OpenAI’s GPT-4. Tech giants such as Google, Adobe, and Meta responded to the call, fueling the conversation even as AI-infused tools continued to launch at a rapid pace.
Microsoft is one of the leading tech industry forces driving AI, encouraging regulation that prioritizes precautionary measures while recognizing the potential upside of responsibly deployed AI systems. The company encourages governments and society to build frameworks that protect public safety and ensure that AI applications are used for good. Only then, it argues, can AI reach its full potential.