ChatGPT, an AI-driven chatbot developed by OpenAI, has gained enormous traction since its launch. But that popularity comes with a hefty price tag: the service reportedly costs $700,000 per day to run. Costs like these have prompted companies such as Microsoft to explore building their own AI chips to reduce the expense of running AI services.
Microsoft’s own AI chip, codenamed “Athena”, is reportedly mature enough for internal testing. The chip is expected to be deployed across Azure AI services by next year, allowing users to benefit from lower costs. Industry forecasts suggest that OpenAI might need as many as 30,000 NVIDIA GPUs this year to keep its services running, so Microsoft’s in-house chips could help offset that expense.
Although Microsoft is not looking to replace NVIDIA altogether, the move can be seen as a way of reducing dependency on third-party vendors. Lower hardware costs could make ChatGPT cheaper to operate, along with other AI services running on Azure. The news caps Microsoft’s aggressive push into AI products and services, which comes as Google’s rival chatbot, Bard, has received a lukewarm reception. Reports claim that Google employees, speaking confidentially, have lost confidence in Bard, feeling the product was rushed to market in response to Microsoft’s own AI efforts.
The chatbot industry is growing and evolving rapidly, and Microsoft’s own AI chip could help secure its competitive edge in the race. Dylan Patel, Chief Analyst at research firm SemiAnalysis, is optimistic about the chip, saying it will “provide the needed boost to Microsoft in this competitive environment.” With an experienced and well-backed team, Microsoft is clearly looking to push its AI development further.