Making ChatGPT Cheaper: Microsoft Aiming to Cut Costs by Over $700,000 Daily

Running an advanced AI service like ChatGPT could prove extremely costly for OpenAI. According to Dylan Patel, an analyst at SemiAnalysis, ChatGPT could cost at least $700,000 per day to operate, because answering queries requires immense computing power on expensive servers. That estimate is based on OpenAI's GPT-3 model; if GPT-4 were used, the costs would be even higher.
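Patel's figure can be read as simple arithmetic: total daily cost is roughly query volume multiplied by the compute cost per query. A minimal sketch of that estimate, using hypothetical values for query volume and per-query cost (chosen only to land near the reported total, not figures from the article):

```python
def estimated_daily_cost(queries_per_day: float, cost_per_query: float) -> float:
    """Back-of-envelope daily serving cost in dollars:
    query volume times compute cost per query."""
    return queries_per_day * cost_per_query

# Hypothetical assumptions: ~200 million queries/day at ~$0.0035 of
# compute per query. These inputs are illustrative, not sourced.
cost = estimated_daily_cost(200e6, 0.0035)
print(f"${cost:,.0f} per day")
```

Under these assumed inputs the sketch lands near the reported $700,000-per-day figure; the real driver in either direction is the per-query compute cost, which depends on model size and hardware.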

In an effort to reduce the cost of running GPT technology, Microsoft is reportedly working on a secret AI chip codenamed Athena. Athena is set to be used internally by Microsoft and OpenAI and could be released as early as next year. Reports suggest that around 300 people are currently working on the project, which began in 2019. The idea behind the project is to give Microsoft and OpenAI access to their own AI chips at a lower cost than buying GPUs.

AI is still a relatively new field, and the surge in demand for AI solutions has led many companies to invest heavily in the underlying technology. For example, the startup Latitude was reportedly spending $200,000 a month on AI software and Amazon Web Services servers. To tackle the issue, Latitude switched to language software from AI21 Labs, which cut the bill to $100,000 a month.

The enterprise mentioned in this article is OpenAI, a San Francisco-based research laboratory focused on developing human-level artificial intelligence. It was founded in 2015 by Elon Musk, Sam Altman, and other entrepreneurs and researchers. Its stated goal is to develop and promote artificial general intelligence (AGI) for the benefit of humanity. OpenAI has received investment from venture capital firms and industry leaders, most notably Microsoft.


The person mentioned in this article is Dylan Patel, chief analyst at the semiconductor research firm SemiAnalysis, who has provided insight into the costs of running OpenAI's GPT-3 and GPT-4 models. His initial estimates are based on GPT-3, and he notes that GPT-4's costs could be even higher. Patel and his colleague at SemiAnalysis, Afzal Ahmad, have also pointed out that the operational costs, or inference costs, of running ChatGPT far exceed its training costs.

