Making ChatGPT Cheaper: Microsoft Aiming to Cut Costs by Over $700,000 Daily

Running an advanced AI service like ChatGPT is an extremely costly venture for OpenAI. According to Dylan Patel, an analyst at SemiAnalysis, ChatGPT could cost at least $700,000 per day to operate. Patel noted that ChatGPT requires immense computing power on expensive servers to answer queries. His estimate is based on OpenAI's GPT-3 model; if GPT-4 were used, the costs would be even higher.
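As a rough back-of-envelope check on the scale of that figure, the $700,000-per-day estimate annualizes to roughly a quarter of a billion dollars (the daily figure is from the article; the annualization is a simple illustration, not a reported number):

```python
# Back-of-envelope annualization of the SemiAnalysis estimate.
daily_cost = 700_000            # USD per day, per Dylan Patel's estimate
annual_cost = daily_cost * 365  # straight-line extrapolation over a year

print(f"${annual_cost:,} per year")  # $255,500,000 per year
```

This assumes the daily cost stays flat, which it would not if usage grows or GPT-4 replaces GPT-3, but it illustrates why cheaper custom silicon is attractive.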

In an effort to reduce the cost of running GPT models, Microsoft is reportedly developing a secret AI chip codenamed Athena. The chip is intended for internal use by Microsoft and OpenAI and could be released as early as next year. Additional reports suggest that around 300 people are working on the project, which began back in 2019. The idea is to give Microsoft and OpenAI access to their own AI chips at a lower cost than buying GPUs.

AI is still a relatively young field, and the surge in demand for AI solutions has led many companies to invest heavily in the underlying technology. For example, the startup Latitude was reportedly spending $200,000 a month on its AI model and Amazon Web Services servers. To rein in costs, Latitude switched to language software provided by AI21 Labs, which cut the bill to $100,000 a month.

OpenAI is a San Francisco-based research laboratory focused on developing human-level artificial intelligence. It was founded in 2015 by Elon Musk, Sam Altman, and other entrepreneurs and researchers, with the goal of developing and promoting artificial general intelligence (AGI) that benefits humanity. OpenAI has received investments from a range of venture capital firms and industry leaders, most notably Microsoft.

Dylan Patel is chief analyst at the semiconductor research firm SemiAnalysis. His initial cost estimates are based on GPT-3, and he notes that GPT-4's costs could be even higher. Patel and his SemiAnalysis colleague Afzal Ahmad have pointed out that the operational, or inference, costs of running ChatGPT far exceed its training costs.
