A new report has revealed that the popular AI chatbot ChatGPT costs an astounding $700,000 per day, or 36 cents per query, to operate. That is an enormous sum, especially given that the service reached roughly 100 million active users as recently as January.
To help offset the costs, OpenAI introduced a paid tier called ChatGPT Plus, which costs $20 per month, though it is unknown how successful this endeavor has been. Meanwhile, high demand and traffic continue to slow the chatbot down and cause server outages.
With Microsoft being one of its primary investors, OpenAI, the company behind ChatGPT, may be turning to the tech giant for help in bringing ChatGPT's operating costs down. Microsoft is reportedly already developing a proprietary AI chip, codenamed Athena, specifically for its Azure AI services. Currently, OpenAI relies on Nvidia GPUs to power ChatGPT, and industry experts expect it to require an additional 30,000 Nvidia GPUs over the course of 2023.
However, it is not yet clear when or how these AI chip developments will ultimately benefit ChatGPT. Nonetheless, the forthcoming chips are expected to reduce the amount of hardware needed to run ChatGPT, potentially lowering its considerable cost.
The figures come from Dylan Patel, chief analyst at the research firm SemiAnalysis, whose estimate puts ChatGPT's running costs at $700,000 per day, or 36 cents per query.
OpenAI, the San Francisco-based research company that created and developed ChatGPT, was founded by Elon Musk, Greg Brockman, Ilya Sutskever, and other innovators in the AI and machine learning fields. The team at OpenAI continues to develop ChatGPT, although the immense costs associated with running it have become a growing problem of late.