Keeping ChatGPT running carries a staggering price tag: an estimated $694,444 per day, or roughly 0.36 cents per individual interaction. All of that computing also draws an enormous amount of energy, with still more needed to cool the data centers that house it. According to SemiAnalysis, the cost of running the model day to day now often exceeds what it cost to train in the first place. And with demand for ChatGPT growing constantly, keeping these operating costs under control only becomes more important.
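For a sense of scale, here is a quick back-of-envelope sketch (purely illustrative, not part of the SemiAnalysis report) of what those two figures imply about daily query volume and annualized cost:

```python
# Illustrative back-of-envelope math based on the two reported estimates.
# The implied query volume and annual cost are derived figures, not sourced.
daily_cost_usd = 694_444       # estimated daily compute cost
cost_per_query_usd = 0.0036    # ~0.36 cents per interaction

implied_queries_per_day = daily_cost_usd / cost_per_query_usd
implied_annual_cost = daily_cost_usd * 365

print(f"Implied queries per day: {implied_queries_per_day:,.0f}")  # ~193 million
print(f"Implied annual cost: ${implied_annual_cost:,.0f}")         # ~$253 million
```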
The cost of building on OpenAI's language models is also visible at Latitude, a company dedicated to creating games with artificial intelligence. According to Nick Walton, Latitude's CEO, the company spends around $200,000 a month on AI and web hosting. Those figures give a sense of just how expensive ChatGPT can be, considering Latitude is far from a large startup.
Microsoft plans to attack these costs with its in-house Athena AI chip, designed to speed up the training and running of OpenAI models such as ChatGPT and GPT-4. If the chip succeeds, it could drastically lower the daily cost of operation, bringing some financial relief to those who rely on the popular artificial intelligence platform.
The alliance between Microsoft and OpenAI is a surprising but mutually beneficial one. Until now, the AI models Microsoft hosts have run on chips made by Nvidia, which also supplies OpenAI. A custom chip could help bring down OpenAI's steep operating expenses and make the technology accessible to a much wider audience.