Recent estimates from the research firm SemiAnalysis suggest that running an advanced AI-powered chatbot like ChatGPT could set a company back a whopping $700,000 per day. ChatGPT, the natural-language-processing-enabled technology, has drawn both positive and negative attention since its introduction. According to the SemiAnalysis report, OpenAI requires approximately 3,617 HGX A100 servers, housing 28,936 GPUs, to run the service efficiently.
Dylan Patel, the Chief Analyst at SemiAnalysis, has additionally noted that the cost might be even higher to sustain the newer GPT-4 model. The fast-growing technology, meanwhile, received a massive boost when it gained 100 million users within two months of its release, a pace faster than that of popular social media apps like TikTok and Instagram.
Microsoft, a major backer of OpenAI, is currently working to create specialized AI chips to reduce the daily cost of operating ChatGPT. OpenAI also released its "ChatGPT Plus" subscription service earlier this year, allowing users to take advantage of its AI capabilities for a $20 monthly fee.
Extrapolating further, the SemiAnalysis blog post estimated that deploying the AI model into every Google search would cost the tech giant around $36 billion in "LLM inference costs". Serving a trained GPT-4 model at that scale would require 512,820.51 A100 HGX servers, or 4,102,568 A100 GPUs.
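One way the reported figures appear to fit together is as a linear scale-up: take the per-GPU daily cost implied by ChatGPT's $700,000-per-day estimate, apply it to the Google-scale GPU count, and annualize over 365 days. A minimal back-of-the-envelope sketch of that arithmetic (the linear-scaling and annualization assumptions are ours, not stated explicitly in the article; an HGX A100 server houses 8 GPUs):

```python
# Back-of-the-envelope check of the SemiAnalysis figures.
# Assumptions (ours, not from the article): the ~$36B figure is an
# annualized cost, and cost scales linearly with GPU count.

CHATGPT_DAILY_COST = 700_000    # USD/day for ChatGPT (SemiAnalysis estimate)
CHATGPT_GPUS = 28_936           # A100 GPUs reportedly serving ChatGPT
GOOGLE_SCALE_GPUS = 4_102_568   # A100 GPUs for GPT-4 in Google Search
GPUS_PER_HGX_SERVER = 8         # each HGX A100 server holds 8 GPUs

# Implied cost of running one GPU for one day
cost_per_gpu_day = CHATGPT_DAILY_COST / CHATGPT_GPUS

# Server count implied by the GPU count (lands close to the
# article's 512,820.51 figure, modulo rounding in the source)
servers_needed = GOOGLE_SCALE_GPUS / GPUS_PER_HGX_SERVER

# Annualized cost at Google-search scale
annual_cost = cost_per_gpu_day * GOOGLE_SCALE_GPUS * 365

print(f"Cost per GPU per day: ${cost_per_gpu_day:,.2f}")
print(f"HGX servers implied:  {servers_needed:,.2f}")
print(f"Annualized cost:      ${annual_cost / 1e9:,.1f}B")
```

Running this yields roughly $24 per GPU per day and an annualized total in the neighborhood of $36 billion, consistent with the figure quoted in the blog post.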
Overall, the cost of running an advanced AI-powered chatbot like ChatGPT is immense, yet for companies racing to stay ahead of the curve by incorporating these technologies, the results arguably justify the expense.