OpenAI’s ChatGPT, a cutting-edge AI chatbot, is reportedly burning through quite a lot of cash every day, with research from SemiAnalysis estimating the daily costs at around $694,444. According to the research, the system runs on 3,617 HGX A100 servers containing 28,936 GPUs, putting the estimated cost per query at 0.36 cents. Dylan Patel, Chief Analyst at SemiAnalysis, however, believes the price may be even higher once the GPT-4 model OpenAI has released for paying subscribers is taken into account. Such hefty costs are mainly due to the energy-hungry specialized hardware needed to operate the chatbot. Microsoft, one of the main stakeholders in OpenAI, is trying to tackle the issue by developing its own custom AI chip, codenamed ‘Athena’, with the goal of significantly lowering these costs.
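The per-query figure can be sanity-checked with a rough back-of-the-envelope calculation: if the 0.36-cent estimate simply divides the daily bill by query volume, the reported numbers imply roughly 190 million queries a day. The sketch below shows that arithmetic; the assumption that all 28,936 GPUs are in service, and the resulting per-GPU cost, are ours rather than SemiAnalysis’s.

```python
# Back-of-the-envelope figures implied by the reported estimates.
DAILY_COST_USD = 694_444        # SemiAnalysis estimate of daily operating cost
COST_PER_QUERY_USD = 0.0036     # 0.36 cents per query
GPU_COUNT = 28_936              # GPUs reported across 3,617 HGX A100 servers

# If cost-per-query = daily cost / query volume, the implied volume is:
implied_queries_per_day = DAILY_COST_USD / COST_PER_QUERY_USD
print(f"Implied queries per day: {implied_queries_per_day:,.0f}")  # ~193 million

# Assuming the whole fleet is in use (our assumption), the daily cost per GPU:
print(f"Daily cost per GPU: ${DAILY_COST_USD / GPU_COUNT:,.2f}")   # ~$24
```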
Not only is OpenAI’s ChatGPT burning a hole in the company’s pocket, the AI has also raised concerns about replacing humans in coding and software development. Recently, computer scientists Raphaël Khoury, Anderson Avila, Jacob Brunelle, and Baba Mamadou Camara released a study claiming that code generated by the chatbot falls below minimal security standards. Moreover, when prompted about the code’s security, ChatGPT was able to recognize that it was not secure, adding to the worries of a tech industry increasingly filled with AI. Nonetheless, after further prompting by the researchers, it managed to produce seven additional secure code snippets, a clear sign that improvement is possible. Although hefty costs and insecure output may paint a worrisome picture, OpenAI’s ChatGPT, with more development and specialized hardware, may just be the answer to secure coding.
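The study’s actual test programs are not reproduced here, but a typical example of the kind of flaw such security audits flag is SQL injection: code that splices user input directly into a query string, versus code that passes it as a bound parameter. The snippet below is purely illustrative and is not taken from the researchers’ work; the function names and table schema are assumptions made for the sake of the example.

```python
import sqlite3

# Insecure pattern: building SQL by string interpolation lets a crafted
# username inject arbitrary SQL (classic SQL injection).
def find_user_insecure(conn: sqlite3.Connection, username: str):
    query = f"SELECT id, name FROM users WHERE name = '{username}'"
    return conn.execute(query).fetchall()

# Safer pattern: a parameterized query keeps user input out of the SQL text,
# the sort of fix a model can often apply once the flaw is pointed out to it.
def find_user_secure(conn: sqlite3.Connection, username: str):
    query = "SELECT id, name FROM users WHERE name = ?"
    return conn.execute(query, (username,)).fetchall()
```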