OpenAI’s ChatGPT chatbot has become enormously popular, with over 180 million people using the free version and many companies adopting the paid tier to enhance their operations. The energy and resources required to support that user base, however, are staggering.
To accommodate hundreds of millions of requests daily, OpenAI’s servers consume an estimated half a million kilowatt-hours of electricity each day, roughly the daily usage of 17,000 average American households. As the AI industry continues to grow, that consumption is expected to rise significantly.
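As a rough sanity check on that comparison, here is a minimal back-of-the-envelope sketch in Python. It assumes an average US household uses about 10,500 kWh of electricity per year (a commonly cited approximation, not a figure from this article):

```python
# Rough check of the "17,000 households" comparison.
# Assumption: an average US household uses ~10,500 kWh per year (~29 kWh/day).
DAILY_CHATGPT_KWH = 500_000            # reported daily consumption
HOUSEHOLD_KWH_PER_DAY = 10_500 / 365   # assumed average, ~28.8 kWh/day

equivalent_households = DAILY_CHATGPT_KWH / HOUSEHOLD_KWH_PER_DAY
print(f"Roughly {equivalent_households:,.0f} households")  # ~17,000
```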
Researcher Alex de Vries highlighted the energy-intensive nature of AI, estimating that by 2027, the industry could consume 85 to 134 terawatt-hours of electricity annually, potentially accounting for half a percent of global electricity consumption.
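The “half a percent” figure can be reproduced with similarly rough arithmetic. The sketch below assumes annual global electricity consumption on the order of 26,000 terawatt-hours, which is an approximation and not a number given in the article:

```python
# Putting de Vries' 2027 projection in context of global electricity use.
# Assumption: global consumption is roughly 26,000 TWh per year.
LOW_TWH, HIGH_TWH = 85, 134
GLOBAL_TWH = 26_000  # assumed order-of-magnitude figure

for twh in (LOW_TWH, HIGH_TWH):
    print(f"{twh} TWh is about {twh / GLOBAL_TWH:.2%} of global consumption")
# Prints roughly 0.33% and 0.52%, consistent with "half a percent" at the high end.
```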
Beyond electricity, AI companies like Microsoft-backed OpenAI require substantial amounts of water to cool their servers. Microsoft’s data-center water consumption rose 34% between 2021 and 2022, reaching roughly 1.7 billion gallons, a jump attributed largely to the growth of its AI services.
University of California, Riverside researcher Shaolei Ren has noted that ChatGPT’s water use scales with the number of questions or prompts it processes, with each interaction drawing approximately 16 ounces (about half a liter) of water. That consumption is a cost AI companies will need to address for long-term sustainability.
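To see why the per-prompt figure matters at scale, here is a purely illustrative calculation. It combines the article’s 16-ounce estimate with a hypothetical 200 million interactions per day; the article says only “hundreds of millions of requests daily,” so the daily total is an assumption:

```python
# Illustrative scaling of the per-prompt water figure; not a reported total.
OUNCES_PER_PROMPT = 16
PROMPTS_PER_DAY = 200_000_000   # hypothetical, for illustration only
GALLONS_PER_OUNCE = 1 / 128     # 128 fluid ounces per US gallon

daily_gallons = OUNCES_PER_PROMPT * PROMPTS_PER_DAY * GALLONS_PER_OUNCE
print(f"~{daily_gallons / 1e6:.0f} million gallons per day")  # ~25 million
```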
As the AI industry continues to expand, more sustainable practices for energy and water use will be essential to limit its environmental impact. Companies such as OpenAI and Microsoft will need to find ways to shrink their carbon footprint and manage resources responsibly as they build and operate AI services.