March 11, 2024 – A recent report has revealed that OpenAI’s popular chatbot, ChatGPT, may be consuming a staggering amount of electricity, potentially exceeding 500,000 kilowatt-hours per day. This colossal energy usage is nearly 17,000 times the average daily electricity consumption of a typical American household, which stands at just 29 kilowatt-hours.
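As a quick sanity check, the "nearly 17,000 times" comparison follows directly from the two figures quoted above (a sketch using only the article's numbers):

```python
# Compare ChatGPT's reported daily draw (500,000 kWh) with the
# average U.S. household's daily consumption (29 kWh), both from the article.
chatgpt_kwh_per_day = 500_000
household_kwh_per_day = 29

multiple = chatgpt_kwh_per_day / household_kwh_per_day
print(f"ChatGPT's reported usage is about {multiple:,.0f}x a typical household's")
# 500,000 / 29 ≈ 17,241, consistent with the "nearly 17,000 times" figure
```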
As generative AI continues to gain traction, concerns about its energy demands are mounting. Alex de Vries, a data scientist at De Nederlandsche Bank (the Dutch central bank), has estimated in a research paper that the AI industry as a whole could consume between 85 and 134 terawatt-hours of electricity annually by 2027, roughly the combined annual electricity generation of Kenya, Guatemala, and Croatia.
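To put the 85 to 134 terawatt-hour projection in more familiar terms, it can be converted into household-equivalents using the 29 kWh/day average cited earlier (a rough sketch derived solely from the article's own numbers):

```python
# Convert the projected 85-134 TWh/year into U.S. household-equivalents,
# using the article's 29 kWh/day household average (1 TWh = 1e9 kWh).
household_kwh_per_year = 29 * 365  # about 10,585 kWh per household per year

for twh in (85, 134):
    households = twh * 1e9 / household_kwh_per_year
    print(f"{twh} TWh/year is roughly {households / 1e6:.1f} million households")
```

By this back-of-the-envelope measure, the projection corresponds to the annual electricity use of roughly eight to thirteen million American homes.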
However, accurately assessing the actual electricity consumption of the burgeoning AI industry poses a significant challenge. Despite leading the way in AI development, major technology companies have been notoriously tight-lipped about their energy usage. Furthermore, the wide range of operating methods employed by different AI models adds another layer of complexity to estimating power consumption.
This is also why hardware sales offer one of the few windows into the problem. According to data from New Street Research, NVIDIA holds roughly 95% of the market for the graphics processing units (GPUs) that power AI workloads, and de Vries based his 85 to 134 terawatt-hour projection on the expected output of NVIDIA's AI server shipments through 2027. At the upper end, that would amount to around half a percent of global electricity consumption.
As the AI industry continues to expand rapidly, the issue of energy consumption is becoming increasingly urgent. With the potential for significant environmental impacts, it is crucial for companies and researchers to prioritize energy efficiency in their AI development efforts.