March 10, 2024 – According to a recent report in The New Yorker, the popular chatbot ChatGPT handles over 200 million requests a day, consuming a staggering amount of electricity that could reach 500,000 kilowatt-hours daily. By comparison, a typical American household uses about 29 kilowatt-hours a day, meaning ChatGPT’s daily electricity consumption is more than 17,000 times that of a single household.
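That ratio follows directly from the two figures just cited (500,000 kWh a day for ChatGPT versus 29 kWh a day for a household); a minimal back-of-the-envelope check, sketched in Python, looks like this:

```python
# Back-of-the-envelope check using only the figures quoted above.
chatgpt_kwh_per_day = 500_000    # reported upper estimate for ChatGPT
household_kwh_per_day = 29       # average US household, per the report

ratio = chatgpt_kwh_per_day / household_kwh_per_day
print(f"ChatGPT uses roughly {ratio:,.0f} times a household's daily electricity")
# -> roughly 17,241 times, i.e. "more than 17,000 times"
```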
What is even more concerning is the potential surge in electricity demand as generative artificial intelligence (AI) becomes more widespread. According to a paper by Alex de Vries, a data scientist at the Dutch Central Bank, published in the sustainable energy journal Joule, if Google applied generative AI technology to every search it handles, it would consume approximately 29 billion kilowatt-hours of electricity annually. This is equivalent to the total annual electricity generation of Kenya, Guatemala, and Croatia combined.
However, assessing the exact electricity consumption of the booming AI industry poses a challenge. As reported by The Verge, large technology companies have been leading the way in AI development but are tight-lipped about their energy usage. Additionally, different AI models operate in vastly different ways.
Nonetheless, de Vries has made some rough calculations based on figures released by NVIDIA, which, according to CNBC, holds about 95% of the market for the graphics processing units (GPUs) used in machine learning. In his paper, de Vries estimates that by 2027 the AI industry as a whole could consume between 85 and 134 terawatt-hours of electricity annually.
“That would amount to about half a percent of the world’s electricity consumption by 2027,” de Vries told The Verge. “I think that’s a pretty substantial number.”
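The half-a-percent framing can be checked against the 85–134 terawatt-hour range above, assuming global electricity consumption on the order of 26,000 terawatt-hours a year; that world total is an outside assumption for illustration, not a figure from the article:

```python
# Rough scale check. The world-consumption figure (~26,000 TWh/year) is an
# outside assumption for illustration, not a number given in the article.
ai_low_twh, ai_high_twh = 85, 134     # de Vries's estimated range for 2027
world_twh = 26_000                    # assumed global annual electricity use

print(f"{ai_low_twh / world_twh:.1%} to {ai_high_twh / world_twh:.1%} of global consumption")
# -> roughly 0.3% to 0.5%, i.e. about half a percent at the high end
```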
By comparison, even some of the world’s largest corporate electricity consumers pale next to that estimate. According to calculations by Business Insider based on a report by Consumer Energy Solutions, Samsung consumes close to 23 terawatt-hours of electricity annually; Google uses slightly over 12 terawatt-hours per year across its data centers, networks, and user devices; and Microsoft consumes just over 10 terawatt-hours annually.
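Taken together, those three company figures still sit well below even the low end of de Vries’s 2027 range; a short comparison using only the numbers cited in this article:

```python
# Comparison built only from figures already cited in the article.
samsung_twh, google_twh, microsoft_twh = 23, 12, 10
combined_twh = samsung_twh + google_twh + microsoft_twh    # about 45 TWh/year

ai_low_twh, ai_high_twh = 85, 134                          # de Vries's 2027 range
print(f"Samsung + Google + Microsoft: ~{combined_twh} TWh/year")
print(f"Projected AI industry demand: {ai_low_twh}-{ai_high_twh} TWh/year")
# Even the low end of the AI estimate is nearly double the three combined.
```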