With ChatGPT gaining popularity, global electricity consumed by artificial intelligence (AI) could increase by 85-134 terawatt-hours (TWh) annually by 2027, according to a new report published in the journal Joule.
This amount, according to the paper, is comparable to the annual electricity consumption of countries such as the Netherlands, Argentina and Sweden.
“Looking at the growing demand for AI service, it is very likely that energy consumption related to AI will significantly increase in the coming years,” Alex de Vries, a doctoral candidate at Vrije Universiteit Amsterdam, said in a statement.
Following the successful launch of OpenAI’s ChatGPT, a conversational generative AI chatbot that reached 100 million users in two months, Alphabet and Microsoft began to catch up.
They significantly increased their support for AI in 2023 and introduced their own chatbots, Bard and Bing Chat, respectively.
“This accelerated development raises concerns about the electricity consumption and potential environmental impact of AI and data centres (facilities that store computing machines and their related hardware equipment),” the paper highlighted.
The paper also noted that data centres’ electricity consumption may have increased by only 6 per cent between 2010 and 2018.
But “there is increasing apprehension that the computational resources necessary to develop and maintain AI models and applications could cause a surge in data centers' contribution to global electricity consumption”, de Vries wrote in Joules.
[Figure: Estimated energy consumption per request for various AI-powered systems, compared with a standard Google search]
In recent years, generative AI tools such as ChatGPT and DALL-E, which create new content such as text, images or videos, have grown popular.
Both tools, de Vries said, use natural language processing, a branch of AI that allows computers to understand text and spoken words much as humans do.
These tools share a common process: an initial training phase followed by an inference phase. A previous analysis identified 98 papers published since 2015 on the environmental sustainability of AI; of these, 49 dealt with the training phase and 17 focused on the inference phase.
Studies have mainly focused on the training phase, which carries a large carbon footprint. Training the large language models (LLMs) GPT-3, Gopher and Open Pre-trained Transformer (OPT) reportedly consumed 1,287, 1,066 and 324 MWh of electricity, respectively.
Once trained, the LLMs are set live to generate output from new data, kicking off the inference phase.
The author expressed concern that the inference phase might contribute significantly to an AI model’s life-cycle costs: ChatGPT’s inference was estimated to demand 564 MWh per day, compared with the roughly 1,287 MWh consumed once in training GPT-3.
“It remains an open question how the inference phase generally compares to the training phase in terms of electricity consumption, as the current literature offers minimal additional insights into the relative weight of each phase,” de Vries wrote. He called for further studies to bridge the knowledge gap.
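To make that comparison concrete, here is a minimal back-of-the-envelope sketch using only the two figures reported above (the variable names are illustrative, not from the paper):

```python
# Compare GPT-3's one-off training cost against ChatGPT's
# estimated daily inference demand, using the reported figures.
TRAINING_MWH = 1_287           # GPT-3 training, one-off
INFERENCE_MWH_PER_DAY = 564    # ChatGPT inference, estimated per day

days_to_match = TRAINING_MWH / INFERENCE_MWH_PER_DAY
print(f"Inference matches the one-off training cost after ~{days_to_match:.1f} days")
# -> ~2.3 days; every further day of operation widens the gap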
If generative AI were used in every Google search, the paper estimated, daily electricity consumption would amount to 80 GWh.
In 2021, Google’s annual electricity use was 18.3 TWh, with 10-15 per cent coming from AI.
In the worst-case scenario, according to de Vries, Google AI’s electricity usage could be comparable to Ireland's 29.3 TWh per year. This scenario, however, assumed full-scale AI adoption.
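The arithmetic behind that comparison is straightforward; the sketch below reproduces it (the figure of roughly 9 billion Google searches a day is an assumption used here for the per-request estimate, not a number quoted in this article):

```python
# Annualise the worst-case scenario: generative AI in every Google search.
DAILY_GWH = 80          # paper's estimated daily demand
DAILY_SEARCHES = 9e9    # approx. Google searches per day (assumption)

annual_twh = DAILY_GWH * 365 / 1_000               # GWh/day -> TWh/year
wh_per_search = DAILY_GWH * 1e9 / DAILY_SEARCHES   # GWh -> Wh, per request

print(f"~{annual_twh:.1f} TWh/year")               # ~29.2 TWh, close to Ireland's 29.3 TWh
print(f"~{wh_per_search:.1f} Wh per search")       # ~8.9 Wh per AI-assisted search
```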
There are solutions, though. For example, innovations in model architectures and algorithms could help mitigate or even reduce AI-related electricity consumption in the long term, de Vries noted.
“The potential growth highlights that we need to be very mindful about what we use AI for. It is energy-intensive. So, we don't want to put it in all kinds of things where we don’t actually need it,” de Vries said.