
Global electricity consumption by AI could increase by 85-134 TWh annually by 2027

Down To Earth

With ChatGPT gaining popularity, global electricity consumption by Artificial Intelligence could increase by 85-134 TWh annually by 2027, according to a report published in the journal Joule. This amount is comparable to the annual electricity consumption of countries such as the Netherlands, Sweden and Argentina. As per Alex de Vries, a doctoral candidate at Vrije Universiteit Amsterdam, “Looking at the growing demand for AI services, it is very likely that energy consumption related to AI will significantly increase in the coming years”. Although data centres’ electricity consumption may have increased by only 6 per cent between 2010 and 2018, the accelerated development of AI raises concerns about the electricity consumption and potential environmental impact of AI and data centres.

In the recent past, generative AI tools used for creating new content such as text, images or videos, including ChatGPT and DALL-E, have grown popular.
If generative AI were used in every Google search, the daily electricity consumption would amount to 80 GWh. In 2021, Google’s annual electricity use was 18.3 TWh, with 10-15 per cent coming from AI. In the worst-case scenario, according to de Vries, Google AI’s electricity usage could be comparable to Ireland’s 29.3 TWh per year. This scenario, however, assumed a full-scale AI adoption.
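As a rough sanity check (my own arithmetic, not a figure from the report), annualizing the 80 GWh daily estimate shows why the Ireland comparison holds:

```python
# Annualizing the reported 80 GWh/day worst-case figure for Google search.
# The 80 GWh value comes from the article; the conversion is my own arithmetic.
daily_gwh = 80                        # reported daily consumption, GWh
annual_twh = daily_gwh * 365 / 1000   # GWh/day -> TWh/year
print(annual_twh)                     # 29.2, close to Ireland's 29.3 TWh per year
```

The annualized figure, 29.2 TWh, lines up almost exactly with the 29.3 TWh cited for Ireland.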
Studies have mainly focused on the training phase, which has a large carbon footprint. For training, large language models (LLMs) including GPT-3, Gopher and Open Pretrained Transformer (OPT) reportedly consumed 1,287, 1,066 and 324 MWh of electricity, respectively. After training, the models are deployed to generate output from new data, which marks the start of the inference phase.
The author expressed concerns that the inference phase might contribute significantly to an AI model’s life-cycle costs: the energy demand of ChatGPT was estimated at 564 MWh per day, compared to the roughly 1,287 MWh used to train GPT-3.

De Vries suggested that innovations in model architectures and algorithms could help mitigate, or even reduce, AI-related electricity consumption in the long term.
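To see why inference dominates life-cycle costs here, it helps to compare the two reported numbers directly (my own arithmetic, not from the report):

```python
# How quickly ChatGPT's daily inference energy overtakes GPT-3's one-off
# training cost, using the two figures reported in the article.
training_mwh = 1287            # reported GPT-3 training energy, MWh
inference_mwh_per_day = 564    # reported ChatGPT daily energy demand, MWh
days_to_match = training_mwh / inference_mwh_per_day
print(round(days_to_match, 1))  # ~2.3 days of inference matches the training energy
```

At these rates, just a few days of serving users consumes as much electricity as the entire training run, which is why the author flags the inference phase.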
