Jevons’ paradox: AI could use as much electricity as entire countries

Artificial intelligence (AI) brings many benefits, but a recent analysis warns that its adoption could come with significant energy costs. Generative AI models like OpenAI’s ChatGPT consume large amounts of energy during both training and everyday use. While global efforts are underway to improve the energy efficiency of AI, those efficiency gains could inadvertently drive up demand, a rebound effect known as the Jevons paradox. According to current forecasts, AI’s electricity consumption could rival that of entire countries by 2027. Given its energy-intensive nature, researchers stress the importance of using AI deliberately.

Artificial intelligence (AI) has the potential to help programmers code faster, make drivers safer, and speed up everyday tasks. However, in a recent commentary published in the journal Joule, the founder of Digiconomist argues that if the tool is widely adopted, it could develop a significant energy footprint, one that may in the future exceed the energy needs of certain countries.

“Given the growing demand for AI services, it is very likely that AI-related energy consumption will increase significantly in the coming years,” says author Alex de Vries, a Ph.D. candidate at the Vrije Universiteit Amsterdam.

Since 2022, generative AI, which can produce text, images, or other data, has seen rapid growth, with OpenAI’s ChatGPT as a prominent example. Training these AI tools requires feeding the models large amounts of data, an energy-intensive process. Hugging Face, a New York-based AI development company, reported that its multilingual text-generation tool consumed about 433 megawatt-hours (MWh) during training, enough to power 40 average American homes for a year.
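As a quick sanity check on that household comparison, one can divide the reported training energy by typical household consumption. The per-home figure below is an assumption (U.S. households average roughly 10,600 kWh of electricity per year); the article itself supplies only the 433 MWh and the 40-homes claim.

```python
# Rough check: does 433 MWh of training energy correspond to
# ~40 average US homes for a year? The per-home consumption is
# an assumed figure (~10,600 kWh/year); only the 433 MWh and
# "40 homes" numbers come from the article.
TRAINING_MWH = 433
HOME_KWH_PER_YEAR = 10_600  # assumption, roughly the US average

homes_powered = TRAINING_MWH * 1_000 / HOME_KWH_PER_YEAR
print(f"~{homes_powered:.0f} homes powered for a year")  # ~41
```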

And AI’s energy footprint doesn’t stop at training. De Vries’s analysis shows that when the tool is put to work, generating output in response to prompts, each text or image it produces also draws a significant amount of computing power, and therefore energy. ChatGPT, for example, could consume 564 MWh of electricity per day.
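Scaling that daily estimate to a year is straightforward, though it assumes usage holds steady, which the commentary does not claim; the sketch below also reuses the assumed ~10,600 kWh per-home figure from above.

```python
# Annualize the reported 564 MWh/day inference estimate and express
# it in the same household terms as the training comparison.
DAILY_MWH = 564
HOME_KWH_PER_YEAR = 10_600  # same assumption as before

annual_mwh = DAILY_MWH * 365                    # ~205,860 MWh (~206 GWh)
homes = annual_mwh * 1_000 / HOME_KWH_PER_YEAR  # ~19,400 average US homes
print(f"~{annual_mwh/1_000:.0f} GWh/year, ~{homes:,.0f} homes")
```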

While companies around the world are working to make AI hardware and software more efficient and less power-hungry, de Vries points out that efficiency gains often end up increasing demand. The net effect of such technological advances can be higher, not lower, resource consumption, a rebound effect known as the Jevons paradox.

“By making these tools more efficient and accessible, we may enable more applications and more people to use them,” says de Vries.
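To see how efficiency gains can raise total consumption, consider a toy calculation; every number in it is invented purely to illustrate the rebound mechanism, and none comes from the commentary.

```python
# Jevons-paradox toy example: halving the energy cost per query does
# not halve total energy if cheaper, more accessible queries attract
# disproportionately more use. All figures are hypothetical.
wh_per_query = 3.0      # assumed baseline energy per AI query
queries_per_day = 1e9   # assumed baseline demand

before_gwh = wh_per_query * queries_per_day / 1e9             # 3.0 GWh/day

# Efficiency doubles (per-query cost halves) but demand triples:
after_gwh = (wh_per_query / 2) * (queries_per_day * 3) / 1e9  # 4.5 GWh/day

print(f"before: {before_gwh:.1f} GWh/day, after: {after_gwh:.1f} GWh/day")
```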

Google, for example, has integrated generative AI into its email service and is currently testing AI-powered search. The company processes up to 9 billion searches per day. Based on these figures, de Vries estimates that if every Google search used AI, it would require around 29.2 TWh of electricity per year, comparable to Ireland’s annual electricity consumption.
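Working backwards from the article’s own figures shows the per-search energy this worst-case scenario implies; the division is ours, not a stated assumption of de Vries.

```python
# Implied energy per AI-assisted search in the worst-case scenario:
# 29.2 TWh/year spread over 9 billion searches per day.
ANNUAL_TWH = 29.2
SEARCHES_PER_DAY = 9e9

wh_per_search = ANNUAL_TWH * 1e12 / (SEARCHES_PER_DAY * 365)
print(f"~{wh_per_search:.1f} Wh per AI-assisted search")  # ~8.9 Wh
```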

Due to the high cost of additional AI servers and bottlenecks in the AI server supply chain, this extreme scenario is unlikely to materialize in the short term, de Vries says. However, AI server production is expected to grow rapidly in the near future. Based on AI server production forecasts, worldwide AI-related electricity consumption could increase by 85 to 134 TWh annually by 2027.

That range is comparable to the annual electricity consumption of countries such as the Netherlands, Argentina, and Sweden. In addition, improvements in AI efficiency could allow developers to repurpose some computer processing chips for AI use, which could further increase AI-related electricity consumption.

“The potential growth shows that we need to think very carefully about why we use AI. It uses a lot of energy, so we don’t want to use it for all sorts of things where we don’t really need it,” says de Vries.