Artificial intelligence could soon use more electricity than a country

A Dutch researcher has highlighted the enormous energy consumption of the new generation of tools based on generative artificial intelligence. If adopted by a very large public, these tools could ultimately consume as much electricity as an entire country, or even several countries combined.

Alex de Vries, a doctoral student at the School of Business and Economics at VU Amsterdam, has published research in the journal Joule on the environmental impact of new technologies such as artificial intelligence. In less than a year, the use of tools such as ChatGPT (OpenAI), Bing Chat (Microsoft) and Bard (Google), along with image generators such as Midjourney, has significantly increased the demand for servers and therefore the energy required to run them. This development inevitably raises concerns about the environmental impact of a technology that has already been widely adopted.

If we ignore cryptocurrency mining, data center electricity consumption has been relatively stable in recent years, accounting for about 1% of total global consumption. However, the development of AI, which is inevitable in many areas, risks changing the situation.

According to Alex de Vries, training the GPT-3 language model alone would have consumed more than 1,287 MWh. Then comes the inference phase: for ChatGPT, the generation of responses to users' requests (prompts). Earlier this year, SemiAnalysis estimated that OpenAI would need 3,617 servers with a total of 28,936 graphics processing units (GPUs) to support ChatGPT, which would equate to around 564 MWh of electricity demand per day.
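That daily figure can be cross-checked with a quick calculation. Assuming each of the 3,617 servers draws roughly 6.5 kW around the clock (the per-server power figure SemiAnalysis uses for its Google scenario below; applying it to this estimate is an assumption), the numbers line up with the cited ~564 MWh per day:

```python
# Rough cross-check of SemiAnalysis's ChatGPT estimate.
# Assumption: ~6.5 kW per server, running 24/7 (the per-server figure
# from SemiAnalysis's Google scenario; not stated for this estimate).
servers = 3_617
kw_per_server = 6.5          # assumed average power draw per server
hours_per_day = 24

daily_kwh = servers * kw_per_server * hours_per_day
daily_mwh = daily_kwh / 1_000
print(f"{daily_mwh:.0f} MWh/day")   # ~564 MWh/day, matching the article
```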

And this is obviously just the beginning. According to SemiAnalysis, implementing ChatGPT-like AI in every Google search would require 512,821 dedicated servers, with a total of more than 4 million GPUs. At 6.5 kW per server, this would result in a daily electricity consumption of 80 GWh and an annual consumption of 29.2 TWh (terawatt-hours; 1 TWh is one billion kWh). In this most pessimistic scenario, AI deployed at scale by Google could alone consume as much electricity as a country like Ireland (29.3 TWh per year).
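The Google-scale figures are internally consistent, as a direct calculation from the stated server count and per-server power shows:

```python
# Verify the Google-scale scenario: 512,821 servers at 6.5 kW each,
# running continuously.
servers = 512_821
kw_per_server = 6.5

total_gw = servers * kw_per_server / 1e6        # ~3.33 GW continuous draw
daily_gwh = total_gw * 24                       # ~80 GWh per day
annual_twh = daily_gwh * 365 / 1_000            # ~29.2 TWh per year

print(f"{daily_gwh:.1f} GWh/day, {annual_twh:.1f} TWh/year")
```

The result, roughly 80 GWh per day and 29.2 TWh per year, matches the article's figures and sits just below Ireland's 29.3 TWh.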

Alphabet has also confirmed that an interaction with a language model can consume up to ten times more electricity than a traditional keyword search: around 3 Wh instead of 0.3 Wh. As for Nvidia, the main supplier of AI-adapted servers, more than 1.5 million units could be sold by 2027, corresponding to a total consumption of 85 to 134 TWh per year.
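The Nvidia projection can likewise be sanity-checked. Assuming the same ~6.5 kW per unit as in the Google scenario above (an assumption; the actual per-unit draw is not stated here) and continuous operation, 1.5 million units land right at the low end of the 85-134 TWh range:

```python
# Sanity check on the Nvidia projection.
# Assumption: ~6.5 kW per AI server unit, running 24/7
# (borrowed from the Google scenario; not stated for Nvidia hardware).
units = 1_500_000
kw_per_unit = 6.5
hours_per_year = 8_760

annual_twh = units * kw_per_unit * hours_per_year / 1e9
print(f"{annual_twh:.1f} TWh/year")   # ~85.4 TWh, the low end of the range
```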

In summary, the power consumption associated with AI is set to become a major problem. However, several levers could help contain it. The first, of course, would be to favor renewable energy sources for data center operations. Next, less energy-intensive algorithms need to be developed. Finally, Internet users will have to be taught to use AI responsibly, without excess.