Artificial intelligence could soon consume more electricity than a country

A Dutch researcher has highlighted the enormous energy consumption of the new generation of tools powered by generative artificial intelligence. If they were adopted by a very large public, these tools could ultimately consume as much electricity as an entire country, or even several countries combined.


Alex de Vries, a doctoral candidate at the School of Business and Economics at VU Amsterdam, has published research in the journal Joule on the environmental impact of emerging technologies such as artificial intelligence. The deployment, in less than a year, of tools such as ChatGPT (OpenAI), Bing Chat (Microsoft) and Bard (Google), as well as Midjourney and other image generators, has sharply increased the demand for servers and, consequently, for the energy required to run them. This development inevitably raises concerns about the environmental impact of a technology already adopted by a large public.

In recent years, leaving aside cryptocurrency mining, the electricity consumption of data centers has remained relatively stable at around 1% of worldwide electricity consumption. However, the development of AI, now unavoidable in many areas, risks changing the situation.

According to Alex de Vries, the GPT-3 language model alone consumed more than 1,287 MWh for its training alone. Then comes the production phase: for ChatGPT, this means generating responses to users' requests (prompts). Earlier this year, SemiAnalysis estimated that OpenAI needed 3,617 servers, with a total of 28,936 graphics processing units (GPUs), to support ChatGPT, which would correspond to an energy demand of some 564 MWh per day.
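
As a rough check, a minimal back-of-envelope sketch in Python, assuming the roughly 6.5 kW per server cited below for this class of hardware, reproduces the daily figure:

```python
# Back-of-envelope check of the ChatGPT inference estimate (assumed figures).
servers = 3_617              # HGX servers estimated by SemiAnalysis
power_per_server_kw = 6.5    # assumed power draw per server
hours_per_day = 24

daily_mwh = servers * power_per_server_kw * hours_per_day / 1_000
print(f"~{daily_mwh:.0f} MWh per day")  # ~564 MWh per day
```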

And this is clearly just the beginning. Also according to SemiAnalysis, adding a ChatGPT-like AI to every Google search would require 512,821 dedicated servers, with a total of more than 4 million GPUs. At a power demand of 6.5 kW per server, that would mean a daily electricity consumption of 80 GWh and an annual consumption of 29.2 TWh (terawatt-hours; 1 TWh equals one billion kWh). In the most pessimistic scenario, AI deployed at this scale by Google could by itself consume as much electricity as a country like Ireland (29.3 TWh per year).
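
The same arithmetic, applied to the Google scenario with the stated 6.5 kW per server, gives the daily and annual figures quoted above (a sketch of the estimate, not an official methodology):

```python
# Scaling the estimate to an AI-assisted Google Search.
servers = 512_821            # dedicated servers estimated by SemiAnalysis
power_per_server_kw = 6.5    # power demand per server, as stated above

daily_gwh = servers * power_per_server_kw * 24 / 1_000_000
annual_twh = daily_gwh * 365 / 1_000
print(f"~{daily_gwh:.0f} GWh per day, ~{annual_twh:.1f} TWh per year")
# ~80 GWh per day and ~29.2 TWh per year, on a par with Ireland's annual consumption
```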

Alphabet itself has acknowledged that an interaction with a language model can consume up to ten times more electricity than a traditional keyword search, rising from roughly 0.3 Wh to around 3 Wh. As for Nvidia, the main supplier of AI-oriented servers, more than 1.5 million units could be sold by 2027, representing an overall consumption of 85 to 134 TWh per year.
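
The lower end of that 85 to 134 TWh range is consistent with the same kind of estimate; a hypothetical sketch, assuming 1.5 million servers each drawing about 6.5 kW around the clock:

```python
# Hypothetical lower-bound estimate for the projected 2027 AI server fleet.
servers = 1_500_000          # units Nvidia could sell by 2027
power_per_server_kw = 6.5    # assumed continuous draw per server
hours_per_year = 24 * 365

annual_twh = servers * power_per_server_kw * hours_per_year / 1_000_000_000
print(f"~{annual_twh:.0f} TWh per year")  # ~85 TWh; higher-power scenarios approach 134 TWh
```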

In conclusion, electricity consumption linked to AI is set to become a major concern. Several levers could help contain it, however. The first would obviously be to favor renewable energy sources to power data centers. The next is to develop less energy-intensive algorithms. Finally, Internet users will need to be educated to use AI responsibly, without excess.

