Generative artificial intelligence (AI) uses “30 times more energy” than a traditional search engine, warns researcher Sasha Luccioni, who wants to raise public awareness of the environmental impact of this new technology.
Recognized as one of the 100 most influential people in AI by the American magazine Time in 2024, the Canadian researcher of Russian origin has spent several years trying to quantify the emissions of programs like ChatGPT and Midjourney.
“I find it particularly disappointing that generative AI is being used to search the internet,” laments the researcher, whom AFP met at the ALL IN conference on artificial intelligence in Montreal.
The language models behind these AIs require enormous computing capacity to train on billions of data points, which calls for powerful servers. On top of that comes the energy consumed to answer each user request.
Instead of simply retrieving information, “like a search engine would do to find the capital of a country, for example,” these AIs “generate new information,” which makes the whole process “much more energy-intensive,” she points out.
According to the International Energy Agency (IEA), data centers, AI and the cryptocurrency sector together consumed nearly 460 TWh of electricity in 2022, or 2% of total global production.
“Energy efficiency”
A pioneer in research on AI’s impact on the climate, Sasha Luccioni helped create, in 2020, a tool that lets developers quantify the carbon footprint of running a piece of code. CodeCarbon has since been downloaded more than a million times.
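In practice, a developer wraps the code to be measured in a tracker. Below is a minimal, illustrative sketch of such a measurement with CodeCarbon; the project name and the dummy workload are placeholders, not details taken from the article or from Luccioni’s own experiments.

```python
# Minimal CodeCarbon sketch: estimate the emissions of running a piece of code.
# The workload below is a placeholder; replace it with the code you want to measure.
from codecarbon import EmissionsTracker

tracker = EmissionsTracker(project_name="demo-workload")
tracker.start()
try:
    total = sum(i * i for i in range(10_000_000))  # placeholder computation
finally:
    emissions_kg = tracker.stop()  # estimated emissions, in kg of CO2-equivalent

print(f"Workload result: {total}")
print(f"Estimated emissions: {emissions_kg:.6f} kg CO2eq")
```

The tracker estimates the energy drawn by the hardware it detects (CPU, GPU, RAM) and converts it to CO2-equivalent using the carbon intensity of the local electricity grid, which is how a single figure can be reported per run.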
She now leads climate strategy at Hugging Face, a young company whose platform hosts open-access AI models, and is working on a certification system for algorithms.
Similar to Energy Star, which rates appliances in the United States according to their energy consumption, the program, which she also compares to France’s Nutri-score food label, would make a model’s energy consumption known in order to encourage users and developers to “make better decisions.”
“We don’t take into account water or rare materials,” she acknowledges, “but at least we know that for a specific task we can measure energy efficiency and say: this model has an A+, and that model has a D.”
“Transparency”
To develop her tool, Sasha Luccioni is testing it on generative AI models that are accessible to everyone (open source), but she would also like to run it on models from Google or OpenAI (the creator of ChatGPT), which remain reluctant for the moment.
Despite committing to carbon neutrality by the end of the decade, these tech giants saw their greenhouse gas emissions rise in 2023 because of AI: up 48% for Google compared with 2019 and up 29% for Microsoft compared with 2020.
If nothing is done to regulate these AI systems, “we are accelerating the climate crisis,” sighs the thirty-something researcher, who demands more transparency from these companies.
And the solution, she says, could come from governments, which are currently “flying blind,” without knowing what is “in the data sets or how the algorithms are trained.”
“Once we have transparency, we can start legislating,” says the expert.
“Explaining to people”
For the Montreal researcher, it is also necessary to “explain to people what generative AI can and cannot do, and at what cost.”
In her latest study, the researcher, a frequent speaker at international events, demonstrated that producing a high-definition image using artificial intelligence consumes as much energy as fully recharging a mobile phone battery.
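To put that comparison in perspective, here is a back-of-the-envelope calculation using assumed round numbers (a typical smartphone battery of roughly 12 Wh); these are illustrative values, not figures quoted from the study.

```python
# Back-of-the-envelope comparison based on assumed round numbers,
# not on figures taken from the study itself.
PHONE_BATTERY_KWH = 0.012                      # assumed: a typical smartphone battery (~12 Wh)
ENERGY_PER_HD_IMAGE_KWH = PHONE_BATTERY_KWH    # the article's claim: one image ~ one full charge

images_per_kwh = 1 / ENERGY_PER_HD_IMAGE_KWH
print(f"Under these assumptions, about {images_per_kwh:.0f} HD images can be generated per kWh.")
```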
At a time when more and more companies want to democratize this new technology by integrating it into multiple formats (chatbots, connected devices, online search), Sasha Luccioni advocates “energy sobriety”.
The idea here is not to oppose AI, she emphasizes, but rather to choose the right tools and use them wisely.