Journal

What is the environmental impact of artificial intelligence?

The consumption of energy and resources required to run artificial intelligence (AI) systems is an increasingly high-stakes issue. It is the subject of major controversy, because the precise data needed for a proper forward-looking assessment are lacking: operators do not communicate transparently about the energy and resource consumption of the AI systems they design and operate. Nevertheless, some studies and indirect data make it possible to establish orders of magnitude.

In 2023, according to the International Energy Agency (IEA), data centre consumption accounted for 2% of global electricity demand, 80% of which was linked to computing operations and infrastructure cooling requirements. This figure could double by 2026. Water withdrawals from Meta, Google and Microsoft data centres amounted to 2.2 billion cubic metres in 2022, double the annual withdrawals of a country like Denmark. Servers heat up and must be cooled to protect their electronic components, which in most cases still requires cold-water circuits. Due to the dominance of GAFAM (Google, Apple, Facebook, Amazon, Microsoft), most of this resource consumption takes place in the United States (40% for electricity), although the geographical reach of these players is expanding.

For the past five years or so, AI has been the main driver of growth in data centre resource requirements, alongside cryptocurrencies. This is because AI models have become more complex and their training is based on increasingly large datasets. Between Google’s first AI model in 2017 (Transformer) and the latest version, Gemini Ultra, the computing power required has increased by a factor of 7.

Training compute of notable machine learning models by domain, 2012-2023 (in petaFLOPS*)

*FLOPS: FLoating-point Operations Per Second. 1 petaFLOPS = 10^15 FLOPS.
Source: AI Index Report 2024, p. 51.
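To give a sense of the scale of the units in the chart above, here is a minimal sketch. The 100-teraFLOPS accelerator throughput is an illustrative assumption, not a figure from the article:

```python
# Unit check: how long a hypothetical accelerator sustaining 100 teraFLOPS
# would take to perform one petaFLOP of floating-point work.
PETA = 1e15            # 1 petaFLOP = 10**15 floating-point operations
GPU_FLOPS = 100e12     # assumed sustained throughput: 100 TFLOPS (illustrative)

seconds = PETA / GPU_FLOPS
print(seconds)  # 10.0 (seconds)
```

At this assumed rate, each petaFLOP on the chart's vertical axis represents roughly ten seconds of accelerator time, which shows why training runs measured in billions of petaFLOPs require thousands of chips running for months.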

More computing means more electricity consumption. The training phase of GPT-3 (the model behind the original ChatGPT) is estimated to have required 1.29 gigawatt hours[1] (equivalent to the average annual electricity consumption of almost 600 French people). This estimate covers the model's training phase only, not its use by the public. A query on GPT-3 or Bloom would consume around 4 Wh, equivalent to running an LED bulb for 15 minutes, and generating images would consume around 60 times more energy than generating text. Training is thus a major fixed consumption item, but it is quickly overtaken by the variable consumption linked to usage, since demand can reach several million requests per day for models like ChatGPT.

With regard to the water requirements of AI, a benchmark study coordinated by Shaolei Ren (Univ...