Artificial intelligence systems could account for nearly half of data centre power consumption by the end of this year, analysis has revealed.
The estimates by Alex de Vries-Gao, the founder of the Digiconomist tech sustainability website, came as the International Energy Agency forecast that AI would require almost as much energy by the end of this decade as Japan uses today.
De Vries-Gao’s calculations, to be published in the sustainable energy journal Joule, are based on the power consumed by chips made by Nvidia and Advanced Micro Devices that are used to train and operate AI models. The paper also takes into account the energy consumption of chips used by other companies, such as Broadcom.
The IEA estimates that all data centres – excluding mining for cryptocurrencies – consumed 415 terawatt hours (TWh) of electricity last year. De Vries-Gao argues in his research that AI could already account for 20% of that total.
He said a number of variables fed into his calculations, such as a data centre's energy efficiency and the electricity consumed by cooling systems for servers handling AI's intensive workloads. Data centres are the central nervous system of AI technology, and their high energy demands make sustainability a key concern in the development and use of artificial intelligence systems.
By the end of 2025, De Vries-Gao estimates, AI systems could account for up to 49% of total data centre power consumption, again excluding crypto mining. AI consumption could reach 23 gigawatts (GW), twice the total energy consumption of the Netherlands, the research estimates.
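The headline figures above can be sanity-checked with some back-of-the-envelope arithmetic. This is an illustrative sketch only, using the rounded numbers quoted in the article; the annualised figure assumes, hypothetically, that the 23 GW were drawn continuously all year.

```python
# Rough sanity check of the figures quoted above (rounded, illustrative only).

total_dc_twh = 415   # IEA estimate: all data centres excl. crypto mining, last year
ai_share_now = 0.20  # De Vries-Gao: AI could already account for ~20% of that

ai_twh_now = total_dc_twh * ai_share_now
print(f"AI's implied share today: ~{ai_twh_now:.0f} TWh")  # ~83 TWh

ai_gw_2025 = 23        # projected AI power draw by end of 2025
hours_per_year = 8760  # 365 * 24

# GW sustained for a year -> TWh (assumes continuous draw, which real
# data centres will not exactly match; this is an upper-bound sketch).
ai_twh_annualised = ai_gw_2025 * hours_per_year / 1000
print(f"23 GW run continuously ≈ {ai_twh_annualised:.0f} TWh/yr")  # ~201 TWh/yr
```

At roughly 100 TWh per year of national electricity use for the Netherlands (an assumed round figure, not from the article), ~201 TWh/yr is consistent with the article's "twice the total energy consumption of the Netherlands" comparison.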
However, De Vries-Gao said a number of factors could slow the growth in hardware demand, such as waning interest in applications like ChatGPT. Another issue could be geopolitical tensions constraining the production of AI hardware, such as export controls. He cites the example of restrictions on Chinese access to chips, which contributed to the release of the DeepSeek R1 AI model, reportedly trained using fewer chips.
“These innovations can reduce the computational and energy costs of AI,” said De Vries-Gao.
But he said any efficiency gains could encourage even more AI use. Multiple countries attempting to build their own AI systems – a trend known as “sovereign AI” – could also increase hardware demand. De Vries-Gao also pointed to a US data centre startup, Crusoe Energy, securing 4.5GW of gas-powered energy capacity for its infrastructure, with the ChatGPT developer OpenAI among the potential customers through its Stargate joint venture.
“There are early indications that these [Stargate] data centres could exacerbate dependence on fossil fuels,” writes De Vries-Gao.
On Thursday OpenAI announced the launch of a Stargate project in the United Arab Emirates, the first outside the US.
Microsoft and Google admitted last year that their AI initiatives were endangering their ability to meet internal environmental targets.
De Vries-Gao said information on AI’s power demands had become increasingly scarce, describing it as an “opaque industry”. The EU AI Act requires AI companies to disclose the energy consumed in training a model, but not in its day-to-day use.
Prof Adam Sobey, the mission director for sustainability at the UK’s Alan Turing Institute, an AI research body, said more transparency was needed on how much energy is consumed by artificial intelligence systems, and how much they could save by helping to make carbon-emitting industries such as transport and energy more efficient.
Sobey said: “I suspect that we don’t need many very good use cases [of AI] to offset the energy being used on the front end.”