Love it or hate it, artificial intelligence (AI) is becoming part of our everyday lives. From online shopping to searching the web, AI is evolving into a useful, time-saving tool for people and corporations alike.
When it comes to climate change, AI is proving its usefulness there, too. At the UK’s Cambridge University, researchers are using AI in everything from climate modelling to land use planning, and see it as a transformative tool for protecting nature.
Researchers at Oxford University have created an AI tool that promises to make corporations’ environmental conduct more transparent. Even Google has touted the benefits, developing various AI-powered tools to improve climate resilience.
Despite AI’s potential to have a positive impact on the climate crisis, there are concerns over its potentially significant contribution to greenhouse gas emissions.
A new report from the International Energy Agency (IEA) shows that AI is driving a massive increase in electricity demand. Data centres, which form the backbone of AI systems, are projected to double their energy demand in the next five years.
The IEA projects that, by 2030, data centre electricity demand will rise to around 945 terawatt hours – more than the entire electricity consumption of Japan.
However, the report also points out that AI has the potential to cut emissions elsewhere. It says that, if it is adopted in the right ways, the carbon savings AI enables could offset the additional emissions it generates.
“AI is one of the biggest stories in the energy world today,” says IEA Executive Director Fatih Birol. “But until now, policy makers and markets lacked the tools to fully understand the wide-ranging impacts.”
How much energy does AI need?
AI requires large amounts of energy to train and run. The huge processing power required to support large language models comes from thousands of servers housed in data centres, some of which consume as much energy as a small country.
Data centres are located all over the world, although the US leads with 5,381 facilities, around 40 per cent of the global market. Other countries with significant numbers of data centres include the UK, Germany, India, Australia, France and the Netherlands.
The power consumption of these facilities is substantial. Some AI-focused data centres use as much electricity as two million households. In 2023, they accounted for around 1.5 per cent of total global electricity consumption, but are set to consume far more in the coming years.
Training AI requires a great deal of processing power, and therefore a lot of electricity. Research published in the Journal of Machine Learning found that training OpenAI’s GPT-3, the model that originally underpinned ChatGPT, consumed 1,287 megawatt hours of electricity, producing as much CO2 as 80 short-haul flights in Europe.
“What is different about generative AI is the power density it requires,” says Noman Bashir, Computing and Climate Impact Fellow at MIT. “Fundamentally, it is just computing, but a generative AI training cluster might consume seven or eight times more energy than a typical computing workload.”
Running the software is less energy-intensive per task, but the total quickly adds up when millions of queries are submitted every day.
The Electric Power Research Institute found that, per query, ChatGPT consumes approximately 2.9 watt-hours. That’s around ten times the amount of energy required for a standard Google search.
As of early 2025, ChatGPT is processing more than a billion queries per day, and the number is growing.
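To put those two figures in perspective, here is a rough, illustrative back-of-the-envelope calculation in Python – our own sketch, not a figure from EPRI, OpenAI or the IEA – that simply multiplies the per-query estimate by the reported daily query volume:

# Illustrative estimate only: combines EPRI's ~2.9 Wh-per-query figure with
# the reported "more than a billion" ChatGPT queries per day in early 2025.
wh_per_query = 2.9                     # watt-hours per ChatGPT query (EPRI estimate)
queries_per_day = 1_000_000_000        # roughly a billion queries per day

daily_gwh = wh_per_query * queries_per_day / 1e9   # watt-hours -> gigawatt-hours
annual_twh = daily_gwh * 365 / 1_000               # gigawatt-hours -> terawatt-hours

print(f"~{daily_gwh:.1f} GWh per day, ~{annual_twh:.1f} TWh per year")
# Prints: ~2.9 GWh per day, ~1.1 TWh per year

On those assumptions, serving text queries alone works out at roughly a terawatt hour a year – a small slice of the 945 terawatt hours the IEA projects for data centres by 2030, but one that excludes training, cooling and the newer video, image and audio workloads.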
In early 2025, around 8 per cent of US adults were using ChatGPT as their primary search engine. That’s still a fraction of the number who use Google, but given that the figure has grown from just 1 per cent in June 2024, it underscores the rapid shift towards AI-powered tools.
There’s also the changing face of AI to consider. Current queries are mostly limited to text-based interactions, but emerging AI video, image and audio applications have little precedent to measure against and are likely to be even hungrier for energy.
“When we think about the environmental impact of generative AI, it is not just the electricity you consume when you plug the computer in,” says Elsa A. Olivetti, professor in the Department of Materials Science and Engineering at MIT. “There are much broader consequences that go out to a system level and persist based on actions that we take.”
Can AI really offset its own emissions?
According to the IEA, concerns that AI could accelerate climate change are ‘overstated’. It says that, despite this growth, emissions from data centres will still be a small fraction of the world’s total energy-related emissions – an estimated 1.5 per cent.
It further argues that widespread adoption of AI could make a host of activities more efficient, reducing emissions in other areas. These savings could come from the optimisation of industrial processes, scientific research and technological innovation.
The IEA estimates that the broad application of existing AI-led solutions could lead to emissions reductions of up to 5 per cent by 2035. It claims this will offset the increase in emissions generated by data centre demand.
A separate report from Energy Intelligence predicted a doubling of energy demand, but also framed AI as a key enabler of the clean energy transition.
It cited smarter grid management, cost reduction in low-carbon technologies and enhanced integration of renewables as benefits AI could bring. It further argued that advances in processor efficiency, cooling technologies, and algorithm optimisation will ultimately curb AI’s high energy demands.
Although the IEA report looks favourably on the future of AI and its climate impact, it notes that this outcome is not automatic.
“It is vital to note that there is currently no momentum that could ensure the widespread adoption of these AI applications,” the report states. “Therefore, their aggregate impact, even in 2035, could be marginal if the necessary enabling conditions are not created.”
Realising AI’s potential will require concerted action on multiple fronts. In particular, the report notes the positive impact AI could have on the energy industry through the optimisation of grids and distribution, an area in which AI is woefully underused at present.
It also admits that investment in low-carbon electricity generation is crucial, particularly when it comes to supplying energy-hungry data centres.
Some players are making strides in this area. Amazon is the largest corporate buyer of renewable energy worldwide. It says that over 90 per cent of its operations, including its Amazon Web Services data centres, are already powered by renewables.
Digital Realty, which operates more than 300 data centres worldwide, has also committed to renewable energy. Today, 100 per cent of its European portfolio’s energy needs are matched with renewable energy purchases.
But it’s not easy going green in the data centre business. The intermittency of renewable energy sources presents a challenge, as do geographical limitations, which may impact the availability of clean energy sources.
The role of the US in AI’s future climate impacts
With most of the world’s biggest data centres located in the US, that is where the largest growth in energy demand will be seen.
By the end of the decade, US data centres are projected to consume more electricity than the production of aluminium, concrete, chemicals and all other energy-intensive goods combined, according to the IEA report.
Today, US data centres rely largely on fossil fuels, mainly natural gas. The IEA doesn’t see this changing, particularly with the current administration’s focus on dirty fuels.
Just this week, President Donald Trump signed an executive order instructing cabinet members to identify regions where coal-powered infrastructure can support AI data centres.
In the state of Louisiana, plans are already in place to construct a large-scale gas power plant specifically to cater to a massive new data centre being built by Meta.
The IEA’s report presents a scenario that will only be achievable with concerted effort and political support. Depending on the priorities at the time, it’s just as likely that AI could be used to find new oil and gas reserves as to detect methane leaks or optimise grids.
The notion of AI ‘offsetting’ its own emissions also needs to be taken in context. Carbon dioxide stays in the atmosphere for hundreds of years, so even if AI does eventually find ways to cut more emissions than it produces, it won’t cancel out the damage done along the way.
“The widespread adoption of existing AI applications could lead to emissions reductions that are far larger than emissions from data centres – but also far smaller than what is needed to address climate change,” the report concludes.