
What's the environmental impact of an AI query? A definitive directory

All the energy, water and emissions data shared by Big Tech and nonprofits to date.
Melodie Michel

There is currently no consensus on the environmental impact of AI – but until one emerges, this guide gathers all the water use, power consumption and emissions data shared by tech firms, nonprofits and the International Energy Agency in one place.

This is an evolving directory of all the sources that have shared AI environmental impact data so far, along with the specific information they shared. It’s meant to help readers navigate this complex environment, where each company uses a different framework to calculate emissions and green operations (GreenOps) experts question every single claim – while nonprofits warn that it will take time to come up with a definitive number.

For Chief Sustainability Officers, this guide can help deepen their understanding of AI and how using it may affect a company’s sustainability objectives. It will also strengthen the case for taking hyperscaler claims with a pinch of salt.

For a more in-depth analysis of AI and emissions, we recommend reading these previous CSO Futures articles:

AI environmental impact data provided by tech firms

Amazon

No data on an AI query’s impact shared as of November 2025.

In its latest sustainability report, Amazon’s cloud service AWS reported a global Power Usage Effectiveness (PUE) of 1.15 – the ratio of a data centre’s total energy use to the energy consumed by its computing equipment, and a standard measure of data centre energy efficiency. That is lower than the public cloud industry average of 1.25 and the 1.63 reported for on-premises enterprise data centres.
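
As a rough illustration of how a PUE figure is derived, here is a minimal sketch – the facility and server energy numbers are placeholders chosen to produce a 1.15 ratio, not AWS disclosures:

```python
# Illustrative PUE calculation. The energy figures below are placeholders,
# not AWS data; PUE = total facility energy / energy used by IT equipment,
# so a value of 1.0 would mean every watt goes to computing.

def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    return total_facility_kwh / it_equipment_kwh

# A facility drawing 1,150 kWh overall while its servers consume
# 1,000 kWh reports a PUE of 1.15.
print(round(pue(1_150, 1_000), 2))  # 1.15
```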

However, the company’s operational emissions increased by 182% between 2020 and 2023 as a result of data centre and AI deployment.

In October 2025, a Guardian investigation revealed that Amazon had “strategised about keeping the public in the dark over the true extent of its data centres’ water use”, and that its total water footprint in 2021 was 397 billion litres – as much as 958,000 US households.

Google

Google released data on the environmental impact of its generative AI tool Gemini in August 2025, along with an explanation of its methodology. 

It states that the median Gemini Apps text prompt uses:

  • 0.24 watt-hours (Wh) of energy and 
  • 0.26 millilitres (or about five drops) of water, and emits
  • 0.03 grams of CO2e
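
To give a sense of scale, the sketch below multiplies those per-prompt figures over a hypothetical one million prompts a day – the volume is an assumption for illustration, not a Google disclosure:

```python
# Scale Google's reported per-prompt figures to a hypothetical workload.
# Per-prompt values are from Google's August 2025 disclosure; the daily
# prompt volume is an assumed, illustrative number.

ENERGY_WH_PER_PROMPT = 0.24    # watt-hours
WATER_ML_PER_PROMPT = 0.26     # millilitres
EMISSIONS_G_PER_PROMPT = 0.03  # grams CO2e

prompts_per_day = 1_000_000    # hypothetical volume

print(f"Energy:    {ENERGY_WH_PER_PROMPT * prompts_per_day / 1_000:,.0f} kWh per day")
print(f"Water:     {WATER_ML_PER_PROMPT * prompts_per_day / 1_000:,.0f} litres per day")
print(f"Emissions: {EMISSIONS_G_PER_PROMPT * prompts_per_day / 1_000:,.0f} kg CO2e per day")
```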

Meta

No data shared as of November 2025.

Meta’s operational emissions increased by 145% between 2020 and 2023 as a result of data centre and AI deployment.

Microsoft

No data shared as of November 2025.

Microsoft’s operational emissions increased by 155% between 2020 and 2023 as a result of data centre and AI deployment.

Mistral AI

Mistral AI shared data on the lifecycle emissions of its AI model in July 2025, after an analysis conducted in collaboration with Carbone 4, a leading consultancy in CSR and sustainability, and the French ecological transition agency (ADEME).

It says that one query to its AI assistant uses:

  • 0.05 litres of water 
  • 0.2 milligrams of Sb eq (antimony equivalent, the measure used for resource depletion), and emits
  • 1.14 grams of CO2 equivalent

The company did not disclose the energy use of an AI query.

OpenAI

OpenAI CEO Sam Altman released data about the power usage of the average ChatGPT query in a June 2025 blog post:

  • 0.34 Wh of electricity

However, OpenAI is yet to share any data on water use or emissions, whether at company level or for individual AI queries.

AI environmental impact data estimated by other sources

AI Energy Score

In February 2025, Salesforce teamed up with academic and technology leaders to develop a tool allowing AI developers and users to compare the energy efficiency of different AI models.

The AI Energy Score allows users to find energy consumption data for a range of AI queries and models, calculated through a standardised framework for measuring and comparing AI model energy efficiency.

According to the platform (visited in October 2025), a text query on Meta’s Llama 2 model used 16.44 Wh of electricity, while the same request on ChatGPT used 2.7 Wh.

Epoch AI

In February 2025, AI research institute Epoch AI calculated that a ChatGPT query used an average of:

  • 0.3 Wh of electricity

International Energy Agency

In 2024, the International Energy Agency estimated that a ChatGPT query used almost 10 times as much electricity as a standard Google search (which it put at 0.3 Wh):

  • 2.9 Wh of electricity

This figure has been widely contested since, including by Epoch AI, which used a similar methodology but with updated data in 2025.

The IEA has not updated its research on AI queries, but in 2025 it shared new insights projecting that electricity consumption by accelerated servers – a segment mainly driven by AI adoption – will grow by 30% annually, and that global data centre electricity consumption will roughly double by 2030 to reach around 945 TWh, just under 3% of total global electricity consumption.
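
Taking the “double to around 945 TWh” statement at face value, a quick compounding check shows how the overall data centre growth rate compares with the 30% figure for accelerated servers. A minimal sketch, assuming a roughly six-year horizon and a baseline of about half the 2030 figure (both inferred from the projection, not quoted IEA numbers):

```python
# Back out the annual growth rate implied by data centre electricity
# demand roughly doubling to ~945 TWh by 2030. The baseline is inferred
# from "will double" (about half of 945 TWh) and the six-year horizon
# is an assumption; neither is a figure quoted by the IEA here.

target_twh_2030 = 945
baseline_twh = target_twh_2030 / 2   # implied by "will double"
years = 6                            # assumed horizon, roughly 2024 -> 2030

implied_growth = (target_twh_2030 / baseline_twh) ** (1 / years) - 1
print(f"Implied annual growth, all data centres: {implied_growth:.1%}")  # ~12%
# The 30% annual figure applies only to accelerated (AI) servers,
# the fastest-growing slice of that total.
```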

International Telecommunication Union x World Benchmarking Alliance

In their Greening Digital Companies 2025 report, ITU and the World Benchmarking Alliance provided insights on the energy use of AI by looking at the sustainability reports of tech companies. While the report does not look at the energy use, water consumption and emissions of an AI query, it states: “Data from company reports show an increasing trend in operational emissions for companies with a high level of AI adoption. This evidence coincides with a larger demand for and investment in data centres.” 

Analysing operational emissions (using location-based Scope 2 figures) from 2021 to 2023 as a proportion of 2020 emissions for Amazon, Microsoft, Google and Meta – all significant suppliers or users of AI – the report shows that operational emissions from these companies were, on average, 150% higher in 2023 than in 2020. 
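
For a rough cross-check against the individual company figures cited earlier in this article (Google’s 2020–2023 increase is not quoted here, and the report’s average may be calculated differently, so an exact match is not expected):

```python
# Rough consistency check of the ITU/WBA finding that operational
# emissions were on average ~150% higher in 2023 than in 2020.
# The three increases below are the ones cited earlier in this article;
# Google's is not quoted here, so the mean covers only three of the
# four companies in the report.

increases_pct = {"Amazon": 182, "Meta": 145, "Microsoft": 155}
mean_of_quoted = sum(increases_pct.values()) / len(increases_pct)
print(f"Mean of the three quoted increases: {mean_of_quoted:.0f}%")  # ~161%
```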

Attempting to compare the data

Before bringing all the different sources together in a table, it’s important to understand that comparison remains difficult, as each source uses different calculations and measures to come up with a number. Some, like Mistral AI, include the lifecycle emissions of producing servers and data centres in their final number, while others only count the energy and water used at the moment the query is issued.

One of the most energy-intensive aspects of AI lies in training the large language models that respond to queries (the energy consumed to train the GPT-3 model has been estimated at around 1,064 MWh) – yet most tech firms don’t include that in their estimates.
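
To see why excluding training can change the picture, the sketch below amortises that training estimate over a model’s lifetime query volume – the query counts are hypothetical assumptions, since no provider has published one:

```python
# Spread GPT-3's estimated ~1,064 MWh training energy (cited above)
# over different hypothetical lifetime query volumes to see how much
# it would add per query. The volumes are assumptions for illustration.

TRAINING_ENERGY_WH = 1_064 * 1_000_000  # 1,064 MWh expressed in Wh

for lifetime_queries in (1_000_000_000, 10_000_000_000, 100_000_000_000):
    per_query_wh = TRAINING_ENERGY_WH / lifetime_queries
    print(f"{lifetime_queries:>15,} queries -> +{per_query_wh:.4f} Wh per query")

# Over enough queries the amortised training cost shrinks well below the
# ~0.3 Wh per-query figures above; over fewer queries it can dominate.
```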

Finally, emissions calculations are highly dependent on the location of data centres used for a specific query, and how clean the grid is in that region – making it difficult to generalise.
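
A minimal sketch of that grid dependence, converting one fixed per-query energy figure (OpenAI’s reported 0.34 Wh) into emissions under different grid carbon intensities – the intensity values are rounded ballpark figures, not numbers from any of the companies above:

```python
# Emissions per query = energy per query x grid carbon intensity.
# Energy figure: OpenAI's reported 0.34 Wh per ChatGPT query (cited above).
# Grid intensities are rough, illustrative values in gCO2e per kWh.

energy_kwh_per_query = 0.34 / 1_000  # 0.34 Wh converted to kWh

grid_intensities = {
    "low-carbon grid": 50,    # e.g. hydro/nuclear-heavy
    "average grid": 400,
    "coal-heavy grid": 800,
}

for label, g_per_kwh in grid_intensities.items():
    print(f"{label}: {energy_kwh_per_query * g_per_kwh:.3f} gCO2e per query")
```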

With that disclaimer in mind, CSO Futures has gathered the different data available on the environmental impact of a single text query in the table below.


| Per text query     | Google     | Mistral AI   | OpenAI  | Epoch AI | IEA    | AI Energy Score |
|--------------------|------------|--------------|---------|----------|--------|-----------------|
| Energy use         | 0.24 Wh    | No data      | 0.34 Wh | 0.3 Wh   | 2.9 Wh | 1.31 to 28.7 Wh |
| Water use          | 0.26 ml    | 50 ml        | N/A     | N/A      | N/A    | N/A             |
| Emissions          | 0.03 gCO2e | 1.14 gCO2e   | N/A     | N/A      | N/A    | N/A             |
| Resource depletion | N/A        | 0.2 mg Sb eq | N/A     | N/A      | N/A    | N/A             |
| LCA included       | No         | Yes          | No      | No       | No     | No              |

Help us keep this page updated: Share any feedback or new data at melodie@csofutures.com