According to a report from Google, the ecological toll of Gemini isn’t that bad. Experts, however, say Google presents the figures more favorably than reality warrants.
The energy and water consumption of (AI) data centers has long been a topic of discussion. Google wants to shed the energy-guzzler reputation of its Gemini models: a technical paper aims to convince the world that the environmental impact of AI is not as drastic as experts claim.
According to the report, a “median” text prompt in Gemini consumes approximately 0.24 Wh of electricity, equivalent to running your television for nine seconds. This is accompanied by 0.03 g of CO₂ emissions. Gemini consumes about five drops of water per prompt, or 0.26 ml.
Operational Footprint
Google pats itself on the back in a blog post, claiming that Gemini has become far more energy-efficient: energy consumption per prompt is 33 times lower than twelve months ago, and the total carbon footprint is 44 times lower. According to Google, this result was achieved through a combination of more efficient models, algorithms, inference, software, and hardware in data centers.
It’s important to note that Google uses its own calculation method. Google claims to look at the “actual operational footprint”, which takes various factors into account:
- Full system power: Google measures not only the energy of the AI model itself but also the effective chip utilization in production. In practice, this utilization is often lower than the theoretical maximum, meaning that part of the capacity remains unused but still consumes energy.
- Idle capacity: To ensure availability and reliability, Google must always keep extra capacity ready for peak loads or failover. These inactive but running machines consume power and are counted in the total footprint.
- CPU and RAM: AI doesn’t run solely on accelerators. The host CPUs and RAM memory also support execution and therefore consume energy, which is included in the calculation.
- Data center overhead: Besides the IT equipment itself, the infrastructure (cooling, power distribution, …) also requires energy. Google uses the international standard metric PUE (Power Usage Effectiveness) to account for this overhead.
- Water consumption: Data centers often use water for cooling to reduce energy consumption and emissions.
Google claims this provides a more realistic picture of AI models’ energy consumption than looking only at theoretical efficiency. This approach helps avoid underestimation.
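As a rough illustration of how the factors above stack up, here is a sketch in Python. Every input value is an invented assumption for illustration, not a number from Google’s paper, though with these assumed inputs the result happens to land near the reported 0.24 Wh:

```python
# Sketch of the "actual operational footprint" factors listed above.
# All input values are invented assumptions, not figures from Google's paper.

def energy_per_prompt_wh(chip_wh, utilization, host_wh, idle_share, pue):
    """Estimate total energy per prompt in Wh.

    chip_wh     -- accelerator energy for the model itself
    utilization -- effective chip utilization in production (< 1.0)
    host_wh     -- host CPU and RAM energy supporting execution
    idle_share  -- fraction of extra capacity kept running for failover
    pue         -- data-center overhead (cooling, power distribution, ...)
    """
    it_energy = chip_wh / utilization + host_wh  # real utilization + host energy
    it_energy *= 1 + idle_share                  # idle/reserved machines
    return it_energy * pue                       # facility overhead (PUE)

# Assumed, illustrative inputs:
e = energy_per_prompt_wh(chip_wh=0.10, utilization=0.6, host_wh=0.03,
                         idle_share=0.1, pue=1.1)
print(round(e, 3))  # prints 0.238
```

The point of the sketch is that each correction (real utilization, host hardware, idle capacity, PUE) multiplies the headline chip energy upward, which is why Google argues this is the honest way to count.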
More Water
However, experts disagree with Google’s figures, which they say still severely underestimate reality. The Verge consulted academics cited in Google’s paper. The scientists raise several concerns about Google’s methodology.
Shaolei Ren, professor of computer engineering at the University of California, Riverside, says Google is withholding crucial information. Gemini’s water consumption is calculated based on the amount of water needed for cooling systems in data centers. While that’s a significant source of water usage in data centers, AI also indirectly consumes water.
The electricity consumed by these data centers comes in part from gas and nuclear power plants, which in turn use large amounts of water. According to Ren, if you include this indirect consumption, you end up with roughly 50 ml of water per prompt.
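A back-of-the-envelope calculation shows why the indirect share matters even at Google’s own energy figure. The water intensity of electricity generation below is an assumed round number, not a value from Ren’s analysis; his full 50 ml estimate rests on inputs the article does not detail:

```python
# Back-of-the-envelope: indirect water from electricity generation can
# exceed Google's direct cooling figure. The water intensity is an
# assumed illustrative value, not taken from Ren's analysis.

direct_ml = 0.26        # Google's cooling-water figure per prompt
energy_wh = 0.24        # Google's energy figure per prompt
water_l_per_kwh = 2.0   # assumed water intensity of power generation

# Unit check: 1 Wh x 1 L/kWh = 1 ml, so the product is already in ml.
indirect_ml = energy_wh * water_l_per_kwh
print(indirect_ml)  # prints 0.48
```

Even with these modest assumptions, the indirect water alone exceeds the 0.26 ml Google reports, which is the core of Ren’s objection.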
Google, in turn, does not agree with this. In a statement, it calls UCR’s conclusion and research “flawed”. “Our goal is to establish a clear, measurable, and repeatable methodology to compare the operational footprint of AI. General methodologies and assumptions are suitable for high-level estimates, but obscure precise benchmarking. We encourage the researchers to also consult an expert to get an informed perspective,” Google retorts.
Comparing Apples to Oranges
Alexander De Vries-Gao, a doctoral student in environmental studies at the University of Amsterdam, accuses Google of being selective in the metrics used to measure energy consumption. Google only accounts for market-based emissions. This method counts the renewable-energy certificates and contracts a company buys to support green energy, but says little about the actual energy mix the company consumes.
Only location-based emissions provide the complete picture. This method shows how much CO₂ is actually released based on where a data center draws power. This prevents a company from offsetting local emissions by financing green energy elsewhere. Location-based emissions are typically higher and thus provide a more honest picture of a data center’s local environmental impact.
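The gap between the two accounting methods can be shown with a toy calculation; all numbers below are invented for illustration:

```python
# Market-based vs. location-based emissions, with invented numbers.

energy_mwh = 100.0        # electricity drawn by the data center
grid_tco2_per_mwh = 0.4   # carbon intensity of the local grid (assumed)
green_certs_mwh = 80.0    # renewable certificates bought elsewhere

# Location-based: what the local grid actually emitted for this load.
location_based = energy_mwh * grid_tco2_per_mwh

# Market-based: certificates offset most of the load on paper only.
market_based = (energy_mwh - green_certs_mwh) * grid_tco2_per_mwh

print(location_based, market_based)  # location-based is 5x higher here
```

The local grid emits the same CO₂ either way; the certificates only change the paper figure, which is De Vries-Gao’s point.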
Finally, both academics dispute Google’s claim that its calculation is more accurate than previous academic work. Google reports a “median” prompt, while previous studies are based on averages. That’s comparing apples to oranges, because the median is less sensitive to outliers and will typically be lower for skewed workloads. Nor does Google clarify what it considers a median Gemini prompt.
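How much a few heavy prompts can separate mean from median is easy to demonstrate; the data below is invented for illustration:

```python
# Mean vs. median energy per prompt: a few heavy outliers inflate the
# mean but leave the median untouched. Data invented for illustration.

prompts_wh = [0.2] * 95 + [5.0] * 5   # 95 typical prompts, 5 heavy ones

mean_wh = sum(prompts_wh) / len(prompts_wh)
median_wh = sorted(prompts_wh)[len(prompts_wh) // 2]

print(mean_wh, median_wh)  # the mean is more than double the median
```

A “median prompt” figure can therefore look much better than an “average prompt” figure over the exact same traffic, which is why the two cannot be compared directly.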
Google’s paper will not dispel concerns about AI’s environmental impact. The company itself also admits that emissions have risen sharply over the past five years. The AI ambitions of Google and other companies come at a high price.
This article originally appeared on August 21 and was updated with a statement from Google.