Mistral AI Reveals Environmental Impact of its Largest AI Model


French start-up Mistral AI is the first to map the ecological footprint of an LLM.

Anyone using generative AI knows it consumes a great deal of energy and water, but the extent of that consumption is often unclear. In an effort to increase transparency, Mistral AI has now, for the first time, shared detailed figures on the environmental impact of its Mistral Large 2 model.

Training Consumes the Most Energy

According to a new report by Mistral, 85.5 percent of the CO₂ emissions and 91 percent of the water consumption occur during the training and execution of the 123-billion-parameter model.

In total, about 20,000 tons of CO₂ were emitted and 281,000 cubic meters of water consumed, the equivalent of 112 Olympic swimming pools. Material consumption was also mapped, with a significant share coming from the electricity infrastructure behind data centers, including wind turbines, solar panels, and even coal-fired power plants.
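The pool comparison can be sanity-checked with a quick conversion. Note that the 2,500 m³ pool volume is an assumption based on the standard minimum Olympic pool dimensions (50 m × 25 m × 2 m); the report itself only states the total water figure.

```python
# Sanity check: convert the reported water consumption into Olympic-pool
# equivalents. The 2,500 m3 pool volume is our assumption (the standard
# minimum pool size), not a figure from Mistral's report.
WATER_M3 = 281_000   # total water consumption reported, in cubic meters
POOL_M3 = 2_500      # assumed volume of one Olympic swimming pool

pools = WATER_M3 / POOL_M3
print(f"{pools:.1f} Olympic pools")  # prints "112.4 Olympic pools"
```

Rounding 112.4 down gives the "112 pools" figure cited in the report.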

Smaller Models, Smaller Impact

During inference, consumption is significantly lower: generating a page of text is estimated to cost 45 ml of water and 1.14 grams of CO₂. Multiplied across billions of requests, however, those small per-page costs add up considerably.
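To illustrate how the per-page figures accumulate, the sketch below scales them to a hypothetical volume of one billion generated pages. That request volume is purely illustrative and not a figure from Mistral's report.

```python
# Scale the per-page inference costs (45 ml water, 1.14 g CO2) to a
# hypothetical one billion pages. The page count is an illustrative
# assumption, not a number from the report.
ML_PER_PAGE = 45.0        # water per generated page, in millilitres
G_CO2_PER_PAGE = 1.14     # CO2 per generated page, in grams
PAGES = 1_000_000_000     # hypothetical number of generated pages

water_m3 = ML_PER_PAGE * PAGES / 1_000_000    # ml -> cubic meters
co2_tons = G_CO2_PER_PAGE * PAGES / 1_000_000 # g -> metric tons

print(f"{water_m3:,.0f} m3 of water, {co2_tons:,.0f} t of CO2")
# prints "45,000 m3 of water, 1,140 t of CO2"
```

At that volume, inference alone would consume a meaningful fraction of the water and CO₂ attributed to training, which is why the report weighs the two against each other over a model's lifespan.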

According to Mistral, smaller models can make a difference here: they require less training, run more efficiently, and often deliver better results for specific tasks. Additionally, intelligently batching AI requests helps prevent waste.

Call for Transparency

Mistral calls on other AI companies to report their environmental impact according to clear, internationally recognized standards. Only then can companies and governments make informed decisions. The report itself highlights three crucial points: the impact of training, the cost of inference, and the balance between the two over the lifespan of a model.