When AI Becomes the Solution for AI: Optimization, Efficiency, and Sustainability in Data Centers

AI solving AI. It sounds strange, but for Schneider Electric it is the key to making data centers run more efficiently and sustainably.

The explosive rise of AI applications is transforming the playing field for data centers. These systems draw enormous power and produce intense heat, leading to increasingly demanding requirements for energy management and cooling.

And, ironically, AI itself could be the key to radically improving the efficiency and sustainability of that infrastructure.

During a visit to the Start Campus data center site in Portugal, which uses innovative seawater cooling, we see Schneider Electric’s latest innovations at work. There, too, the demand for more data center capacity is growing louder as AI applications expand in use (and in power consumption).

The Challenge: AI Demands Power and Emits Heat

AI compute generates exceptionally high thermal loads, especially when training large models. Traditional air cooling reaches its limits at 50 kilowatts (kW) per rack. With AI, these values can easily rise to 150 kW or more. That’s why liquid cooling has become essential to prevent overheating and guarantee performance.
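
To get a feel for these numbers, a quick back-of-the-envelope calculation shows how much coolant it takes to carry away a rack’s heat. The sketch below is purely illustrative: the temperature rises and fluid properties are textbook assumptions, not design values from Schneider Electric or Start Campus.

```python
# Rough back-of-the-envelope: coolant flow needed to remove a rack's heat load.
# Q = m_dot * cp * dT  ->  m_dot = Q / (cp * dT)
# Illustrative values only; not Schneider Electric's design figures.

CP_WATER = 4186.0   # J/(kg*K), specific heat of water
CP_AIR = 1005.0     # J/(kg*K), specific heat of air
RHO_AIR = 1.2       # kg/m^3, air density at room conditions

def water_flow_l_per_min(heat_w: float, delta_t_k: float = 10.0) -> float:
    """Water flow (roughly 1 kg per litre) needed to absorb heat_w at a delta_t_k rise."""
    return heat_w / (CP_WATER * delta_t_k) * 60.0

def air_flow_m3_per_s(heat_w: float, delta_t_k: float = 15.0) -> float:
    """Air volume flow needed to absorb heat_w at a delta_t_k rise."""
    return heat_w / (CP_AIR * delta_t_k) / RHO_AIR

for rack_kw in (50, 150):
    q = rack_kw * 1000.0
    print(f"{rack_kw} kW rack: ~{water_flow_l_per_min(q):.0f} L/min of water "
          f"or ~{air_flow_m3_per_s(q):.1f} m^3/s of air")
```

At 150 kW per rack, the air volume required becomes impractical to move through a rack, which is why the article’s point about liquid cooling holds.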

Marc Garner, SVP Secure Power Europe at Schneider Electric: “The data centers of the future must be designed to support 200 kW per rack today, with the flexibility to handle perhaps 600 kW or even 800 kW tomorrow.”

A liquid-based system also requires constant adjustment: even small temperature fluctuations can hurt GPU performance or trigger thermal throttling. That’s why the classic ‘set and forget’ approach is inadequate in the AI era, according to Garner.

AI as the Director of Cooling

Using AI to cool AI infrastructure: the idea might sound paradoxical, but that is precisely where its strength lies. Advanced algorithms can continuously analyze measurements such as temperature, flow rate, and load, and adjust cooling parameters in real time.
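
To make that concrete, here is a minimal sketch of what one such adjustment step could look like, assuming a simple proportional correction of the coolant flow setpoint. The sensor fields, target temperature, gain, and limits are illustrative assumptions, not Schneider Electric’s actual control logic.

```python
# Minimal sketch of a telemetry-driven cooling adjustment step.
# All names, setpoints, and gains are assumptions for illustration.
from dataclasses import dataclass

@dataclass
class Telemetry:
    gpu_temp_c: float        # hottest GPU temperature in the rack
    coolant_flow_lpm: float  # current coolant flow, litres per minute
    it_load_kw: float        # current IT load

TARGET_GPU_TEMP_C = 70.0
MIN_FLOW_LPM, MAX_FLOW_LPM = 60.0, 240.0
GAIN_LPM_PER_DEG = 5.0       # extra flow per degree of temperature error

def adjust_flow(t: Telemetry) -> float:
    """Return a new coolant flow setpoint from the latest telemetry sample."""
    error = t.gpu_temp_c - TARGET_GPU_TEMP_C
    new_flow = t.coolant_flow_lpm + GAIN_LPM_PER_DEG * error
    return max(MIN_FLOW_LPM, min(MAX_FLOW_LPM, new_flow))

# One control tick: the rack runs slightly hot, so flow is nudged upward.
sample = Telemetry(gpu_temp_c=73.5, coolant_flow_lpm=150.0, it_load_kw=120.0)
print(f"new flow setpoint: {adjust_flow(sample):.0f} L/min")
```

In practice such a loop would run continuously against live telemetry, and a learned model could replace the fixed gain with load predictions.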

Pankaj Sharma, EVP Secure Power Division at Schneider Electric, provides additional context: “AI for energy and energy for AI — that’s the equation we need to solve repeatedly. Yes, AI consumes a lot, but the same technology can help us drastically reduce that consumption.” Schneider has been preaching this vision since its Innovation Summit in Paris last year.

By integrating AI into the control model, a data center can tune itself. This includes reducing flow rates where possible, shutting down cooling elements when they’re not needed, or fine-tuning pump and fan control based on predictions.
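
As a rough illustration of that kind of self-tuning, the sketch below stages cooling pumps against a predicted heat load, keeping one spare unit online for redundancy. The per-pump capacity, headroom, and forecast values are assumptions made up for this example, not figures from Start Campus or Schneider Electric.

```python
# Sketch of prediction-based pump staging: run only as many pumps as the
# forecast heat load needs, plus one unit of headroom. Capacities and the
# forecast are illustrative assumptions.
import math

PUMP_CAPACITY_KW = 300.0   # heat each pump/CDU can reject (assumed)
HEADROOM_UNITS = 1         # keep one spare unit online for redundancy

def pumps_needed(forecast_load_kw: float) -> int:
    """Number of pumps to keep running for the predicted load."""
    return math.ceil(forecast_load_kw / PUMP_CAPACITY_KW) + HEADROOM_UNITS

# A hypothetical model predicts load dropping overnight as training jobs finish.
hourly_forecast_kw = [1400, 1250, 900, 620, 480]
for hour, load in enumerate(hourly_forecast_kw):
    print(f"hour {hour}: forecast {load} kW -> keep {pumps_needed(load)} pumps online")
```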

Strategic Focus: End-to-End AI Infrastructure

For Schneider Electric, this isn’t just a technological experiment, but a central pillar of its offering. Its portfolio for AI data centers includes ‘Grid to Chip’ solutions covering power supply, power quality, and heat dissipation.

Robert Dunn, CEO of the Start Campus project, emphasized the scale and ambition: “What we’re building here is Europe’s largest and most sustainable data ecosystem. With a PUE of 1.1, zero water consumption, and a capacity of 1.2 gigawatts, we’re setting a new standard for the industry.”
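
For context: PUE (power usage effectiveness) is total facility power divided by the power that reaches the IT equipment, so a PUE of 1.1 means roughly 10 percent overhead for cooling, power distribution, and everything else. The snippet below simply spells out that arithmetic with an assumed IT load.

```python
# PUE (power usage effectiveness) = total facility power / IT equipment power.
# Only the PUE of 1.1 comes from the article; the loads below are assumptions.
def pue(total_facility_kw: float, it_load_kw: float) -> float:
    return total_facility_kw / it_load_kw

it_load_kw = 1000.0    # hypothetical IT load
overhead_kw = 100.0    # cooling, power distribution, lighting, etc.
print(pue(it_load_kw + overhead_kw, it_load_kw))  # -> 1.1
```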

Collaborations with GPU manufacturers ensure these designs are future-proof. Garner: “We have strong relationships with suppliers like Nvidia, not just to support their current roadmap, but also to prepare our own infrastructure for tomorrow’s workloads.”

The acquisition of cooling specialist Motivair earlier this year also fits this ambition. Together with Motivair, Schneider Electric recently showcased its end-to-end offering for AI data center infrastructure.

Conclusion: AI as a Lever for AI Infrastructure

The AI boom does not come without consequences for energy infrastructure. According to Schneider’s Sustainability Research Institute, AI could account for 20 to 50 percent of the growth in US electricity consumption between 2025 and 2030.

AI-based cooling optimization is more than a curiosity today: it forms the nervous system that makes such advanced systems efficient. The next generation of data centers must not only pack heavy hardware, but also be managed intelligently.

By integrating AI into cooling and energy systems, Schneider Electric believes, a data center can fine-tune itself, smooth out peaks, and minimize power loss. Those characteristics are exactly what is needed at a time when AI itself is placing an ever-heavier burden on energy grids.

As Sharma summarizes: “We must not only think about how much power AI demands, but also how much AI can give back to our energy story.”