Imec unveils imec.kelis: an analysis tool for designing and optimizing AI data center architectures.
The Belgian research center imec introduces a new analysis tool to support the design and optimization of AI data centers. The tool, imec.kelis, targets large AI workloads such as LLM training and will be available for licensing from early 2026.
Simulation Model for Large-Scale AI
Imec.kelis was presented at Supercomputing 2025, an international conference and trade show for high-performance computing. The tool offers a modeling framework that allows system architects to assess performance across the various layers of infrastructure, from chip to data center. According to imec, the method is fast, transparent, and accurate, offering an alternative to simulation systems that are either slow or limited in scope.
The tool is specifically tailored for AI workloads, such as training and running large language models (LLMs). Imec.kelis analyzes compute, communication, and memory systems, allowing for comparisons of different architectural choices. This enables developers to balance performance, energy consumption, and costs.
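Imec has not published kelis's internals, but the kind of analysis described, weighing compute against memory and communication limits, can be illustrated with a toy roofline-style estimator. All names and numbers below are hypothetical and not taken from imec or any vendor:

```python
from dataclasses import dataclass

@dataclass
class GpuSpec:
    """Hypothetical accelerator parameters (illustrative only)."""
    peak_flops: float      # FLOP/s
    mem_bandwidth: float   # bytes/s
    net_bandwidth: float   # bytes/s per GPU

def step_time(flops: float, mem_bytes: float, comm_bytes: float,
              gpu: GpuSpec) -> float:
    """Roofline-style lower bound on one training step: the step
    cannot finish faster than its most saturated resource."""
    return max(flops / gpu.peak_flops,
               mem_bytes / gpu.mem_bandwidth,
               comm_bytes / gpu.net_bandwidth)

# Compare two made-up configurations on the same per-step workload.
workload = dict(flops=1e15, mem_bytes=2e12, comm_bytes=5e11)
fast_net = GpuSpec(peak_flops=3e14, mem_bandwidth=2e12, net_bandwidth=4e11)
slow_net = GpuSpec(peak_flops=3e14, mem_bandwidth=2e12, net_bandwidth=1e11)

print(step_time(**workload, gpu=fast_net))  # limited by compute
print(step_time(**workload, gpu=slow_net))  # limited by the network
```

Even this crude bound shows how an architectural choice (here, network bandwidth) shifts which resource dominates a step, which is the type of comparison the article attributes to the tool.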
The model was validated on Nvidia’s A100 and H100 GPUs, showing an error margin of less than 12 percent in predictions for large-scale LLM tasks.
Focus on Data Center Optimization and Scalability
Imec.kelis includes various modules, such as an LLM task analyzer, a topology-aware communication layer, and an interactive dashboard environment for exploring design options. According to imec, the system supports both architecture exploration and future technology projection.
In a test scenario, imec used the tool to compare the training time of GPT-3 across different GPU architectures and cluster sizes. Performance was also weighed against cost, providing insight into the efficiency of the various configurations.
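The cost-versus-time trade-off described above can be sketched with a simple scaling model. The parallel-efficiency formula and the prices below are illustrative assumptions, not imec's methodology or measured data:

```python
import math

def training_time_h(base_time_h: float, n_gpus: int,
                    efficiency: float = 0.9) -> float:
    """Single-GPU time divided by an imperfect speedup: each doubling
    of the GPU count retains `efficiency` of the ideal gain
    (a common simplification for communication overhead)."""
    speedup = n_gpus * efficiency ** math.log2(n_gpus)
    return base_time_h / speedup

def run_cost_usd(n_gpus: int, hourly_rate_usd: float, time_h: float) -> float:
    """Total rental cost of one training run."""
    return n_gpus * hourly_rate_usd * time_h

# Sweep hypothetical cluster sizes for a 10,000 GPU-hour workload
# at an assumed $3/GPU-hour rate.
for n in (64, 256, 1024):
    t = training_time_h(10_000, n)
    print(f"{n:>5} GPUs: {t:8.1f} h, ${run_cost_usd(n, 3.0, t):,.0f}")
```

Under these assumptions, larger clusters finish sooner but cost more per run because parallel efficiency erodes, exactly the kind of curve a designer would want a modeling tool to expose before committing hardware.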
The tool builds on imec's experience in system modeling, hardware-software co-design, and performance analysis in HPC and AI contexts. Imec.kelis v1.0 will launch in the first quarter of 2026; several companies are already testing early versions of the system.
