Nutanix launches a new version of its Enterprise AI solution. It primarily offers deeper integration with the Nvidia AI Enterprise suite.
At its .NEXT conference in Washington, D.C., Nutanix announces the general availability of the latest version of Nutanix Enterprise AI (NAI). This new release of the product formerly known as GPT-in-a-Box includes extensive integration with Nvidia AI Enterprise, including Nvidia NIM microservices and the Nvidia NeMo framework.
With this integration, Nutanix aims to make it easier to deploy agentic AI applications in the enterprise. NAI’s goal is to accelerate the adoption of generative AI by helping customers build, run, and securely manage AI models and inference services, all within the Nutanix environment.
Like the rest of the Nutanix platform, NAI is location-agnostic. The solution works at the network edge, in data centers, and in public cloud environments. Kubernetes is the cornerstone of the system.
AI and the Autonomous Data Center
According to Nutanix, the latest version makes it easier to deploy AI workloads, simplifying daily operations. Agents will be able to act on anomalies in the data center, the company stated on stage. With this, Nutanix heralds the era of the autonomous data center.
NAI streamlines the resources and models needed to deploy various applications across different departments. It does so with a secure, shared set of embedding and reranking models, which underpin the functionality of AI agents. The solution also includes a centralized LLM repository, allowing customers to connect generative AI applications easily and privately.
Efficiency
The new version of NAI supports the deployment of agentic AI applications with shared LLM endpoints. Multiple applications can reuse the same model endpoint, reducing consumption of critical infrastructure resources such as GPUs, CPUs, memory, and storage.
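To illustrate the shared-endpoint idea, here is a minimal Python sketch in which two hypothetical applications build requests against a single OpenAI-compatible inference endpoint, the style of API that NIM microservices expose. The URL, model name, and metadata field are illustrative assumptions, not part of any documented NAI API.

```python
# Hypothetical shared endpoint: one GPU-backed model instance serves
# every application instead of each app running its own model.
SHARED_ENDPOINT = "http://nai.example.internal/v1/chat/completions"  # illustrative URL

def build_request(app_name: str, prompt: str) -> dict:
    """Build a chat-completion request that any application can send
    to the same shared model endpoint."""
    return {
        "url": SHARED_ENDPOINT,
        "json": {
            "model": "llama-3.1-nemotron-70b-instruct",  # example model id
            "messages": [{"role": "user", "content": prompt}],
            "metadata": {"app": app_name},  # hypothetical per-app tag
        },
    }

# Two different departments reuse the same endpoint.
support_bot = build_request("support-bot", "Summarize this ticket.")
hr_assistant = build_request("hr-assistant", "Draft an onboarding email.")

# Both requests target the same URL, so a single model instance
# (and its GPUs) serves multiple workloads.
assert support_bot["url"] == hr_assistant["url"]
```

Because both applications address one endpoint, capacity planning shifts from per-application model instances to a single shared pool, which is the efficiency gain Nutanix describes.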
NAI further provides support for Nvidia models, such as the Llama Nemotron open reasoning models and NeMo Guardrails, which help develop safe and reliable AI solutions.
Additionally, NAI offers generative AI security capabilities, allowing companies to deploy AI applications in line with their own policies. NeMo Guardrails helps filter unwanted content and improves the reliability of code generation.
Together with Nvidia
None of these capabilities is truly revolutionary on its own. In fact, Nutanix is simply linking the recent additions to the Nvidia AI Enterprise ecosystem to its own AI solution. That is relevant for Nutanix customers and fits a broader trend: when it comes to AI innovation, Nvidia sets the pace. It is then up to third parties to connect Nvidia’s AI capabilities to their own platforms, combining that broad ecosystem with the benefits of their own environment.
“Last year, we started our collaboration with Nvidia,” CEO Rajiv Ramaswami adds on stage. “All of Nvidia’s solutions are incorporated into our solution. We bring everything together, make it simple, and give you predictable costs, both on-premises and in the cloud.”