Three clicks and done: with Private Cloud AI, HPE aims to remove the complexity of AI and free it from the public cloud. A groundbreaking vision or more of the same?
It was the highlight of HPE Discover in Las Vegas in June. HPE CEO Antonio Neri and Nvidia CEO Jensen Huang announced Private Cloud AI in front of a packed Sphere like two seasoned rock stars. Huang won't be in Barcelona, but AI will be no less talked about for it. "We see ourselves as stewards for AI," Neri proclaimed during his keynote.
With Private Cloud AI, HPE believes it has cracked the AI code. The company wants to provide an easy-to-implement private alternative to GenAI solutions that are now primarily in the public cloud. HPE is not alone in that vision. Has HPE discovered the magic formula or reinvented the wheel?
AI is hybrid
For many companies, applications such as ChatGPT are the most obvious way to experiment with AI. Whether you prefer OpenAI's ChatGPT, Google's Gemini or Anthropic's Claude, these models have one thing in common: they are available through the public cloud. If it is up to HPE, it doesn't have to stay that way. Discover is the perfect opportunity to convince the world that AI belongs on private infrastructure.
HPE describes generative AI as "the ultimate hybrid workload." Neil McDonald, VP of HPC & AI, elaborates on that statement. "Classical AI depended on coding. GenAI is about data. If models are not embedded in data, they don't produce results. Most data sits on-prem."
Mohan Rajagopalan, who heads HPE Ezmeral, cites several arguments for running AI locally. "A first reason is economic. AI production can be expensive. On private infrastructure, you have more control over your costs than if you put everything with hyperscalers." Hyperscalers will gladly counter that argument and claim that the "entry ticket" is lower with them.
HPE therefore also leans heavily on sovereignty, a theme that is especially sensitive in Europe, Rajagopalan says. "In certain situations, data needs to stay within strictly defined boundaries, and companies don't want to move their data to the cloud at all. Data is the foundation for AI. If you invest in your own infrastructure, you have a guarantee that it will always be available. The same goes for your data."
Plenty of goodwill, fewer results
Multiple studies show that companies do not lack goodwill, but that implementation runs less smoothly. Rajagopalan does not turn a deaf ear to the stumbling blocks companies experience. "I have had many conversations about this. You used to be able to run models in the cloud and see what came out. Now the use cases for AI are much more diverse. Success stories start from the use case."
"Only one in ten proofs of concept with AI actually makes it into production," explains CTO Fidelma Russo. "The life cycle is complex. Tuning the infrastructure is a long process that companies don't want to spend time on. If the learning curve of a technology is too high, people quickly lose interest."
"Companies are learning to embrace AI the hard way," McDonald continues where his colleague left off. "The three big barriers are time to value, data management and compliance. No solution existed yet that addresses those problems simultaneously. The underlying infrastructure shouldn't be a burden. An integrated solution removes concerns about infrastructure, so companies only have to worry about how they want to deploy AI."
If the learning curve of AI is too high, people quickly lose interest.
Fidelma Russo, CTO & VP Hybrid Cloud HPE
AI in three clicks
From that understanding, Private Cloud AI was born. HPE markets the platform as a turnkey solution that companies can easily run on their own infrastructure. "We don't want to just hand customers servers and leave them to figure things out on their own. This is a turnkey solution that is ready in three clicks, available in different sizes and configurations. Private Cloud AI allows you to process data where it is. We bring AI to your data," Russo continues.
Private Cloud AI is the mixture you would get if you put the portfolios of HPE and Nvidia in a blender. The two companies find common ground not only in hardware, but also in software. At Discover, the tandem is expanded into a tricycle with Deloitte, which will help companies with the practical side.
Rajagopalan elaborates: "AI is a very technical workload. You are better off if you can work with experts in the field who communicate in a language you understand. That way you create a win-win for all parties. For us, it's not only about the technology, but also about how we deliver it. That's where you make the difference. With Private Cloud AI, you buy a very fast vehicle with the full support package included."
By making AI as accessible as possible, HPE also wants to bring SMEs into the AI story, a segment that is lagging behind. "SMEs are limited by their available skills and budgets. They don't want to invest in skills, but in proven use cases. All the necessary functions, from infrastructure to software and models, are built into the platform, so you don't have to program anything yourself." Still, purchasing at least one ProLiant server with Nvidia GPUs is required: the entry ticket will quickly cost you several thousand euros.
Recognizable recipe with one unique ingredient
HPE is positioning itself squarely against the hyperscalers with Private Cloud AI. Microsoft, Google, AWS and co lure you into their data centers to access the most advanced AI models and features, while HPE looks at how to adapt models to what is possible within the customer's environment. Still, HPE is not alone in taking a hybrid approach to AI.
Lenovo and Dell, two of HPE's major competitors in the server market, also lean heavily on the local side of AI. The vision sounds similar, but the execution differs slightly. All three start from their historical background in hardware. Lenovo and Dell look not only at servers, but also at the PC as a vector for running AI locally. One big constant: Nvidia is never far away. Then there are parties like Nutanix that, like HPE, want to position themselves with "bite-sized" AI solutions for hybrid IT environments.
HPE is not reinventing the wheel with Private Cloud AI, but one wheel is not the other. HPE's recipe contains one ingredient that its competitors lack. With subsidiary Aruba, HPE is a major player in the networking industry, and it likes to play that card. It should therefore come as no surprise that HPE is digging deep into its pockets to acquire Juniper Networks. We analyzed that acquisition in detail.
“Hyperscalers are very strong in computing, but HPE stands out on the network. AI needs a powerful network. If your Ferrari can only go twenty kilometers per hour, you have no use for such a fast car,” Rajagopalan concludes.