Hewlett Packard Enterprise adds new GenAI LLMs to HPE Aruba Networking Central Platform

By FY24 Q2, new GenAI LLMs will be applied directly to HPE Aruba Networking Central’s AI Search feature, improving search performance and security.

Alan Li, Senior Director, Edge Marketing at HPE Aruba

Today, Hewlett Packard Enterprise [HPE] is announcing the expansion of its AIOps network management capabilities through an integration of multiple generative AI [GenAI] Large Language Models [LLMs] within HPE Aruba Networking Central, HPE’s cloud-native network management solution on the HPE GreenLake Cloud Platform.

“What we have been doing is integrating our purpose-built GenAI into our HPE Aruba Networking Central platform,” said Alan Li, Senior Director, Edge Marketing at HPE Aruba. “We started at the beginning of this month, and it will be completed by next month.”

Li said that while other GenAI networking approaches simply send API calls to public LLMs, HPE Aruba Networking Central’s new self-contained set of LLMs is fundamentally different.

“We have multiple LLMs that are trained and hosted by us, to make AI search better from workflow, navigation, and privacy perspectives,” Li added. The models were designed with innovative pre-processing and guardrails to improve user experience and operational efficiency, with a focus on search response times, accuracy, and data privacy.
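
HPE has not published the internals of this pre-processing layer, but the general pattern Li describes, screening and cleaning a search query before an internally hosted model answers it, can be sketched in a few lines of Python. Everything in the example below (the helper names, the regex patterns, the blocked-topic list) is an illustrative assumption rather than HPE’s implementation.

```python
import re

# Illustrative sketch of query pre-processing with guardrails before a search
# query reaches an internally hosted LLM. None of these names or patterns
# reflect HPE's actual implementation.

PII_PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"),
    "mac": re.compile(r"(?:[0-9A-Fa-f]{2}[:-]){5}[0-9A-Fa-f]{2}"),
    "ipv4": re.compile(r"\b(?:\d{1,3}\.){3}\d{1,3}\b"),
}

BLOCKED_TOPICS = ("password", "private key")  # illustrative guardrail list


def redact_pii(text: str) -> str:
    """Replace common PII/CII-style patterns with placeholder tokens."""
    for label, pattern in PII_PATTERNS.items():
        text = pattern.sub(f"<{label}>", text)
    return text


def preprocess_query(query: str) -> str | None:
    """Apply guardrails and PII redaction; return None if the query is refused."""
    lowered = query.lower()
    if any(topic in lowered for topic in BLOCKED_TOPICS):
        return None  # guardrail: refuse sensitive or out-of-scope requests
    return redact_pii(query)


if __name__ == "__main__":
    raw = "Why is AP aa:bb:cc:dd:ee:ff dropping clients for admin@example.com?"
    print(preprocess_query(raw))
    # -> "Why is AP <mac> dropping clients for <email>?"
```

In a production system the screening and redaction steps would typically be handled by trained models rather than a handful of regexes, but the flow is the same: guardrail check first, then redaction, then the model call.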

“Within Networking Central, we have a lot of AIOps tools, and we will continue to build out more tools with this technique,” Li indicated. The new GenAI LLM functionality will be incorporated into HPE Aruba Networking Central’s AI Search feature, complementing the existing machine learning-based AI throughout HPE Aruba Networking Central to provide deeper insights, better analytics, and more proactive capabilities.

Security is top of mind in the new platform.

“There is uncompromised security, with no external APIs, and no Personally Identifiable Information or Customer Identifiable Information (PII/CII) shared between individual customer instances,” Li said. The LLMs remove PII/CII data and improve search accuracy, all while delivering sub-second responses to network operations questions.

“Our security approach is important and super-differentiated,” Li added. “Our internal GenAI transformers have been trained to remove PII/CII from the training data lakes.”
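
Li’s description of transformers trained to strip PII/CII from the training data is, in broad strokes, a named-entity-recognition redaction pass over the corpus. A minimal sketch of that pattern, using a public Hugging Face NER model as a stand-in for HPE’s internal (unpublished) models, might look like this:

```python
from transformers import pipeline

# Minimal sketch of transformer-based PII/CII redaction over training text.
# A public NER model stands in for HPE's internal models, which are not public.
ner = pipeline("ner", model="dslim/bert-base-NER", aggregation_strategy="simple")


def redact_entities(text: str) -> str:
    """Replace detected person/organization/location spans with placeholder tags."""
    # Replace from the end of the string backward so earlier offsets stay valid.
    for ent in sorted(ner(text), key=lambda e: e["start"], reverse=True):
        text = text[: ent["start"]] + f"<{ent['entity_group']}>" + text[ent["end"]:]
    return text


sample = "Ticket opened by Jane Doe at Acme Corp about a branch AP outage."
print(redact_entities(sample))
# Typical (model-dependent) output:
# Ticket opened by <PER> at <ORG> about a branch AP outage.
```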

As part of its expanded capabilities, the training sets for HPE Aruba Networking Central’s GenAI models are now up to ten times larger than those of other cloud-based platforms, and include tens of thousands of HPE Aruba Networking-sourced documents in the public domain, as well as more than three million questions captured from the customer base over many years of operations.

“In our data lake, which is one of the largest in the industry, we have collected over three million Aruba questions, and have trained and tuned the LLMs on them,” Li indicated. “We can give more perspective in our answers because we know the data, and it also allows us to avoid AI hallucinations. We also see the LLMs provide improved results, building on the NLP models we ran in the past.”
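
HPE has not described its tuning pipeline, but turning a corpus of captured questions into model training data generally starts with reshaping the question-and-answer pairs into an instruction-style dataset. The snippet below is only a generic illustration of that step; the file name, field names, and sample record are assumptions, not HPE artifacts.

```python
import json

# Generic illustration: shape captured question/answer pairs into a JSONL
# instruction-tuning file. File and field names are illustrative assumptions.
captured_pairs = [
    {
        "question": "Why are clients failing to roam between these two access points?",
        "answer": "Placeholder answer text captured from support history.",
    },
    # ...millions more records in a corpus the size Li describes
]

with open("aruba_search_tuning.jsonl", "w", encoding="utf-8") as out:
    for pair in captured_pairs:
        out.write(json.dumps({
            "instruction": "Answer this HPE Aruba Networking Central question.",
            "input": pair["question"],
            "output": pair["answer"],
        }) + "\n")
```

A redaction pass like the one sketched earlier would run over records of this kind before any tuning takes place, in line with Li’s point about removing PII/CII from the training data.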

HPE Aruba Networking Central is a SaaS offering that is primarily sold as an annual subscription with a two-tier licensing model [Foundation and Advanced]. The new GenAI LLM-based search engine will be available in HPE’s FY24 Q2 and is included with all tiers of licensing. In addition to being a standalone SaaS offering, HPE Aruba Networking Central is also included as part of an HPE GreenLake for Networking (NaaS) subscription and is available through the HPE GreenLake platform.

“This is our first application,” Li said. “Our product teams are looking at other use cases.”