HPE extends the BlueData technology it acquired last year with a new solution that introduces a DevOps-like process to accelerate AI deployments.
Hewlett Packard Enterprise has announced a new solution that leverages the BlueData technology for containerizing AI environments, which it acquired last year. HPE ML Ops is a container-based software solution designed to support the entire machine learning (ML) model lifecycle across the hybrid cloud, using a DevOps-like process that standardizes machine learning workflows and significantly accelerates AI deployments.
“This is a big investment for HPE in the AI and machine learning space,” said Patrick Osborne, VP and GM, Big Data and Secondary Storage, at HPE. “We are making a big bet on this in the market because we think customers will continue to use data to fuel innovation. Most major customers now have a major AI or ML initiative. To stay relevant, this is the next phase of enterprise adoption.”
The BlueData acquisition was originally announced late last year during HPE Discover in Madrid, and the BlueData EPIC software platform became available to channel partners in the April-May timeframe. HPE ML Ops is a new product, which Osborne described as falling under the BlueData umbrella, and which extends the capabilities of the EPIC platform.
“The value proposition around Big Data is addressing customers’ desire to deploy stateful container applications in the AI and applications space,” Osborne said. “Containers have had difficulty operationalizing machine learning, and this helps customers to do this. For data scientists, there are lots of tools out there for building and developing models, but not many for managing end-to-end lifecycles. BlueData provides the platform and the container framework to do that, and to tap into data lakes to do analytics work. This new product, HPE ML Ops, extends this to machine learning around stateful containers.”
HPE ML Ops transforms AI initiatives from experimentation and pilot projects into enterprise-grade operations and production by addressing the entire machine learning lifecycle, from data preparation and model building to training, deployment, monitoring, and collaboration. It includes pre-packaged, self-service sandbox environments for ML tools and data science notebooks. In addition, it works with a wide range of open source machine learning and deep learning frameworks, including Keras, MXNet, PyTorch, and TensorFlow, as well as commercial machine learning applications from ecosystem software partners such as Dataiku and H2O.ai.
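To make the lifecycle stages concrete, the toy sketch below walks through them in miniature with a hand-rolled one-variable linear model. It is purely illustrative: it does not use the HPE ML Ops product or any of the named frameworks, and all names in it (the sample records, the `predict` function, the held-out values) are invented for the example.

```python
import statistics

# --- Data preparation: clean raw records into (feature, label) pairs ---
raw = [("3", "7.1"), ("5", "11.2"), ("bad", "?"), ("8", "16.9"), ("10", "21.3")]
data = [(float(x), float(y)) for x, y in raw
        if x.replace(".", "").isdigit() and y.replace(".", "").isdigit()]

# --- Model building and training: fit y = a*x + b by least squares ---
xs, ys = zip(*data)
mx, my = statistics.mean(xs), statistics.mean(ys)
a = sum((x - mx) * (y - my) for x, y in data) / sum((x - mx) ** 2 for x in xs)
b = my - a * mx

# --- Deployment: expose the trained model as a callable ---
def predict(x: float) -> float:
    return a * x + b

# --- Monitoring: track prediction error on held-out observations ---
holdout = [(4.0, 9.0), (6.0, 13.1)]
mae = statistics.mean(abs(predict(x) - y) for x, y in holdout)
print(f"slope={a:.2f} intercept={b:.2f} holdout MAE={mae:.3f}")
```

In a real pipeline each of these comments would correspond to a managed, containerized stage; the point here is only how the stages hand work off to one another.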
“The solution itself has the ability to manage data science workflows throughout their entire lifecycle,” Osborne said.
Osborne acknowledged that this is still a niche area among HPE channel partners, but he believes that is changing, and that AI and machine learning are a wise investment for partners.
“We see this as a growing part of the market, and partners in it today are ones with strong advisory practices who expanded into this area,” he said, noting that this includes both established players like WWT as well as new players who cater to the hybrid cloud.
“We believe that channel partners need to build a practice in this area, and develop an expertise in data science and analytics,” he added.