The Hitachi Hyper Scale-Out Platform leverages technology acquired with Pentaho last year to provide an appliance aimed at Big Data, Data Lake and Internet of Things use cases now, and at broader use cases once an all-flash version arrives later this year.
Hitachi Data Systems Corporation (HDS) has announced the launch of its Hitachi Hyper Scale-Out Platform (HSP), an entirely new, next-generation system that the company believes offers a fundamentally different approach to data management from the one it has employed in the past. The secret sauce is its native integration with the Pentaho open source-based platform for enterprise-grade big data deployments. Pentaho, which Hitachi acquired last year, is now a separately managed Hitachi company.
“This is an absolutely significant change for us,” said Paul Lewis, Chief Technology Officer at Hitachi Data Systems Canada. “We are adding capabilities beyond our heritage. With this, we are making sure that DATA is the centre of attention in an organization, rather than equipment.”
Pentaho technology allows HDS to integrate analytics and Internet of Things capabilities into a social innovation strategy that goes far beyond its big-iron storage heritage.
“Our heritage is mid-range and enterprise storage, but this allows us to create scale-out infrastructure for Big Data architectures like Hadoop, Spark or the Hortonworks Data Platform, while providing the integration and visualization capabilities that Pentaho delivers,” Lewis said. “This is a purpose-built scale-out hyper-converged appliance for multiple workloads and Big Data and IoT analytics.”
Lewis stressed that this isn’t just for large enterprises.
“Pentaho had thousands of customers, and the heritage of Pentaho was the open core, open source environment,” he said. “This is built for Big Data analytics, where you have a velocity or volume issue with traditional data management, and so can benefit from a scale-out deployment. That lets you start small and grow big.”
Lewis also indicated that the HSP can work either in the data centre or at the edge, where it would more likely be used in an IoT application.
“Pentaho fits in many ways – whether you want to put it in a Big Data Lake or consume it at the edge to perform analytics – those are both options,” he stated. “It just depends when you need that data. If you need it in micro-time, you don’t move it to a Big Data Lake first.”
Hitachi will offer HSP in two configurations to support a broad range of enterprise applications and performance requirements. One of them, a 2U unit with twelve 6TB SAS disk drives, is available now. The other, with all-flash drives, is expected to ship in mid-2016.
“It just takes time to get the all-flash configuration out,” Lewis said. “But when it arrives, it will expand the workloads. Our goal here is to initially increase consumption of analytics in an easy-to-use package with the first configuration, and then look at broadening into different use cases with the all-flash.”
Lewis said HDS expects this to be a strong channel product.
“It’s a prepackaged single node deployment which is ideal for the channel to consume, especially if they are having analytics conversations with customers. It’s very much the kind of thing that customers would experiment with in a test-dev environment at first.”