
Big daddy HDS shows off fruits of Big Data slurpee Pentaho
February 12, 2016 · Blog · big data

This article was originally published elsewhere and can be viewed in full here

Hyper-converged HSP analytics appliance

A refreshed HSP 400 series, HDS’ scale-out analytics appliance, has native integration with the Pentaho Enterprise Platform.

HDS slurped up Pentaho in February 2015 to acquire big data integration and predictive analytics technology. The software amalgamates multi-source data from, for example, Hadoop (Cloudera, Hortonworks and others), NoSQL stores (MongoDB, HBase, Cassandra, etc.) and data warehouses like Netezza and Vertica, and then runs analysis routines on it.
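Pentaho does this blending through its own graphical tooling, but the basic idea can be sketched generically. The snippet below is a rough illustration only, using pandas and made-up in-memory tables standing in for Hadoop, NoSQL and warehouse extracts; the column names and figures are invented for the example.

```python
import pandas as pd

# Hypothetical extracts standing in for multi-source data:
# clickstream rows from Hadoop, profiles from a NoSQL store,
# and order history from a data warehouse.
hadoop_events = pd.DataFrame({
    "customer_id": [1, 2, 3],
    "clicks": [120, 45, 300],
})
nosql_profiles = pd.DataFrame({
    "customer_id": [1, 2, 3],
    "segment": ["gold", "silver", "gold"],
})
warehouse_orders = pd.DataFrame({
    "customer_id": [1, 3],
    "revenue": [1500.0, 9200.0],
})

# Blend: join the three sources on a shared key; customers with
# no warehouse orders get zero revenue rather than a gap.
blended = (hadoop_events
           .merge(nosql_profiles, on="customer_id")
           .merge(warehouse_orders, on="customer_id", how="left")
           .fillna({"revenue": 0.0}))

# A simple analysis routine over the blended set:
# revenue and clicks per customer segment.
summary = (blended.groupby("segment")
           .agg(revenue=("revenue", "sum"), clicks=("clicks", "sum")))
summary["revenue_per_click"] = summary["revenue"] / summary["clicks"]
print(summary)
```

In a real deployment the joins would run against live connectors rather than in-memory frames, but the shape of the pipeline — extract from heterogeneous sources, join on shared keys, then analyse — is the same.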

We’re told the HSP 400 is a hyper-converged analytics appliance built from 2U nodes combining compute, storage and networking, which “delivers seamless infrastructure to support big data blending, embedded business analytics and simplified data management.”

There is a “centralised, easy-to-use user interface to automate the deployment and management of virtualised environments for leading open source big data frameworks, including Apache Hadoop, Apache Spark, and commercial open source stacks like the Hortonworks Data Platform (HDP).”

The Pentaho integration provides “complete control of the analytic data pipeline and enterprise-grade features such as big data lineage, lifecycle management and enhanced information security.”

James Dixon, Pentaho’s CTO, said customers can unify disparate datasets and workloads, like legacy applications and data warehouses, using this HSP 400. It’s a “simplified, all-in-the-box solution that combines compute, analytics and data management functions in a plug-and-play, future-ready architecture.”

The HSP 400 is “a great first step in simplifying the entire analytic process,” and has a pay-as-you-go business model.

HDS tells us that the HSP 400 will be used for more workloads than analytics in the future, with Pentaho being used in some or all of them.

A nearline SAS disk (12 × 4TB) configuration of the HSP 400 is available now, and an all-flash version is expected by the middle of the year. We expect a dozen SAS-interface SSDs will occupy the current SAS disk drive slots if HDS takes the simple engineering route.

On the other hand, HDS has just launched its HFS A series all-flash array, which, like the HSP 400 series nodes, is a 2U box. Hmm, slide those into an HSP 400 rack and you would have a hellishly fast and powerful system.