At Hitachi NEXT 2019, its annual technology event held in Las Vegas last week, Hitachi Vantara unveiled the new VSP 5000 Series, the next generation of the company’s venerable high-end storage system. But a key message was about the importance of services. The merger underway with Hitachi Consulting (expected to be complete in early 2020) shows that for this infrastructure company, professional services are an essential component of its product offerings.
We have all heard how “data is the new oil”, the asset that’s going to fuel growth in the 21st century. But according to Hitachi Vantara CTO Bill Schmarzo, unlike oil and other resources, data never depletes and can be used over and over at a marginal cost of $0. This is an incredible opportunity for enterprises to create insights from their data, but also an enormous challenge for IT organizations that have to get that data where it can be used, and reused. Data science and analytics, ML, AI, etc., are evolving quickly, but current tools don’t address the primary problem of getting data into the right format so it can be exploited. This is where data engineering comes in with DataOps, a term for the processes required to get the right data to the right people (or applications) at the right time to support analytics, governance and agility. Hitachi’s solution for DataOps is called Lumada.
Originally released in 2017 for IoT environments, Lumada has been expanded to support all aspects of data analytics in all industries. Lumada Data Services provides an intelligent foundation for the curation and sharing of data while meeting data management and compliance requirements. Lumada Data Lake (GA March 2020) is a data catalog and curation service that integrates Hitachi’s Pentaho analytics platform and Hitachi Content Intelligence utility with Hitachi’s HCP object storage system to enrich metadata in large data sets and help data lakes maintain their value. Lumada Data Optimizer for Hadoop (GA November 2019) is an automated, policy-based solution that’s integrated with Hadoop to allow HDFS files to be tiered to the Hitachi Content Platform, reducing costs. Lumada Edge Intelligence is a real-time analytics and management tool for remote environments, including manufacturing or maintenance operations. Pentaho 3.0 (GA February 2020) can automate the data flow process, getting the right data from different sources, with the right filters and transformations, to the user so it can be easily consumed. Using templates for common integration patterns, Pentaho can create self-service data flows, reducing the time and overhead associated with manual processes.
Hitachi announced the general availability of the new VSP 5000 Series (developed as “Project Jupiter”), an update to the VSP 1500 enterprise array. The “hero numbers” provided were: 21M IOPS, 70 microsecond latency, 69PB total capacity and ‘8 nines’ of reliability. The VSP 5000 has an internal PCIe internode fabric (a redundant pair of 1U switches) connecting up to 12 storage controllers on the back end. According to Hitachi, this fabric is 6x faster than its previous single PCIe switch-based “crossbar” architecture. VSP 5000 systems have between one and three controller blocks, each 4U block containing 2 pairs of controllers, connecting to storage media via NVMe, 12 Gb SAS, or both, on the back end and FC, iSCSI or FICON on the front end. Each block has up to 2TB of cache, for 6TB total in the system. Systems can scale by adding controller blocks (Hitachi calls this “scale-out”), supporting up to 2034 SFF SAS SSDs, 1152 LFF SAS HDDs or 288 SFF NVMe SSDs.
The Hitachi Content Platform (HCP), an object storage system, is the foundation for new offerings with Lumada Data Services (see above). The company is trying to emphasize the benefits of object storage for primary use cases like analytics (and Hadoop environments), instead of just secondary data storage, like archiving. Earlier this year, Hitachi released HCP Cloud Scale, a scalable object storage system that addresses the performance problems inherent in growing object storage systems. This scale-out system uses a container-based architecture to separate data from metadata, enabling performance to scale linearly with capacity. Currently, HCP Cloud Scale doesn’t offer encryption or some other features that the base HCP solution does, but it should reach feature parity by mid-2020.
Ops Center software is Hitachi’s infrastructure management platform, comprising the Administrator, Analyzer, Automator and Data Protection (HDID) modules. Ops Center uses automation and analytics to simplify infrastructure management for IT and enable the same people to manage more data. Now using machine learning techniques, Ops Center can go beyond alerts to provide recommendations about the environment and the systems in it.
Hitachi Vantara Cloud Services (from the REAN Cloud acquisition in 2018) is a professional-services-based engagement that helps enterprises answer the question of which applications to move to the public cloud and when, and can then provide migration services and ongoing operational support. Hitachi Enterprise Cloud (HEC) is a management platform for private and public cloud infrastructure, delivered as part of Hitachi’s managed services offering.
Hitachi is a big company that makes big equipment (think trains, construction equipment, medical imaging, etc.). Several years ago the company combined Hitachi Data Systems, Hitachi Insight Group (IoT) and Pentaho (data analytics) into Hitachi Vantara, ostensibly to focus on the IoT and analytics market. The rationale was that many major IoT and big data projects were centered around big equipment, and a company that builds big equipment, including data equipment, would have a lot of insight into collecting and analyzing these complex data sets. This rationale made sense, but candidly, the company didn’t produce the results that were expected.
Hitachi is an infrastructure company, and the VSP 5000 is an example of that commitment to infrastructure. But earlier this year, Hitachi Vantara added the Hitachi Consulting group, demonstrating the importance of professional services to the big data solution. At Hitachi NEXT, services were front and center with Lumada Data Services. This is an area that Hitachi got right. Professional services isn’t another “as a Service” offering; it provides the know-how to design these complex solutions and the personnel to implement them.
As an infrastructure company, Hitachi knows the importance of services, and this was made clear at Hitachi NEXT 2019. Making big projects work requires specialized expertise and plenty of resources. Hitachi seems to be applying that concept to digital transformation, a challenge many enterprises are struggling with. Lumada is a software platform to get data where it’s needed, in the format required for analytics. It has been a long time coming, but these may be the pieces Hitachi needs to make IoT and big data work for its enterprise customers.