Can the Data Fabric Make NetApp’s Hybrid Cloud Infrastructure (HCI) Strategy Work?
“Data Driven” was a key theme at NetApp Insight in Las Vegas last week. According to CEO George Kurian and other executives who spoke, becoming data driven requires legacy IT to transform by using the cloud – more specifically, a hybrid, multi-cloud architecture. This means building a data strategy that moves from batch to real-time processing, from data centers to data fabrics, and from fragmented to centralized. More on these concepts below. Kurian also emphasized that “speed is the new scale,” referring to the importance of agile infrastructure and agile thinking. That said, companies must still be able to handle increasingly large data sets – as must DreamWorks, a big NetApp customer.
DreamWorks, the film and digital media company, shared some of the realities of being a data-driven company – one that creates an enormous amount of data. Feature-length movies like “How to Train Your Dragon” (for which they showed a pre-release clip) have about 130K frames. This adds up to about half a billion active files per film that must be available, protected, and managed throughout their 12-step production process. DreamWorks has about 10 films in process at any one time, meaning their infrastructure must support 5 billion active files.
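The scale of those figures composes straightforwardly. A quick back-of-the-envelope sketch in Python (the files-per-frame ratio is derived from the numbers above, not stated by DreamWorks):

```python
# Back-of-the-envelope check of the DreamWorks figures cited above.
# The frames-per-film, files-per-film, and films-in-flight numbers come
# from the article; the per-frame ratio is an implied value, not stated.

FRAMES_PER_FILM = 130_000       # ~130K frames per feature film
FILES_PER_FILM = 500_000_000    # ~1/2 billion active files per film
FILMS_IN_FLIGHT = 10            # ~10 films in production at once

files_per_frame = FILES_PER_FILM / FRAMES_PER_FILM
total_active_files = FILES_PER_FILM * FILMS_IN_FLIGHT

print(f"Implied files per frame: ~{files_per_frame:,.0f}")
print(f"Total active files across the pipeline: {total_active_files:,}")
# → Implied files per frame: ~3,846
# → Total active files across the pipeline: 5,000,000,000
```

Roughly 3,800 files behind every frame is what makes "available, protected, and managed" a hard infrastructure problem rather than a slogan.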
NetApp’s Data Fabric lets DreamWorks move these files between on-prem infrastructure and multiple cloud-based systems. Data Fabric is NetApp’s data architecture for managing data and seamlessly transferring it across those environments, providing a consistent set of data services and tools to ensure access, control, protection, and security for data on-prem or in the cloud.