Is there a Leak in the Data Center? – Eric Slack

By Eric Slack, Wednesday, December 9th, 2015

Categories: Analyst Blogs

Tags: cloud, leak

Once upon a time (not that long ago) consolidation was all the rage in IT. Big arrays and “big iron” were the order of the day as networked storage (SAN and NAS) systems took over from direct-attached storage (DAS). IT organizations looked for ways to reduce the inefficiency and management burden of storage provisioning (the manual process of allocating storage capacity for application servers) and to support the kind of data growth that was coming with the internet, social media and other sources.
Lately, the storage emphasis seems to be shifting from consolidation back to decentralization, driven by several factors, the first of which is the rise of server-side storage devices. Companies are supporting more real-time processes where latency is the chief concern, so storage, at least for the most active data, has been steadily moving from centralized arrays back into the servers themselves, and from spinning disk drives to flash devices. “Server-side” caching and tiering put these data sets closer to the CPUs to enable this latency reduction, but they are also reducing the demand for networked storage. Now, with 3D NAND delivering multi-TB flash devices, there may soon be little reason to keep any primary data on disk drives, and even less to leave it on the network.
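To make the caching idea concrete, here is a minimal sketch (not any particular vendor's implementation) of a server-side read cache: a small LRU layer, standing in for local flash, that answers hot reads at local latency and only falls through to the networked array on a miss. The `backing_store` object and its block-level `read` interface are hypothetical stand-ins.

```python
from collections import OrderedDict

class ServerSideReadCache:
    """Minimal LRU read cache, standing in for a server-side flash tier.

    Hot blocks are answered locally; misses fall through to the
    (slower) networked array and are cached for future reads.
    """

    def __init__(self, backing_store, capacity_blocks=1024):
        self.backing = backing_store          # hypothetical SAN/NAS client
        self.capacity = capacity_blocks
        self.cache = OrderedDict()            # block_id -> data, in LRU order

    def read(self, block_id):
        if block_id in self.cache:
            self.cache.move_to_end(block_id)  # mark as most recently used
            return self.cache[block_id]       # served at local-flash latency
        data = self.backing.read(block_id)    # miss: pay the network round trip
        self.cache[block_id] = data
        if len(self.cache) > self.capacity:
            self.cache.popitem(last=False)    # evict least recently used block
        return data
```

The point of the pattern is that repeat reads of active data never cross the network, which is exactly why server-side flash erodes demand for the centralized array.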

Movement of IT to “the Edge”

The proliferation of mobile devices further decentralizes computing and increases the “leaking” of data out of the data center, or at least away from big, consolidated infrastructures. The Internet of Things (IoT) promises to make our appliances “smart” and turn them into data generators in the process. As this data generation occurs at the edge, can storage be far behind? And with the cloud, making a safe copy is easier than ever.
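As a rough illustration of how easy that safe copy has become, the sketch below has an edge device append each reading to a local log and then push a copy to an S3-compatible object store via boto3. The bucket name, device ID and file path are made up for the example.

```python
import json
import time

import boto3  # assumes an S3-compatible object store is reachable

s3 = boto3.client("s3")

def store_reading(device_id, reading, local_log="/var/data/readings.log"):
    record = {"device": device_id, "ts": time.time(), "value": reading}
    line = json.dumps(record)
    with open(local_log, "a") as f:           # keep the primary copy at the edge
        f.write(line + "\n")
    s3.put_object(                            # push a safe copy to the cloud
        Bucket="example-iot-archive",         # hypothetical bucket name
        Key=f"{device_id}/{record['ts']}.json",
        Body=line.encode("utf-8"),
    )
```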

Cloudification

Private, hybrid and public clouds are steadily pulling data and applications away from traditional storage systems. A new infrastructure design, which originated with the hyper-scalers (the largest of the web-based companies), is being used to build public clouds. Scale-out storage systems, built from low-cost server-based storage hardware and independent software-based storage components, are becoming the de facto architecture for public clouds. Now, with the Open Storage Platform, enterprises can build their private and hybrid clouds the same way, giving companies an alternative to the traditional data center infrastructure.
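The software side of that architecture can be illustrated with a toy placement scheme (a sketch, not the Open Storage Platform itself): consistent hashing spreads object keys across commodity storage nodes, so capacity scales by adding servers rather than by growing a monolithic array. The node names here are hypothetical.

```python
import bisect
import hashlib

class ConsistentHashRing:
    """Toy placement layer for scale-out storage: maps each object key
    to a commodity storage node, so capacity grows by adding servers
    rather than by expanding a single large array."""

    def __init__(self, nodes, vnodes=64):
        self.ring = []                        # sorted list of (hash, node) points
        for node in nodes:
            for i in range(vnodes):           # virtual nodes smooth the distribution
                self.ring.append((self._hash(f"{node}#{i}"), node))
        self.ring.sort()

    @staticmethod
    def _hash(key):
        return int(hashlib.md5(key.encode()).hexdigest(), 16)

    def node_for(self, key):
        idx = bisect.bisect(self.ring, (self._hash(key), "")) % len(self.ring)
        return self.ring[idx][1]

# Example: three low-cost storage servers, one object key
ring = ConsistentHashRing(["storage-node-1", "storage-node-2", "storage-node-3"])
print(ring.node_for("vm-image-042"))          # routes deterministically to one node
```

Because only neighboring ring segments move when a node is added or removed, the cluster can grow incrementally, which is what makes the scale-out approach attractive for clouds.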

New Projects

It’s not just cloud projects that are leveraging these disaggregated architectures. As companies set up new systems to support big data analytics, data lakes and anything else with the potential to get big in a hurry, they’re turning away from traditional data center solutions. Part of this is economics, but part is the reality that it’s often easier for the departments charged with designing and running these new projects to set up their own infrastructures than to get central IT involved. Similarly, VDI projects are often set up and run separately from the central data center infrastructure, even though they may still sit in the physical data center itself. VDI workloads are different enough, and consume enough resources, to warrant investing in a separate infrastructure.

Hyper-Convergence

Hyper-converged appliances have made self-contained storage and compute infrastructures very easy to set up and run. These solutions have taken off in remote and branch offices, where they support local compute activity while staying coordinated with headquarters. As self-contained compute solutions, hyper-converged appliances are popular in small companies as their primary IT infrastructures. They’re also being used for departmental computing and the kinds of projects mentioned above, like VDI.

Whether it’s driven by a need for speed (server-side flash), a movement to the cloud or just an aversion to big projects, there’s definitely a shift underway in how and where companies deploy IT infrastructure. And that’s cause for concern. For example, as compute and storage activities leave the familiar confines of the data center, how will companies maintain data protection and integrity? There’s also the question of security and compliance when data owners become custodians of information without the experience in protecting corporate data that IT personnel have.

These issues have been raised in discussions of cloud-based solutions, but they also need to be considered whenever departments set up their own infrastructure outside the data center. In general, companies need to be aware that while they can outsource their data processing and storage activities (including removing them from the purview of the IT organization), they can’t outsource their responsibility for that data.


Many products have long lists of features that sound the same but work very differently. It’s important to think outside of the checkbox of similar-sounding features and understand how technologies and products differ.
