With data reduction now a common, if not standard, feature of primary storage systems, the amount of reduction achieved has long been a point of discussion between Information Technology professionals and vendor sales and marketing. Data reduction is an expected capability today, enabled by advances such as solid-state storage and integration into processors and other specialty chipsets. That expectation feeds directly into capacity planning for storage acquisitions: how much data can actually be stored on a system? Vendors see this as a potential competitive advantage and often back it with some type of guarantee on the amount of data reduction.
Data reduction information can be misleading, however. Normally it is thought of as compression and deduplication resulting in less data to be stored. Depending on where in the data path the reduction occurs, it can also act as a multiplier on internal bandwidth and internal memory (DRAM), because the system operates only on the reduced amount of data. Some materials also fold in the effect of thin provisioning, where capacity is allocated only when data is written rather than when a volume is provisioned, as part of data reduction. That is misleading; thin provisioning should be considered an implementation characteristic, represented by raw versus usable capacity.
The amount of compression achieved can vary considerably with the type of data, and so can the likelihood of repeated data segments that benefit from deduplication. Evaluator Group has a model, based on a large set of data from vendor analysis, that gives IT clients an idea of what to expect; it is a guideline only. Vendors have generally offered guarantees as two different numbers: one for data without analysis (sight unseen, if you will) and a higher number if their analysis software is run against the customer's data. The numbers can differ substantially, typically something like 2:1 sight unseen and 4:1 or 5:1 depending on the output of the analysis. The guarantees give customers recourse if the system does not meet the promised ratio, usually with the vendor supplying additional capacity.
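To make the arithmetic behind these guarantees concrete, here is a minimal sketch in Python. The function names and the 100 TB figure are illustrative assumptions, not from any vendor program; the 2:1 and 4:1 ratios are the example figures mentioned above. It shows the effective capacity a guaranteed ratio implies, and how much additional usable capacity a vendor would owe if the achieved ratio falls short of the guarantee.

```python
# Hypothetical illustration of data-reduction guarantee math.
# Ratios are expressed as floats: 2:1 -> 2.0, 4:1 -> 4.0.

def effective_capacity(usable_tb: float, reduction_ratio: float) -> float:
    """Effective (logical) capacity implied by a reduction ratio."""
    return usable_tb * reduction_ratio

def makegood_capacity(usable_tb: float, guaranteed_ratio: float,
                      achieved_ratio: float) -> float:
    """Extra usable capacity needed, at the achieved ratio, to still
    deliver the effective capacity the guarantee promised."""
    guaranteed_effective = usable_tb * guaranteed_ratio
    achieved_effective = usable_tb * achieved_ratio
    missing_effective = max(0.0, guaranteed_effective - achieved_effective)
    return missing_effective / achieved_ratio

# 100 TB usable: sight-unseen 2:1 vs. post-analysis 4:1 guarantee.
print(effective_capacity(100, 2.0))  # 200.0 TB effective
print(effective_capacity(100, 4.0))  # 400.0 TB effective

# If 4:1 was guaranteed but only 3:1 is achieved, the vendor must
# supply roughly 33.3 TB more usable capacity to make the customer whole.
print(makegood_capacity(100, 4.0, 3.0))
```

The key point the sketch captures is that a guarantee shortfall is settled in usable capacity, so the make-good amount depends on the ratio actually achieved, not just the gap between the two ratios.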
With its February 2023 announcement, NetApp has made a very welcome advance for IT customers regarding capacity guarantees. The guarantee is no longer a blanket number across all data stored on the system but a different number for different types of data, very similar to what the Evaluator Group model shows. This should be simpler for customers and faster in acquisitions and deployments than running analysis software. Competitors will probably match this advance at some point, because it does make things easier for customers.
NetApp has taken a step forward, allowing customers to set expectations for how much data can be stored on their storage systems with the assurance that NetApp will make good on those expectations.
More information: Access Evaluator Group's NetApp AFF Product Brief and EvaluScale Comparison for SAN and NAS at http://evaluatorgroup.com.
Disclosure: Evaluator Group, wholly owned by The Futurum Group, is a research and analyst firm that engages or has engaged in research, analysis and advisory services with many technology companies, including those mentioned in this article. The author does not hold any equity positions with any company mentioned in this article. Analysis and opinions expressed herein are specific to the analyst individually.
We encourage citing of the content, provided the citing is in context and attributed in the core copy as Evaluator Group, LLC and includes a link to the full article.