The slow adoption of new storage technology – why?

By Camberley Bates, Friday, April 20th 2012


Over the last few months during our education courses and conferences, one topic kept coming up – the conservative nature of storage architects and administrators (no, I am not talking Limbaugh or politics). Storage, in comparison to other technologies, seems to move slower than a snail on a cold California day. There is a reluctance to try new stuff, despite all the noise and claims we hear in the market. In fact, Tom Joyce of HP noted in one of his presentations that when HP bought 3PAR, only 50% of its customers were using thin provisioning – and 3PAR has been considered one of the leading proponents driving this technology.

Understanding why storage people are conservative is necessary in order to communicate effectively with them, especially around new technologies. So here is our short list of reasons, drawn from our experience with IT (if you have others, we would love to hear from you).

  1. The first rule of storage is LOSE NO DATA – It is one thing to lose availability; it is another to lose the data. That is why there are so many copies, so many ways to recover, and so much testing, testing, testing before deployment. Once it is gone, it's gone. A hard example of this is the forest fire in Colorado last month, where 23 houses burnt to the ground and 200+ families were evacuated. The evacuees got to go back home, albeit to some damage. The lost houses are just that – lost.
  2. Storage is EXTREMELY COMPLEX – We have cool new GUIs, iPad access and integrated management. None of this reduces the requirement for integration and interface testing. If anything, the masking that creates ease of use increases the need for regression and failure testing. Just a glance at all the various layers in the storage architecture gives you an idea of what a vendor must scrutinize to deliver a four-nines or better product. It is no wonder the introduction of SSDs, thin provisioning or tiering brings conservative responses from IT admins.
  3. Useful life, warranty and depreciation schedules – Storage usually starts as a tier 1 device; once it has reached the end of its warranty or depreciation, its next phase is tier 2 or support of less critical applications. For this reason, storage has a useful life of five years. More on this topic can be found in Randy Kern's StorageSoup blog.
  4. Knowledge and familiarity – One of my favorite stories about the value of familiarity was a call we had on a 10-year-old architecture. The client was refreshing their technology and we were assisting on the RFP. Our first response was: why would anyone just refresh the technology when there are new, incredible technologies available that can impact the overall cost of ownership? After reviewing the environment, offerings and requirements, the answer was experience and skills within the data center. With all the changes occurring in the company, and given the experience of the IT storage team, change and re-education was not in the best interest of the company – provided the appropriate terms could be negotiated. We have seen this repeated at other companies, often in the name of expediency.
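
To put the "four-nines" bar from point 2 in perspective, here is a quick back-of-the-envelope calculation (an illustrative sketch of the standard availability arithmetic, not from the original post):

```python
# Allowed downtime per year at a given availability level ("number of nines").
MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600 minutes in a non-leap year

def downtime_minutes(nines: int) -> float:
    """Permitted downtime per year for, e.g., 99.99% availability (nines=4)."""
    availability = 1 - 10 ** (-nines)
    return MINUTES_PER_YEAR * (1 - availability)

for n in (3, 4, 5):
    print(f"{n} nines: about {downtime_minutes(n):.1f} minutes of downtime per year")
```

Four nines works out to roughly 53 minutes of unplanned downtime per year – little margin for a storage admin experimenting with unproven features.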

Given all this, how does new technology get introduced and adopted? The easiest way is through new applications and initiatives. Initiatives often demand a new way of approaching the business requirements and thus introduce a different architecture to deliver performance or capacity efficiency. Next blog: applications driving new architectures.

