For the CTOs and CIOs of the past, tiered storage was a simple matter of assigning categories of data to different types of media in order to reduce overall storage costs.
Things have changed; it’s now about dynamically storing data in the right place at the right time.
Pre-cloud, tiered storage was the norm; enterprises used a combination of HDD and tape to reduce costs and improve reliability and efficiency. But in the cloud, tiered storage is neglected. Businesses tend to keep all of their data on the same grade of storage, regardless of the data's priority or activity.
This is a hugely inefficient approach to data storage for enterprises.
The pressure is on to blend this diverse range of options into a solution that's responsive to an organisation's needs. That's a big ask when the options are so varied and unavoidably complex.
So how does an organisation avoid the pitfalls associated with getting it wrong? The answer is to work with a specialist company that’s already faced the same issues and overcome them.
Experience means categorising data objectively
The thinking behind tiered storage is clear: divide data according to its value, access needs and retention requirements, and then handle each class of data accordingly. While this may sound simple, the reality is that it can be hard to achieve.
Enterprises handle many different types of data, and with so many stakeholders who have different priorities and objectives, it often proves difficult to achieve a consensus on how data types are to be handled. Experience is needed to objectively classify data.
The implementation of successful tiered storage needs to balance the cost of storage, the life expectancy of data, and both the speed and frequency at which data will be retrieved. Each data type must be measured against the cost and technology available.
With the assignment of different categories of data to different types of enterprise data storage media, overall storage costs are reduced. High performance but expensive storage devices, for instance, are not wasted on low value or infrequently accessed data.
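The classification logic described above can be sketched in a few lines. This is an illustrative example only: the tier names, thresholds and value scale are assumptions, not a standard, and real policies would be agreed with stakeholders per organisation.

```python
from dataclasses import dataclass

@dataclass
class DataSet:
    name: str
    accesses_per_day: float   # observed access frequency
    business_value: int       # 1 (low) to 5 (high), assigned by stakeholders

def assign_tier(d: DataSet) -> str:
    """Map a data set to a storage tier by activity and value (thresholds are illustrative)."""
    if d.accesses_per_day >= 100 or d.business_value >= 4:
        return "tier-1-ssd"       # hot: high-performance storage
    if d.accesses_per_day >= 1:
        return "tier-2-hdd"       # warm: capacity-optimised disk
    return "tier-3-archive"       # cold: nearline/offline archive

print(assign_tier(DataSet("logs", 0.01, 1)))   # low value, rarely accessed -> tier-3-archive
```

The point of making such a policy explicit and objective is that it removes the stakeholder-by-stakeholder negotiation over where each data type belongs.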
Tiered storage plays an important role in a successful information lifecycle management (ILM) hierarchy. ILM is all about having the right storage at the right time at the right cost.
Experience means understanding today’s tiered storage options
Regardless of whether data storage is in the cloud or on-premises, the reality is that data is stored on a drive somewhere. The speed, size and physical characteristics of the drive determine the cost per GB.
The specific tier of storage for enabling your cloud capabilities is dependent on the type and classes of service being provided. For example, if the cloud service is to provide low- or no-cost capabilities that do not need very high performance, then high-capacity, energy-efficient HDDs are an option. On the other hand, if the service requires fast response time (latency) while supporting a large number of active users, then SSDs are a good fit. What this means is that different tiers of storage media are aligned and used to exploit the various attributes of those technologies as needs change.
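The trade-off described here can be expressed as a simple selection rule: pick the cheapest medium that still meets the service's latency requirement. The cost and latency figures below are invented for illustration; real prices and performance vary by vendor and generation.

```python
# Hypothetical figures for illustration only.
media = {
    "ssd":          {"cost_per_gb": 0.10, "latency_ms": 0.1},
    "fast_hdd":     {"cost_per_gb": 0.04, "latency_ms": 5.0},
    "capacity_hdd": {"cost_per_gb": 0.02, "latency_ms": 12.0},
}

def pick_media(max_latency_ms: float) -> str:
    """Return the cheapest medium that still meets the latency requirement."""
    ok = {m: v for m, v in media.items() if v["latency_ms"] <= max_latency_ms}
    return min(ok, key=lambda m: ok[m]["cost_per_gb"])

print(pick_media(1.0))    # latency-sensitive service -> "ssd"
print(pick_media(20.0))   # low-cost, low-performance service -> "capacity_hdd"
```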
Experience means appreciating capacity requirements
Capacity planning becomes more complicated as a result of tiering. You not only need more hard drives; you also need to decide which kinds. It's important not to assume that storage device categories will grow in tandem. In a tiered system, some categories typically grow faster than others, while some may shrink.
As an example, the need to archive certain data may reduce as permanently stored data becomes reclassified into other categories.
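The independent growth rates above can be captured in a simple per-tier projection. The tier sizes and growth figures here are invented for illustration; note the archive tier shrinking as data is reclassified, as described above.

```python
# (current GB, annual growth rate) per tier -- hypothetical figures.
tiers = {
    "ssd":     (10_000, 0.40),    # fast-growing hot tier
    "hdd":     (80_000, 0.15),    # steady warm tier
    "archive": (200_000, -0.05),  # shrinking as data is reclassified
}

def project(years: int) -> dict:
    """Compound each tier's capacity independently over the given horizon."""
    return {t: round(gb * (1 + rate) ** years) for t, (gb, rate) in tiers.items()}

print(project(3))
```

Projecting each tier separately, rather than applying one blended growth rate, is what prevents over-buying expensive media for a tier that is actually shrinking.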
We know that the economics of SSDs are dramatically improved if the storage system implements some sophisticated management techniques to maximise the technology’s usefulness.
Experience means recognising that technology is continually evolving
Technology improvements are now playing an important role in the success of tiered storage strategies, including improved SAS drives, new SSDs and more efficient data migration and data classification tools.
Leveraging these technologies as part of a tiered storage strategy helps those responsible for storage to lower operational and asset costs, and improve application performance. Software has become sophisticated enough to recognise when data is more or less likely to be used, predicting future use from past behaviour trends. This has enabled more dynamic control over data.
Certain types of data may be heavily accessed at certain times of the day, month or year, and so need to be stored on faster, more expensive storage during those periods. At low-use times it's more economical to move this data to cheaper storage options. This is especially relevant as more businesses choose to use object storage services like Amazon S3 or Atmos, where charges relate to the actual space used at any given time.
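In Amazon S3, this kind of age-based tiering is expressed as a lifecycle configuration. Below is a sketch of one such configuration; the bucket prefix, day counts and retention period are assumptions chosen for illustration, not recommendations.

```python
# Sketch of an S3 lifecycle configuration that tiers objects down as they age.
# Prefix, day counts and retention period are illustrative assumptions.
lifecycle_rules = {
    "Rules": [
        {
            "ID": "tier-down-reports",
            "Filter": {"Prefix": "reports/"},
            "Status": "Enabled",
            "Transitions": [
                {"Days": 30, "StorageClass": "STANDARD_IA"},  # infrequent access
                {"Days": 365, "StorageClass": "GLACIER"},     # archive
            ],
            "Expiration": {"Days": 2555},  # delete after roughly seven years
        }
    ]
}

# With boto3 this would be applied as:
#   boto3.client("s3").put_bucket_lifecycle_configuration(
#       Bucket="example-bucket", LifecycleConfiguration=lifecycle_rules)
```

Once applied, the provider moves objects between storage classes automatically, so the "low-use times" migration described above happens without operator intervention.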
Experience means finding the right solution for legacy data
The management of legacy data remains an important consideration as the volume of transactional data grows. Once this data is inactive, it might not be used for some time, or it may never be needed again. This makes a case for data footprint reduction (DFR), where archiving, compression, thin provisioning, deduplication and storage tiering can be employed.
The problem with legacy content is that this inactive data can take up storage capacity better used for active data. The idea is to transfer inactive data from high-performance storage tiers, such as solid-state drives (SSDs) or fast 15K HDDs, to slower, larger-capacity, energy-efficient and lower-cost HDDs. Once this happens, those resources can be used more efficiently while reducing storage costs. Inactive data can move to a nearline or offline tier of local or offsite storage, the benefit being a much better return on investment.
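Identifying which files are candidates for that move can be as simple as scanning for files whose last access time falls before a cutoff. This is a minimal sketch; the 180-day threshold and the `move_to_archive_tier` helper are hypothetical, and in practice access times may be unreliable on filesystems mounted with `noatime`.

```python
import os
import time

def migration_candidates(root: str, days: int = 180):
    """Yield paths of files under `root` not accessed for `days` days."""
    cutoff = time.time() - days * 86_400
    for dirpath, _, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            if os.stat(path).st_atime < cutoff:  # last access before cutoff
                yield path

# Usage sketch (the archiving helper is hypothetical):
# for path in migration_candidates("/data/projects"):
#     move_to_archive_tier(path)
```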
While traditional data lifecycle models assumed the value of data diminished over time, new thinking suggests that this simply isn’t the case now.
New data lifecycle models show that data is now more likely to maintain or even increase in value, and this is directly attributed to its availability, whether nearline or online. Recognising activity patterns is a significant step in aligning different applications to their storage medium, whether that be in public or private clouds.
Experience means being able to blend the right tools
Bringing together a range of providers, platforms and technologies can be difficult for those tied to one solution. Working with a partner that is technologically agnostic means decisions are free from constraint. That's not to say suppliers won't have their preferred solutions, but these preferences are often the result of experience rather than obligation.
Navigating the complex world of tiered storage to find the right solution for your organisation can be less painful with the right guide.