A true digital transformation is a matter of orchestrating multiple moving parts; no single component transforms in isolation. Understanding these interdependencies is a complex process, yet an organization’s readiness to transform depends on exactly that understanding. Almost everyone agrees data is one of the most essential components of a digital transformation. However, storage technology, which stores all the organization’s data and determines how that data is used and managed, is all too often overlooked.
Storage technologies and services are in great flux in today’s market. Enterprises must look past the marketing hype around “as a service” offerings and claims of cheap cloud storage and get a clear understanding of their specific and changing requirements for storage capacity and data usage. Organizations also need to understand the interdependencies between storage, compute and their application landscape, which dictates the requirements for performance, responsiveness and availability of stored data.
When considering next-generation storage solutions, organizations should keep these top five factors in mind:
- Next-gen storage architectures. As organizations begin their journey into the cloud (whether private or public), they typically continue to rely heavily on traditional on-premises data center solutions while also implementing cloud (i.e., a hybrid cloud solution). To optimize their hybrid cloud solution and the utilization of associated storage resources, an organization must understand how the server and application landscapes leverage the storage resources and use the data stored on them. Hybrid cloud architecture decisions often fail to fully investigate the interdependencies between applications that can or have already moved to the cloud and other systems and applications that still reside in traditional on-prem data centers. When datasets created by applications in the cloud are required by applications still residing in the data center – or vice versa – the resulting “chatter” between the cloud solution and the on-prem solution can lead to performance issues. It also can drive up storage costs, since most public cloud solutions charge for data transfers (both ingress and egress) outside of the cloud. Most organizations don’t need to closely measure and track the volume of data they transfer between systems and apps when everything resides on-prem, but, when components start to move to the cloud, transfer-related charges can grow quickly.
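To see how quickly transfer-related charges accumulate, consider a minimal sketch of the arithmetic. The rate below is an illustrative placeholder, not any particular cloud provider’s pricing, and real offerings typically use tiered rather than flat rates.

```python
# Hypothetical sketch: estimating monthly egress charges for data that
# cloud-hosted applications send back to on-prem systems.
# EGRESS_RATE_PER_GB is an assumed flat rate, not real vendor pricing.

EGRESS_RATE_PER_GB = 0.09  # assumed $/GB; real clouds use tiered pricing


def monthly_egress_cost(gb_per_day: float, days: int = 30) -> float:
    """Rough monthly cost of cloud-to-on-prem data transfer."""
    return gb_per_day * days * EGRESS_RATE_PER_GB


# An application moved to the cloud that pulls 50 GB/day back on-prem:
cost = monthly_egress_cost(50)
```

Even at this modest volume the charge lands in the hundreds of dollars per month, which is why untracked cloud-to-data-center “chatter” deserves scrutiny before workloads move.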
- Costs and billing mechanisms. The per-gigabyte (GB) charging mechanism is still the most common and probably the most transparent when a service provider solution includes the storage hardware, software and services to support the client’s environment. When the provider delivers only the services to support the environment (and the enterprise client owns the hardware and software), additional metrics may be needed to more fully reflect the effort of managing storage, such as number of locations, number of subsystems or storage types, and level of complexity. However, the outsourcing industry still lacks a universally accepted alternative to the per-GB measure.
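The contrast between the two billing shapes described above can be sketched as follows. Every rate and weighting here is hypothetical, chosen only to show how a services-only model folds effort-based metrics (locations, subsystems, complexity) into the charge where a bundled model simply meters capacity.

```python
# Illustrative comparison of two storage billing models.
# All rates and factors are invented for the example, not from any contract.

def per_gb_bill(gb_stored: float, rate_per_gb: float) -> float:
    """Bundled model: provider supplies hardware, software and services,
    so capacity is the natural (and transparent) billing metric."""
    return gb_stored * rate_per_gb


def services_only_bill(base_monthly_fee: float, locations: int,
                       subsystems: int, complexity_factor: float) -> float:
    """Services-only model: the client owns the assets, so effort-based
    metrics drive the charge instead of raw capacity."""
    return base_monthly_fee * (locations + subsystems) * complexity_factor


# 500 TB at an assumed $0.02/GB/month vs. a services-only engagement
# covering 3 locations and 5 subsystems at moderate complexity:
bundled = per_gb_bill(500_000, 0.02)
services = services_only_bill(1_000, locations=3, subsystems=5,
                              complexity_factor=1.25)
```

The point is not the numbers but the structure: capacity can shrink while management effort stays flat, which is one reason the industry keeps searching for an accepted alternative to per-GB pricing.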
- Cost and capacity drivers. The best way to control costs for storage is to implement strong capacity and demand management. Enterprises should institute policies for deleting data that no longer has value to the organization or is not required for compliance purposes. The idea that storage is cheap is a myth. Many newer technologies available today can help organizations optimize the use of their storage resources, such as thin provisioning, virtualized tiering (using all flash storage with inline compression and service performance characteristics to create virtual performance “tiers”) and storage management interfaces and tools that can manage storage platforms from multiple storage vendors. These technologies help organizations better utilize their resources and increase productivity for storage management. It should be noted that these technologies are most effective when they are coupled with best practices for requesting, storing, using, managing and life-cycling data.
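A deletion policy of the kind recommended above can be reduced to a simple rule: data is a candidate for removal only when it has no recent use and no compliance obligation. The sketch below assumes each dataset record carries a last-access date and a compliance-hold flag; the field names and the three-year retention window are hypothetical.

```python
# Minimal sketch of a data-retention eligibility check.
# The 3-year window and record fields are assumptions for illustration.

from datetime import date, timedelta

RETENTION = timedelta(days=365 * 3)  # assumed 3-year retention window


def eligible_for_deletion(last_accessed: date, on_compliance_hold: bool,
                          today: date) -> bool:
    """Data that is past its retention window and not required for
    compliance no longer has value worth paying storage costs for."""
    return not on_compliance_hold and (today - last_accessed) > RETENTION
```

Encoding the policy this way makes demand management auditable: the same rule that justifies a deletion can be run against the whole estate to forecast reclaimable capacity.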
- Technologies of the next-gen storage landscape. New storage technologies are disrupting the market today. These include software-defined storage (SDS), all-flash arrays, cloud-based storage, unified storage that manages storage area network (SAN) and network-attached storage (NAS) solutions under a single management layer, and data-aware storage, to name a few. The challenge is that storage vendors often use terminology differently. For example, VMware’s SDS solution vSAN aggregates local or direct-attached data storage devices to create a single storage pool shared across all hosts in the vSAN cluster. This solution eliminates the need for external shared storage and simplifies storage configuration and virtual machine provisioning. Meanwhile, NetApp’s SDS solution using virtual tiering essentially imposes service quality constraints on solid-state disk (SSD) or all-flash storage systems to create “virtual performance tiers.” By leveraging in-line compression and deduplication, this solution drives more effective capacity while only moderately impacting the performance advantage of SSD over traditional spinning disk. Both solutions are called “software-defined storage” and both optimize storage utilization, but they address different business needs and objectives.
- Forecasting and flexibility in storage service delivery models. Managed storage services, storage as a service, pay-as-you-go storage, cloud storage – what do these different storage delivery models mean for your enterprise data management strategies? Organizations are now leveraging multiple platforms and models for services, such as traditional on-prem services, private cloud, hybrid cloud and public cloud services. New models and approaches for delivering managed storage services also are emerging. Storage as a service can be delivered in an on-prem data center, for example, or in a private cloud environment. Unlike more traditional managed services, some storage-as-a-service or pay-as-you-go storage offerings no longer have a minimum volume commitment, and customers can flex their capacity upward and downward while paying only for the capacity used. This model is not unlike most public cloud storage offerings, but it offers greater control over where the storage infrastructure and the organization’s data reside. Keep in mind that the term “storage as a service” is receiving a lot of marketing hype, while the constraints and service terms vary significantly depending on the vendor and the customer’s capacity requirements – so buyer beware.
ISG helps enterprises transform their data storage solutions, navigate the data storage services market and assess their next-gen data management strategies. Please contact us to discuss how ISG can help your organization.
About the author
Dr. Cindy LaChapelle has broad expertise in technical strategy, project management, IT outsourcing, and data center performance assessments and transformations. She helps clients with data protection strategies and disaster recovery planning and analysis to enable cost-effective technology solutions. She has worked with global clients across many industries in the U.S., Canada, Europe and the Asia-Pacific region. Cindy has a BSc degree in honors chemistry from McGill University and a doctorate in planetary astronomy from the University of Arizona. She is a published thought leader on a wide range of business and technology topics and is ITIL V3 certified.