Data center infrastructure has had to accommodate a higher volume of workloads in recent years as businesses collect more data and do more with it. Hardware procurement decision-makers have had to place a premium on availability and reliability. Before advances in virtualization and cloud computing, those valuable assets were available only to enterprises with deep financial resources and the expertise to manage the infrastructure. However, industry demands have changed the market considerably, allowing companies of all sizes to adopt best-fit solutions, whether housed on-premises or in the cloud.

Richard Jenkins of DataCore Software recently explored IT evolution in relation to virtual storage environments, noting that virtual server provisioning has emerged as a cost-effective way to create highly available systems. It has also enabled companies to make more strategic hardware choices by raising utilization rates. Rather than invest in a completely new server, data center operators can now deploy a new VM to handle more workloads, and tasks are more easily shifted from one VM to another to maximize reliability. Although storage has always been an essential component of these systems, Jenkins pointed out that it took longer for companies to begin exploring virtualization in this area.

The early stages of virtualized storage
While servers became virtualized, many deployments still relied on shared storage from a single physical device, which created a single point of failure. Jenkins noted that many businesses turned to specialized, highly redundant storage arrays to reduce this risk, but that strategy failed to address the underlying issue and still left systems unavailable during maintenance windows. The first alternative to emerge was clustered storage, in which each device could replicate disks within the cluster; this approach suffered from poor integration with existing systems and was also cost-prohibitive. In response, some hardware manufacturers shifted toward virtualized storage. This model enables companies to combine powerful physical devices and multiple storage architectures so that they can custom-fit solutions to their unique needs. However, Jenkins warned that some virtual storage models lock customers into a particular company's hardware and built-in feature set.

“The weakness in this is that the firmware is physically bound to the hardware, limiting the consumer to a ‘like it or lump it’ choice of equipment at an inflexible price point,” Jenkins wrote. “On top of this, there comes the logical issue that although most have recognized the benefits of virtualization for servers, desktops and app’s, there are still many who have not made the connection that storage is the final stage of this, and that all of the benefits of virtualization can apply to storage as well.”

Cloud hardware providers could offer more value by decoupling the storage firmware from the physical devices. This translates to more freedom in technology decisions and often results in lower total cost of ownership than proprietary systems. Because the software and hardware are separate, businesses can reuse the software layer across many different instances rather than starting over for every new deployment.
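The decoupling described above can be illustrated with a minimal sketch: a software layer that pools physical devices from different vendors behind one logical provisioning interface, so capacity decisions no longer depend on any single vendor's hardware. All class and method names here are hypothetical, invented for illustration; they do not correspond to any real storage product's API.

```python
class PhysicalDevice:
    """A physical disk or array from any vendor (hypothetical model)."""
    def __init__(self, vendor, capacity_gb):
        self.vendor = vendor
        self.capacity_gb = capacity_gb
        self.used_gb = 0

    def free_gb(self):
        return self.capacity_gb - self.used_gb


class VirtualStoragePool:
    """Software layer presenting many heterogeneous devices as one pool."""
    def __init__(self, devices):
        self.devices = devices

    def total_free_gb(self):
        # The consumer sees one aggregate capacity figure.
        return sum(d.free_gb() for d in self.devices)

    def provision(self, size_gb):
        """Place a volume on the device with the most free space,
        regardless of which vendor built that device."""
        target = max(self.devices, key=lambda d: d.free_gb())
        if target.free_gb() < size_gb:
            raise RuntimeError("pool exhausted")
        target.used_gb += size_gb
        return target.vendor  # the caller never had to choose this up front


# Mixing hardware from two vendors in one logical pool:
pool = VirtualStoragePool([
    PhysicalDevice("vendor-a", 500),
    PhysicalDevice("vendor-b", 1000),
])
vendor = pool.provision(200)  # lands on whichever device has room
```

The point of the sketch is that the placement policy lives entirely in software, so the same logic can run unchanged as devices are added, retired, or swapped for another vendor's hardware.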

Offering the best solutions
Other experts have noticed the market's push toward accessible storage infrastructure that avoids vendor lock-in, and the trend's momentum may pick up sooner than many think. In a December blog post, for example, VMware's Christos Karamanolis predicted that 2013 would be the year of software-defined storage. He identified significant enterprise demand for tiered systems that support multiple layers of redundancy, as well as different levels of performance and capacity. However, moving toward open storage also creates an increasingly interconnected IT environment, meaning that the best solutions will likely come from providers willing to expand their expertise. Vendors within the Seagate Cloud Builder Alliance, for example, are able to combine leading technology at both the hardware and software levels with the expertise required to create the best solutions for their customers.
