Jon Toigo, Chairman, Data Management Institute; Augie Gonzalez, Director of Product Marketing, DataCore Software
Clouds have been touted as the next big thing in IT infrastructure for nearly a decade, yet larger enterprises have adopted the service delivery model more slowly than smaller firms. The reasons vary, but a key stumbling block has been the legacy storage infrastructure these companies have already deployed. Planners are reluctant to rip and replace legacy shared storage, especially when applications have been designed around a particular storage topology or when the anticipated returns on storage investments have not yet been realized.

The appeal of hybrid public-private cloud data centers is a given, and "build your base, buy your burst" is widely viewed as a sound strategy for next-generation data center design. What is still needed is technology that lets existing shared storage infrastructure be included and managed alongside software-defined and hyper-converged infrastructure models. Such technology must work with storage both on premises and in a public cloud, without sacrificing performance or resiliency.
The good news is that the technology exists today that can help organizations:
- Reduce I/O latency at the source by parallelizing raw I/O
- Retain shared storage infrastructure while embracing software-defined storage where it makes sense
- Deliver the right storage services to the right data based on what the data requires
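To make the first bullet concrete, here is a minimal sketch of the general idea behind parallelizing I/O: independent writes are dispatched concurrently across worker threads instead of serially, so total latency approaches that of the slowest request rather than the sum of all requests. The function names (`write_chunk`, `parallel_write`) and the thread-pool approach are illustrative assumptions for this sketch, not DataCore's actual implementation.

```python
# Illustrative sketch only: dispatch independent writes concurrently
# rather than one after another. This is the general concept of
# parallelized I/O, not DataCore's implementation.
import os
import tempfile
from concurrent.futures import ThreadPoolExecutor


def write_chunk(path: str, data: bytes) -> int:
    """Write one chunk to its own file and return the byte count."""
    with open(path, "wb") as f:
        return f.write(data)


def parallel_write(chunks: dict, workers: int = 4) -> int:
    """Fan independent chunk writes out across a pool of worker threads."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        futures = [pool.submit(write_chunk, path, data)
                   for path, data in chunks.items()]
        # Wait for every write and total the bytes written.
        return sum(f.result() for f in futures)


if __name__ == "__main__":
    with tempfile.TemporaryDirectory() as tmp:
        chunks = {os.path.join(tmp, f"chunk{i}.bin"): b"x" * 1024
                  for i in range(8)}
        print(parallel_write(chunks))  # 8192
```

Because the eight writes are independent, they can proceed in parallel; serializing them would instead accumulate each request's latency end to end.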
See how DataCore can help you bring cloud infrastructure to your data center in an intelligent way.