
Software-Defined Storage

  • Bringing Cloud Infrastructure to Your Data Center | Jon Toigo, Chairman, Data Management Institute; Augie Gonzalez, Director of Product Marketing, DataCore Software | Recorded: Nov 16 2016 | 50 mins
    Clouds have been touted as the next big thing in IT infrastructure for nearly a decade, but larger enterprises have been slower to adopt the service delivery model than smaller firms. The reasons are several, but a key stumbling block has been the legacy storage infrastructure that companies have already deployed. Planners are reluctant to rip and replace legacy shared storage, especially where applications have been designed around a particular storage topology or where the anticipated returns on storage investments have not yet been realized.

    While the appeal of hybrid public-private cloud data centers is a given, and the concept of building your base and buying your burst is viewed as a sound strategy for next-generation data center design, there is still a need for technology that enables existing shared storage infrastructure to be included and managed alongside software-defined and hyper-converged infrastructure models. Such technology must work with storage infrastructure both on premises and in a public cloud, without sacrificing performance or resiliency.

    The good news is that the technology exists today that can help organizations:

    - Reduce I/O latency at the source, by parallelizing raw I/O output
    - Retain shared storage infrastructure while embracing software-defined storage where it makes sense
    - Deliver the right storage services to the right data based on what the data requires

    See how DataCore can help you bring cloud infrastructure to your data center in an intelligent way.
  • Fulfilling the Promise of Infrastructure Consolidation | Jon Toigo, Chairman, Data Management Institute; Augie Gonzalez, Director of Product Marketing, DataCore Software | Recorded: Oct 12 2016 | 58 mins
    In the early 2000s, server virtualization was hailed by hypervisor vendors as the key to data center infrastructure consolidation and huge CAPEX cost reductions. In many firms, it didn’t quite pan out that way.

    A new generation of thought leaders is arguing that hyper-converged infrastructure will pick up where hypervisor computing left off. The claims of CAPEX and OPEX improvements may sound familiar, but the technology worldview has expanded to include not only application workload and server hardware, but also software-defined networking and storage.

    Join us in this webinar for a frank and thoughtful discussion of hyper-convergence and how it might well be the key to hyper-consolidation.
  • Microsoft Ignite 2016 Review: Top Trends You Need to Know | Sushant Rao, Sr. Director of Product Marketing, DataCore Software; Todd Mace, Tech Evangelist, DataCore Software | Recorded: Oct 5 2016 | 47 mins
    Did you attend Microsoft Ignite 2016? If so, then let us summarize the top trends we observed at the show and compare notes with you.
    Missed the event? Don’t worry, we have you covered. Join us as we cover the top trends and highlights of the news and interesting tidbits from Ignite. It’s the next best thing to being there!
  • VMworld 2016 Wrap Up: Top Trends You Need to Know | Sushant Rao, Sr. Director of Product Marketing, DataCore Software | Recorded: Sep 22 2016 | 48 mins
    Did you attend VMworld 2016? If so, then let us summarize the top trends we observed at the show and compare notes with you.
    Missed the event? Don’t worry, we have you covered. Join us as we cover the top trends and highlights of the news and interesting tidbits from VMworld. It’s the next best thing to being there!
  • The Fallacy of Using IOPS to Measure I/O Performance | Sushant Rao, Sr. Director of Product Marketing, DataCore Software | Recorded: Sep 21 2016 | 40 mins
    Surveys show that 61% of companies have experienced slow applications after server virtualization, with 77% pointing to I/O problems as the culprit. As a result, companies are desperately trying to improve their I/O performance.

    This raises the question of which metric IT departments should use to evaluate the performance of storage products.

    The most common performance measurement is IOPS. But is this the best metric? IOPS depends on a number of factors, including the type of workload being simulated, read/write percentages, sequential/random percentages, and block size.

    Most importantly, IOPS doesn’t indicate the response time.

    So, what metric should be used to measure I/O performance?

    In this webinar, we’ll discuss Latency Curves, which show the response time the application sees as the storage product handles more and more I/O (measured in IOPS). This measure will help you understand why some storage products can claim a high number of IOPS yet still perform poorly under peak demand.
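
    As a rough, hypothetical illustration of the point (not part of the webinar content), Little's Law ties the two measures together: average latency = outstanding I/Os / IOPS. The short Python sketch below uses made-up numbers to show how two storage systems can post the same IOPS figure while delivering very different response times.

        # Hypothetical illustration: identical IOPS, very different response times.
        # Little's Law: average latency = outstanding I/Os / IOPS.

        def avg_latency_ms(iops, outstanding_ios):
            """Average response time in milliseconds for a given throughput and queue depth."""
            return outstanding_ios / iops * 1000.0

        # System A sustains 100,000 IOPS with 16 outstanding I/Os.
        print(avg_latency_ms(100_000, 16))   # 0.16 ms per I/O

        # System B posts the same 100,000 IOPS, but only by queuing 512 outstanding I/Os.
        print(avg_latency_ms(100_000, 512))  # 5.12 ms per I/O: same IOPS, 32x the latency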
  • How hyper converged infrastructures transform databases | Augie Gonzalez, Storage Virtualisation Specialist | Recorded: Sep 8 2016 | 45 mins
    HyperConverge like a Pro - part 2

    Regardless of your industry, databases form the core of your profitability. Whether online transaction processing systems, Big Data analytics systems, or reporting systems, databases manage your most important information – the kind of data that directly supports decisions and provides immediate feedback on business actions and results. The performance of databases has a direct bearing on the profitability of your organization, so smart IT planners are always looking for ways to improve the performance of databases and the apps that use them.

    Join Augie Gonzalez, Subject Matter Expert, DataCore, to see if hyper-convergence holds an answer to reducing latency and driving performance in database operations. But be careful: not all hyper-converged solutions show dramatic improvements across the I/O path.
  • Next-generation Infrastructure Consolidation: Going Beyond Server Virtualization | David Floyer, Chief Technology Officer, Wikibon; Sushant Rao, Sr. Director of Product Marketing, DataCore Software | Recorded: Aug 23 2016 | 48 mins
    Server virtualization was supposed to consolidate and simplify IT infrastructure in data centers. But that only “sort of happened”. Companies do have fewer servers, but they never hit the consolidation ratios they expected. Why? In one word, performance.

    Surveys show that 61% of companies have experienced slow applications after server virtualization, with 77% pointing to I/O problems as the culprit.

    Now, companies are looking to take the next step to fulfill their vision of consolidating and reducing the complexity of their infrastructure. But this will only happen if their applications get the I/O performance they need.

    This is where DataCore’s Parallel I/O technology comes in. By processing I/Os in parallel, leveraging multi-core, multi-processor systems, Parallel I/O delivers industry-leading I/O response times as well as price/performance. The net benefit is that fewer storage nodes can provide much better performance, allowing you to reduce and simplify your infrastructure.
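
    As a simplified, hypothetical sketch of the general idea of issuing independent I/O requests concurrently across cores (not DataCore's actual implementation), the Python snippet below compares serial submission with parallel submission of block reads; the file name and block layout are made up for illustration.

        # Hypothetical sketch: issue independent I/O requests in parallel across workers.
        # Illustrates the general concept only, not DataCore's Parallel I/O technology.
        import os
        from concurrent.futures import ThreadPoolExecutor

        BLOCK_SIZE = 4096

        def read_block(path, offset):
            """Read one 4 KB block at the given byte offset."""
            with open(path, "rb") as f:
                f.seek(offset)
                return f.read(BLOCK_SIZE)

        def read_blocks_parallel(path, offsets, workers=None):
            """Issue all block reads concurrently rather than one after another."""
            workers = workers or os.cpu_count()
            with ThreadPoolExecutor(max_workers=workers) as pool:
                return list(pool.map(lambda off: read_block(path, off), offsets))

        # Example (hypothetical file): fetch 1,000 scattered 4 KB blocks in parallel.
        # offsets = [i * BLOCK_SIZE for i in range(1000)]
        # blocks = read_blocks_parallel("data.bin", offsets)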
  • Bringing the Speed of Flash to Your Software-Defined Storage Infrastructure | Augie Gonzalez, Director Product Marketing, DataCore; Steve Wilkins, Manager Product Marketing, SanDisk | Recorded: Aug 23 2016 | 38 mins
    Do you run a mix of virtualized and diverse workloads, including block storage? Are you looking to increase density and maintain blazingly fast speeds? If so, this webinar is for you!

    In this webinar, speakers from DataCore and SanDisk will discuss the performance and economic advantages of combining software-defined storage with all-flash storage. We’ll also share two customer stories about how these organizations were able to:

    - Achieve effortless and non-disruptive data migration from magnetic to flash storage
    - Prevent storage-related downtime
    - Dynamically control the movement of data from flash to high-capacity storage
    - Strike the right economic balance between fast performance and low cost

    Don’t let data growth and complex workloads slow you down. Attend this webinar and learn about new possibilities.
  • When Bad Things Happen to Good Data: Plan For & Achieve Business Continuity | Jon Toigo, Chairman, Data Management Institute | Recorded: Jul 28 2016 | 46 mins
    High Availability doesn’t trump Disaster Recovery, and there is nothing simple about creating a recovery capability for your business – unless you have a set of data protection and business continuity services that can be applied intelligently to your workloads, managed centrally, and tested non-disruptively. The good news is that developing such a capability, which traditionally meant selecting among multiple point-product solutions and then struggling to fit them into a coherent disaster prevention and recovery framework, just got a lot easier.

    Join us and learn how DataCore’s Software-Defined and Hyper-Converged Storage platform provides the tools and the service management methodology you need to build a fully functional recovery strategy at a cost you can afford.
  • Improving the Performance of Microsoft SQL Server with Parallel I/O | Alan Porter, Solutions Architect, DataCore Software & Steve Hunsaker, Solutions Architect, DataCore Software | Recorded: Jul 19 2016 | 34 mins
    The virtualization wave is beginning to stall as companies confront application performance problems that can no longer be addressed effectively.

    DataCore’s Parallel I/O breakthrough not only solves the immediate performance problem facing multi-core virtualized environments, but it significantly increases the VM density possible per physical server. In effect, it achieves remarkable cost reductions through maximum utilization of CPUs, memory and storage while fulfilling the promise of virtualization.

    Join us for this webinar where we will take an inside look into DataCore’s Parallel I/O technology and show you what it can do for businesses running Microsoft SQL Server to improve the performance of database-driven applications.