
Best Practices for Creating Highly Available Architectures

Got High Availability for your data storage? Join Tim Warden as he presents his findings from years in the field on how best to protect data and prevent both planned and unplanned downtime. Drawing on his background in Data General Corporation's CLARiiON storage division, Tim brings some of the best ideas on how to architect highly available systems for SMEs and large enterprises. With an emphasis on data protection and redundancy for High Availability systems, Mr. Warden discusses how to prevent unplanned downtime and shares his six criteria for a well-defined architecture: Redundancy, Autonomy, Separation, Asymmetry, Diversity, and Polylithic Design.
Recorded: Aug 4 2014 21 mins
Presented by
Tim Warden, Senior Solutions Architect at DataCore Software
Presentation preview: Best Practices for Creating Highly Available Architectures

  • Meet DataCore: Data Infrastructure for Next-Generation Data Centers Jan 10 2017 7:00 pm UTC 45 mins
    Tara Callan, Marketing Manager, DataCore Software
    We think differently. We innovate through software and challenge the IT status quo.

    We pioneered software-based storage virtualization. Now, we are leading the Software-defined and Parallel Processing revolution. Our Application-adaptive software exploits the full potential of servers and storage to solve data infrastructure challenges and elevate IT to focus on the applications and services that power their business.

    DataCore parallel I/O and virtualization technologies deliver the advantages of next generation enterprise data centers – today – by harnessing the untapped power of multicore servers. DataCore software solutions revolutionize performance, cost-savings, and productivity gains businesses can achieve from their servers and data storage.

    Join this webinar to meet DataCore, learn about what we do and how we can help your business.
  • AWS re:Invent 2016 Wrap Up: Top Trends You Need to Know Dec 15 2016 3:00 pm UTC 45 mins
    Todd Mace, Tech Evangelist, DataCore Software; Sushant Rao, Sr. Director of Product Marketing, DataCore Software
    Did you attend AWS re:Invent 2016? If so, then let us summarize the top trends we observed at the show and compare notes with you.

    Missed the event? Don’t worry, we have you covered. Join us as we cover the top trends and highlights of the news and interesting tidbits from re:Invent. It’s the next best thing to being there!
  • Meet DataCore: Data Infrastructure for Next-Generation Data Centers Recorded: Dec 6 2016 42 mins
    Haluk Ulubay, Sr. Director of Marketing, DataCore Software
    We think differently. We innovate through software and challenge the IT status quo.

    We pioneered software-based storage virtualization. Now, we are leading the Software-defined and Parallel Processing revolution. Our Application-adaptive software exploits the full potential of servers and storage to solve data infrastructure challenges and elevate IT to focus on the applications and services that power their business.

    DataCore parallel I/O and virtualization technologies deliver the advantages of next generation enterprise data centers – today – by harnessing the untapped power of multicore servers. DataCore software solutions revolutionize performance, cost-savings, and productivity gains businesses can achieve from their servers and data storage.

    Join this webinar to meet DataCore, learn about what we do and how we can help your business.
  • Bringing Cloud Infrastructure to Your Data Center Recorded: Nov 16 2016 50 mins
    Jon Toigo, Chairman, Data Management Institute; Augie Gonzalez, Director of Product Marketing, DataCore Software
    Clouds have been touted as the next big thing in IT infrastructure for nearly a decade, but larger enterprises have been slower to adopt the service delivery model than smaller firms. The reasons are several, but a key stumbling block has been the legacy storage infrastructure that companies have already deployed. Planners are reluctant to rip and replace legacy shared storage, especially where applications have been designed around a particular storage topology or where anticipated returns on storage investments have not yet been realized.

    While the appeal of hybrid public-private cloud data centers is a given, and the concept of building your base and buying your burst is viewed as a sound strategy for next-generation data center design, there is still a need for technology that enables existing shared storage to be included and managed alongside software-defined and hyper-converged infrastructure models. Such technology must work with storage infrastructure both on premises and in a public cloud, without sacrificing performance or resiliency.

    The good news is that the technology exists today that can help organizations:

    - Reduce I/O latency at the source, by parallelizing raw I/O output
    - Retain shared storage infrastructure while embracing software-defined storage where it makes sense
    - Deliver the right storage services to the right data based on what the data requires

    See how DataCore can help you bring cloud infrastructure to your data center in an intelligent way.
  • Fulfilling the Promise of Infrastructure Consolidation Recorded: Oct 12 2016 58 mins
    Jon Toigo, Chairman, Data Management Institute; Augie Gonzalez, Director of Product Marketing, DataCore Software
    In the early 2000s, server virtualization was hailed by hypervisor vendors as the key to data center infrastructure consolidation and huge CAPEX cost reductions. In many firms, it didn’t quite pan out that way.

    A new generation of thought leaders is arguing that hyper-converged infrastructure will pick up where hypervisor computing left off. The claims of CAPEX and OPEX improvements may sound familiar, but the technology worldview has expanded to include not only application workload and server hardware, but also software-defined networking and storage.

    Join us in this webinar for a frank and thoughtful discussion of hyper-convergence and how it might well be the key to hyper-consolidation.
  • Microsoft Ignite 2016 Review: Top Trends You Need to Know Recorded: Oct 5 2016 47 mins
    Sushant Rao, Sr. Director of Product Marketing, DataCore Software; Todd Mace, Tech Evangelist, DataCore Software
    Did you attend Microsoft Ignite 2016? If so, then let us summarize the top trends we observed at the show and compare notes with you.
    Missed the event? Don’t worry, we have you covered. Join us as we cover the top trends and highlights of the news and interesting tidbits from Ignite. It’s the next best thing to being there!
  • VMworld 2016 Wrap Up: Top Trends You Need to Know Recorded: Sep 22 2016 48 mins
    Sushant Rao, Sr. Director of Product Marketing, DataCore Software
    Did you attend VMworld 2016? If so, then let us summarize the top trends we observed at the show and compare notes with you.
    Missed the event? Don’t worry, we have you covered. Join us as we cover the top trends and highlights of the news and interesting tidbits from VMworld. It’s the next best thing to being there!
  • The Fallacy of Using IOPS to Measure I/O Performance Recorded: Sep 21 2016 40 mins
    Sushant Rao, Sr. Director of Product Marketing, DataCore Software
    Surveys show that 61% of companies have experienced slow applications after server virtualization, with 77% pointing to I/O problems as the culprit. So companies are desperately trying to improve their I/O performance.

    This brings up the question of what measure should IT departments use to evaluate storage products’ performance.

    The most common performance measurement is IOPS. But is this the best metric? IOPS depends on a number of factors, including the type of workload being simulated, read/write percentages, sequential/random percentages, and block size.

    Most importantly, IOPS doesn’t indicate the response time.

    So, what metric should be used to measure I/O performance?

    In this webinar, we’ll discuss Latency Curves, which show the response time an application sees as the storage product handles more and more I/O (measured in IOPS). This measure will help you understand why some storage products can claim a high number of IOPS yet still perform poorly under peak demand.
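The IOPS-versus-latency relationship described above can be made concrete with Little's Law, which ties mean response time to throughput and the number of outstanding I/Os. A minimal sketch with illustrative numbers (not figures from the webinar):

```python
def mean_latency_ms(iops: float, queue_depth: int) -> float:
    """Little's Law for a storage queue:
    mean latency = outstanding I/Os / throughput (IOPS).
    Result converted from seconds to milliseconds."""
    return queue_depth / iops * 1000.0

# Two hypothetical arrays advertising the same 200,000 IOPS:
# one sustains it at queue depth 8, the other only reaches it
# at queue depth 256 -- same headline IOPS, very different latency.
print(f"QD 8:   {mean_latency_ms(200_000, 8):.2f} ms")    # ~0.04 ms
print(f"QD 256: {mean_latency_ms(200_000, 256):.2f} ms")  # ~1.28 ms
```

Plotting latency against IOPS as the load increases yields exactly the latency curve the webinar describes: a product that only reaches its peak IOPS at very deep queues shows response times climbing steeply under peak demand.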
  • How hyper converged infrastructures transform databases Recorded: Sep 8 2016 45 mins
    Augie Gonzalez, Storage Virtualisation Specialist
    Part 2 of How to Hyper-converge like a Pro

    Regardless of your industry, databases form the core of your profitability. Whether online transaction processing systems, Big Data analytics systems, or reporting systems, databases manage your most important information – the kind of data that directly supports decisions and provides immediate feedback on business actions and results. The performance of databases has a direct bearing on the profitability of your organization, so smart IT planners are always looking for ways to improve the performance of databases and the apps that use them.

    Join Augie Gonzalez, Subject Matter Expert at DataCore, to see whether hyper-convergence holds an answer to reducing latency and driving performance in database operations. But be careful: not all hyper-converged solutions show dramatic improvements across the I/O path.
  • Next-generation Infrastructure Consolidation: Going Beyond Server Virtualization Recorded: Aug 23 2016 48 mins
    David Floyer, Chief Technology Officer, Wikibon; Sushant Rao, Sr. Director of Product Marketing, DataCore Software
    Server virtualization was supposed to consolidate and simplify IT infrastructure in data centers. But that only “sort of happened”: companies do have fewer servers, but they never hit the consolidation ratios they expected. Why? In one word, performance.

    Surveys show that 61% of companies have experienced slow applications after server virtualization, with 77% pointing to I/O problems as the culprit.

    Now, companies are looking to take the next step to fulfill their vision of consolidating and reducing the complexity of their infrastructure. But, this will only happen if their applications get the I/O performance they need.

    This is where DataCore’s Parallel I/O technology comes in. By processing I/Os in parallel across multi-core, multi-processor systems, Parallel I/O delivers industry-leading I/O response times as well as price/performance. The net benefit is that fewer storage nodes can provide much better performance, allowing you to reduce and simplify your infrastructure.
  • Bringing the Speed of Flash to Your Software-Defined Storage Infrastructure Recorded: Aug 23 2016 38 mins
    Augie Gonzalez, Director Product Marketing, DataCore; Steve Wilkins, Manager Product Marketing, SanDisk
    Do you run a mix of virtualized and diverse workloads, including block storage? Are you looking to increase density and maintain blazingly fast speeds? If so, this webinar is for you!

    In this webinar, speakers from DataCore and SanDisk will discuss the performance and economic advantages of combining software-defined storage with all-flash storage. We’ll also share two customer stories on how they were able to:

    - Achieve effortless and non-disruptive data migration from magnetic to flash storage
    - Prevent storage-related downtime
    - Dynamically control the movement of data from flash to high-capacity storage
    - Strike the right economic balance between fast performance and low cost

    Don’t let data growth and complex workloads slow you down. Attend this webinar and learn about new possibilities.
  • When Bad Things Happen to Good Data: Plan For & Achieve Business Continuity Recorded: Jul 28 2016 46 mins
    Jon Toigo, Chairman, Data Management Institute
    High Availability doesn’t trump Disaster Recovery, and there is nothing simple about creating a recovery capability for your business – unless you have a set of data protection and business continuity services that can be applied intelligently to your workload, managed centrally, and tested non-disruptively. The good news is that developing such a capability, which traditionally meant selecting among multiple point products and then struggling to fit them into a coherent disaster prevention and recovery framework, just got a lot easier.

    Join us and learn how DataCore’s Software-Defined and Hyper-Converged Storage platform provides the tools you need and a service management methodology you require to build a fully functional recovery strategy at a cost you can afford.
  • Improving the Performance of Microsoft SQL Server with Parallel I/O Recorded: Jul 19 2016 34 mins
    Alan Porter, Solutions Architect, DataCore Software & Steve Hunsaker, Solutions Architect, DataCore Software
    The virtualization wave is beginning to stall as companies confront application performance problems that can no longer be addressed effectively.

    DataCore’s Parallel I/O breakthrough not only solves the immediate performance problem facing multi-core virtualized environments, but it significantly increases the VM density possible per physical server. In effect, it achieves remarkable cost reductions through maximum utilization of CPUs, memory and storage while fulfilling the promise of virtualization.

    Join us for this webinar where we will take an inside look into DataCore’s Parallel I/O technology and show you what it can do for businesses running Microsoft SQL Server to improve the performance of database-driven applications.
  • How to Integrate Hyper-converged Systems with Existing SANs Recorded: Jun 29 2016 41 mins
    Augie Gonzalez, Storage Virtualization Specialist
    Part 1 of How to Hyper-converge like a Pro:

    Hyper-converged systems offer a great deal of promise, yet they come with a set of limitations. While they allow enterprises to re-integrate system components into a single enclosure and reduce the physical complexity, floor space, and cost of supporting a workload in the data center, they often will not support existing storage in local SANs or storage offered by cloud service providers. Solutions are available to address these challenges and allow hyper-converged systems to realize their promise. During this session you will learn:

    • What are hyper-converged systems?
    • What challenges do they pose?
    • What should the ideal solution to those challenges look like?
    • About a solution that helps integrate hyper-converged systems with existing SANs
  • Meet DataCore: Data Infrastructure for Next-Generation Data Centers Recorded: Jun 7 2016 29 mins
    Sushant Rao, Sr. Director of Product Marketing, DataCore Software
    We think differently. We innovate through software and challenge the IT status quo.

    We pioneered software-based storage virtualization. Now, we are leading the Software-defined and Parallel Processing revolution. Our Application-adaptive software exploits the full potential of servers and storage to solve data infrastructure challenges and elevate IT to focus on the applications and services that power their business.

    DataCore parallel I/O and virtualization technologies deliver the advantages of next generation enterprise data centers – today – by harnessing the untapped power of multicore servers. DataCore software solutions revolutionize performance, cost-savings, and productivity gains businesses can achieve from their servers and data storage.

    Join this webinar to meet DataCore, learn about what we do and how we can help your business.
  • Data Center Consolidation and Simplification with Hyper-converged Recorded: May 18 2016 38 mins
    Sushant Rao, Sr. Director of Product Marketing, DataCore Software
    Server virtualization was supposed to consolidate and simplify IT infrastructure in data centers. But that only “sort of happened”: companies do have fewer servers, but they never hit the consolidation ratios they expected. Why? In one word, performance.

    Surveys show that 61% of companies have experienced slow applications after server virtualization, with 77% pointing to I/O problems as the culprit.

    Now, with hyper-converged infrastructure, companies have another opportunity to fulfill their vision of consolidating and reducing the complexity of their infrastructure. But this will only happen if their applications get the I/O performance they need.

    Join us for this webinar where we will show you how to get industry-leading I/O response times and the best price/performance so you can reduce and simplify your infrastructure.
  • When Bad Things Happen to Good Data: Plan For and Achieve Business Continuity Recorded: Apr 27 2016 48 mins
    Jon Toigo, Chairman, Data Management Institute
    High Availability doesn’t trump Disaster Recovery, and there is nothing simple about creating a recovery capability for your business – unless you have a set of data protection and business continuity services that can be applied intelligently to your workload, managed centrally, and tested non-disruptively. The good news is that developing such a capability, which traditionally meant selecting among multiple point products and then struggling to fit them into a coherent disaster prevention and recovery framework, just got a lot easier.

    Join us and learn how DataCore’s Software-Defined Storage platform provides the tools you need and a service management methodology you require to build a fully functional recovery strategy at a cost you can afford.
  • Driving Down Hardware Costs for High Performance Databases Recorded: Apr 5 2016 44 mins
    Dan Kusnetzky, Distinguished Analyst Founder, Kusnetzky Group LLC; Augie Gonzalez, Director of Product Marketing, DataCore
    Learn how to super-charge high-performance database applications while slicing hardware costs.

    High-performance databases are at the heart of many applications, from extreme transaction processing to Big Data, engineering and research, and the emerging Internet of Things. A common thread is the requirement for very fast database access combined with very large databases. Meeting these requirements has traditionally required expensive and complex data storage strategies. Storage virtualization, Parallel I/O, and intelligent caching can address them at a much lower cost.

    Join us as industry analyst, Dan Kusnetzky provides some insight on the following:

    • Why are high-performance databases required?
    • Why does that lead to an expensive storage infrastructure?
    • What new technology can provide the needed performance at a lower cost?
  • Building High Performance Infrastructure for Databases Recorded: Mar 29 2016 44 mins
    Jon Toigo, Chairman, Data Management Institute; Sushant Rao, Sr. Director of Product Marketing, DataCore Software
    Learn how to reduce latency and improve performance in your database environment without expensive hardware rip and replace.

    Regardless of your industry, chances are that databases form the core of your profitability. Whether online transaction processing systems, Big Data analytics systems, or reporting systems, databases manage your most important information – the kind of data that directly supports decisions and provides immediate feedback on business actions and results. The performance of databases has a direct bearing on the profitability of your organization, and these days, with 70 percent of respondents to one recent survey stating that IT must justify its budget by demonstrating real contributions to the bottom line, smart IT planners are always looking for ways to improve the performance of databases and the apps that use them.

    Many in the industry are pitching expensive flash storage peripherals to reduce latency and drive performance in database operations, but what is really needed is improvement across the I/O path – cost-effective improvements to infrastructure that will yield measurable gains not only in database processing, but also in the extract-transform-load workflows that define overall performance efficiency.

    Join us as industry analyst, Jon Toigo provides an overview of a strategy you can use to reduce latency and improve database performance without breaking the bank.
  • Integrating Hyper-converged Systems with Existing SANs Recorded: Feb 24 2016 37 mins
    Dan Kusnetzky, Distinguished Analyst Founder, Kusnetzky Group LLC; Augie Gonzalez, Director of Product Marketing, DataCore
    Hyper-converged systems offer a great deal of promise, yet they come with a set of limitations. While they allow enterprises to re-integrate system components into a single enclosure and reduce the physical complexity, floor space, and cost of supporting a workload in the data center, they often will not support existing storage in local SANs or storage offered by cloud service providers. Solutions are available to address these challenges and allow hyper-converged systems to realize their promise. During this session you will learn:

    - What are hyper-converged systems?
    - What challenges do they pose?
    - What should the ideal solution to those challenges look like?
    - About a solution that helps integrate hyper-converged systems with existing SANs
All you need to know about Hyper-converged & Software-Defined Storage
Organizations around the world are experiencing the benefits of increased performance, continuous business operations and lower costs through server consolidation and virtualization. Many of them are now evaluating Hyper-converged systems and Software-Defined Storage to develop their next-generation storage architectures. Join this channel to hear industry and technical experts discuss the value of Hyper-converged and Software-Defined Storage and how you can take advantage of these technologies.

  • Title: Best Practices for Creating Highly Available Architectures
  • Live at: Aug 4 2014 6:00 pm
  • Presented by: Tim Warden, Senior Solutions Architect at DataCore Software