
WANdisco | Big Data & Cloud

  • How to Empower Your Cloud Object Storage Strategy with LIVE DATA
    Paul Scott-Murphy, VP of Product Management, WANdisco Recorded: Feb 7 2018 52 mins
    Learn how you can break through the limitations of cloud object storage implementations with a LIVE DATA strategy. Moving to the cloud frees companies from their on-premises confines, and cloud object storage enables vast scalability and flexibility for data in its native format. Yet cloud object storage applications present their own challenges. Legacy solutions dictate which options can be deployed, while the risk of vendor lock-in from choosing a single provider restricts future implementations and hampers pricing negotiation. A live data platform that enables active replication ensures that data is always available, always consistent, and always protected, fulfilling the promise of cloud object storage.

    Join us as Paul Scott-Murphy, WANdisco VP of Product Management, discusses the benefits of a cloud object storage approach empowered with a live data platform to help companies break down data silos and implement a robust multi-cloud strategy.

    In this webinar we will discuss:

    - When and why you should consider cloud object storage solutions
    - Key cloud object storage platforms available (S3, Dell EMC, IBM)
    - Limitations of going with a single vendor, and benefits of multi-cloud
    - Need for and benefits of LIVE DATA strategy in a multi-cloud environment
    Register now to learn how to implement a cloud object storage initiative while maintaining flexibility for the future and maximizing current capabilities through a live data strategy.
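
    As a rough illustration of the multi-cloud idea above, the sketch below simply writes the same object to two S3-compatible stores; it is a naive dual write for clarity, not WANdisco's LIVE DATA replication, and the endpoint URL, bucket names, and object key are hypothetical placeholders.

    ```python
    # Sketch: write one object to two S3-compatible object stores.
    # Endpoint URL, bucket names, and key are hypothetical placeholders.
    import boto3

    payload = b"example record"
    targets = [
        # AWS S3 (default endpoint)
        {"endpoint_url": None, "bucket": "example-bucket-aws"},
        # A second S3-compatible store, e.g. an on-premises or alternative-cloud endpoint
        {"endpoint_url": "https://objects.example.internal", "bucket": "example-bucket-alt"},
    ]

    for target in targets:
        s3 = boto3.client("s3", endpoint_url=target["endpoint_url"])
        s3.put_object(Bucket=target["bucket"], Key="data/record-0001", Body=payload)
    ```

    A dual write like this offers none of the consistency guarantees discussed in the webinar; it only shows what keeping the same data in more than one object store looks like at the API level.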
  • Extend On-premise Deployments to Azure HDInsight in Real-time
    Paul Scott-Murphy, VP of Product Management, WANdisco, Pranav Rastogi, Program Manager, Microsoft Recorded: Oct 25 2017 54 mins
    Join us to learn how you can achieve continuous replication at scale between multiple Big Data and Cloud environments with guaranteed data consistency and continuous availability.

    Pranav Rastogi, Microsoft Program Manager, and Paul Scott-Murphy, WANdisco VP of Product Management, will discuss how Microsoft customers can achieve a truly hybrid architecture for on-demand data analytics and offsite disaster recovery via a single-click installation from the Azure Marketplace.

    During this webinar you will also learn:

    - How to meet stringent data availability and compliance requirements whilst seamlessly moving production data at petabyte scale from on-premises Big Data deployments to MS Azure.
    - How to save money and achieve greater ROI with no need for dedicated hardware or storage systems.
    - How to enable data to span HDFS, object storage, and other systems.
    - How to select subsets of content for replication with fine-grained control over where data resides.
    - How to recover from intermittent network or system failures automatically.
    Register now to learn how to get the most value from your data while also meeting strict service level agreements.
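
    For orientation, the sketch below lands a single on-premises file in the Azure Blob Storage that an HDInsight cluster can read; it is a one-off batch upload, not the continuous replication discussed in the webinar, and the connection string, container, and paths are hypothetical placeholders.

    ```python
    # Sketch: one-off copy of a local file into Azure Blob Storage for HDInsight.
    # Connection string, container, and paths are hypothetical placeholders.
    from azure.storage.blob import BlobServiceClient

    conn_str = "<your-storage-account-connection-string>"
    service = BlobServiceClient.from_connection_string(conn_str)
    blob = service.get_blob_client(container="landing", blob="events/part-0000.parquet")

    with open("/data/events/part-0000.parquet", "rb") as f:
        blob.upload_blob(f, overwrite=True)
    ```

    Unlike a point-in-time copy such as this, the approach described above keeps source and target consistent while both continue to change.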
  • Keeping Subversion In Sync Across Global Data Centers: Case Study with MaxLinear
    Russ Hill, Account & Renewals Manager Americas at WANdisco, Owen Ofiesh, Software Configuration Manager at MaxLinear Recorded: Aug 24 2017 45 mins
    Join us to learn how MaxLinear relies on WANdisco to improve productivity with Subversion MultiSite, delivering results such as:

    - A 24/7 continuous integration environment with zero Subversion downtime
    - Improved administrative efficiencies with Access Control
    - Elimination of the effects of network failures and the dependency on legacy backup procedures
    - Overcoming the challenges with Subversion mirrors
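
    For context on the Subversion mirrors mentioned in the last point, the legacy approach is a read-only svnsync mirror that must be initialized and then periodically pulled up to date; a minimal sketch, with hypothetical repository URLs, follows.

    ```python
    # Sketch of a stock read-only svnsync mirror, the legacy approach whose
    # limitations the webinar contrasts with MultiSite. URLs are hypothetical.
    import subprocess

    MASTER = "https://svn.example.com/repos/project"
    MIRROR = "https://svn-mirror.example.com/repos/project"

    # One-time setup: point the mirror repository at the master.
    subprocess.run(["svnsync", "initialize", MIRROR, MASTER], check=True)

    # Periodic catch-up: copy any new revisions from the master to the mirror.
    subprocess.run(["svnsync", "synchronize", MIRROR], check=True)
    ```

    The mirror remains read-only and falls behind between synchronization runs, which is the gap MultiSite's replication is meant to close.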

    About the Presenters:

    Russ Hill, Account & Renewals Manager Americas at WANdisco. Russ works with WANdisco's existing SCM install base as an account manager and renewals specialist. He works closely with the WANdisco Professional Services team on all SCM service opportunities in North America and is currently responsible for all new SCM opportunities within the Americas.

    Owen Ofiesh, Software Configuration Manager at MaxLinear, a global chip design firm. With over 15 years of experience in configuration management, he has a strong background in many of the most common SCM tools and platforms. Owen has worked with WANdisco Subversion MultiSite for over six years and has a thorough understanding of how it compares and contrasts with other SCM tools.
  • Disaster Recovery for Hadoop
    Paul Scott-Murphy, WANdisco VP Product Management Big Data/Cloud Recorded: May 11 2017 40 mins
    Join us as Paul Scott-Murphy, WANdisco VP of Product Management, discusses disaster recovery for Hadoop. Learn how to fully operationalize Hadoop to exceed the most demanding SLAs across clusters running any mix of distributions any distance apart, including how to:

    - Enable continuous read/write access to data for automated forward recovery in the event of an outage
    - Eliminate the expense of hardware and other infrastructure normally required for DR on-premises
    - Handle out-of-sync conditions with guaranteed consistency across clusters
    - Prevent administrator error leading to extended downtime and data loss during disaster recovery
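
    For contrast with the continuous approach described above, the conventional batch alternative is a scheduled distcp copy between clusters; a minimal sketch, with hypothetical NameNode hosts and paths, follows.

    ```python
    # Sketch of the conventional batch DR alternative: a scheduled hadoop distcp
    # run between two clusters. NameNode hosts and paths are hypothetical.
    import subprocess

    SOURCE = "hdfs://prod-namenode:8020/data/events"
    TARGET = "hdfs://dr-namenode:8020/data/events"

    # -update copies only files that differ, but the result is still a
    # point-in-time snapshot: writes made after the run starts can be missed.
    subprocess.run(["hadoop", "distcp", "-update", SOURCE, TARGET], check=True)
    ```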
  • Cloud migration & hybrid cloud with no downtime and no disruption
    Paul Scott-Murphy, WANdisco VP Product Management Big Data/Cloud and James Curtis, 451 Research Senior Analyst Recorded: Apr 13 2017 46 mins
    If business-critical applications with continually changing data are really moving to the cloud, the typical lift-and-shift approach of copying your data onto an appliance and shipping it to the cloud vendor to load onto their storage days later isn't going to work. Nor will the one-way batch replication solutions that can't maintain consistency between on-premises and cloud storage. Join us as we discuss how to migrate to the cloud without production downtime and, post-migration, deploy a true hybrid cloud, elastic data center solution that turns the cloud into a real-time extension of your on-premises environment. These capabilities enable a host of use cases, including using the cloud for offsite disaster recovery with no downtime and no data loss.
  • Continuous Replication and Migration for Network File Systems
    Paul Scott-Murphy, WANdisco VP Product Management Big Data/Cloud Recorded: Apr 11 2017 43 mins
    Fusion® 2.10, the new major release from WANdisco, adds support for seamless data replication at petabyte scale from NetApp Network File System (NFS) devices to any mix of on-premises and cloud environments. NetApp devices are now able to continue processing normal operations while WANdisco Fusion® replicates data in phases with guaranteed consistency and no disruption to target environments, including those of cloud storage providers. This new capability supports hybrid cloud use cases such as on-demand burst-out processing for data analytics and offsite disaster recovery with no downtime and no data loss.
  • Building a truly hybrid cloud with Google Cloud
    James Malone, Google Cloud Dataproc Product Manager and Paul Scott-Murphy, WANdisco VP of Product Management Recorded: Mar 30 2017 50 mins
    Join James Malone, Google Cloud Dataproc Product Manager and Paul Scott-Murphy, WANdisco VP of Product Management, as they explain how to address the challenges of operating hybrid environments that span Google and on-premises services, showing how active data replication that guarantees consistency can work at scale. Register now to learn how to provide local speed of access to data across all environments, allowing hybrid solutions to leverage the power of Google Cloud.
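
    As a small example of the burst-out side of such a hybrid setup, the sketch below creates an on-demand Dataproc cluster with the gcloud CLI; the cluster name, region, and size are hypothetical, and this covers only cluster creation, not the data replication the speakers describe.

    ```python
    # Sketch: create a short-lived Dataproc cluster for burst-out processing.
    # Cluster name, region, and worker count are hypothetical placeholders.
    import subprocess

    subprocess.run(
        ["gcloud", "dataproc", "clusters", "create", "burst-analytics",
         "--region=us-central1", "--num-workers=4"],
        check=True,
    )
    ```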
  • ETL and big data: Building simpler data pipelines
    Paul Scott-Murphy Recorded: Feb 14 2017 61 mins
    In the traditional world of the enterprise data warehouse (EDW), ETL pipelines are a troublesome bottleneck when preparing data for use in the warehouse. They are notoriously expensive and brittle, so as companies move to Hadoop they look forward to getting rid of the ETL infrastructure.

    But is it that simple? Some companies are finding that in order to move data between clusters for backup or aggregation purposes, whether on-premises or to the cloud, they are building systems that look an awful lot like ETL.
  • Using the cloud for on-premises disaster recovery
    Paul Scott-Murphy, VP Product Management Recorded: Jan 26 2017 53 mins
    The cloud greatly extends disaster recovery options, yields significant cost savings by removing the need for DR hardware and support staff on-premises, and provides insurance against a total on-premises infrastructure failure. However, solutions available for cloud DR vary greatly, directly impacting the amount of downtime and data loss experienced after an outage. Join us as we review the solutions available and explain how the cloud can be used for on-premises system DR with virtually zero downtime and data loss.
  • Big data storage: Options and recommendations
    Jagane Sundar, WANdisco CTO Recorded: Jan 11 2017 41 mins
    Hadoop clusters are often built around commodity storage, but architects now have a wide selection of Big Data storage choices, including solid-state or spinning disk within clusters and enterprise storage accessed through compatibility layers and connectors.

    In this webinar, our CTO will review the storage options available to Hadoop architects and provide recommendations for each use case, including an active-active replication option that makes data available across multiple storage systems.
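
    As one concrete example of the solid-state versus spinning-disk choice mentioned above, standard HDFS storage policies can pin hot and cold paths to different media; the paths in the sketch below are hypothetical.

    ```python
    # Sketch: pin HDFS paths to different storage media using standard
    # HDFS storage policies. The paths are hypothetical placeholders.
    import subprocess

    subprocess.run(
        ["hdfs", "storagepolicies", "-setStoragePolicy",
         "-path", "/data/hot", "-policy", "ALL_SSD"],
        check=True,
    )
    subprocess.run(
        ["hdfs", "storagepolicies", "-setStoragePolicy",
         "-path", "/data/archive", "-policy", "COLD"],
        check=True,
    )
    ```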
