
Data Virtualization

  • In Memory Parallel Processing for Big Data Scenarios
    Paul Moxon, VP Data Architecture and Chief Evangelist, Denodo | Recorded: Feb 15 2018 | 48 mins
    Denodo Platform offers one of the most sought-after data fabric capabilities through data discovery, preparation, curation and integration across the broadest range of data sources. As data volume and variety grow exponentially, Denodo Platform 7.0 will offer in-memory massive parallel processing (MPP) capability for the most advanced query optimization in the market.

    Attend this session to learn:
    * How Denodo Platform 7.0’s native, built-in integration with MPP systems will provide query acceleration and MPP caching (a simplified sketch of the offload idea follows the agenda below)
    * How to successfully approach highly complex big data scenarios by leveraging inexpensive MPP solutions
    * How data-driven insights can be generated in real time with the Denodo Platform once the MPP capability is in place

    Agenda:
    * Challenges with traditional architectures
    * Denodo Platform MPP capabilities and applications
    * Product demonstration
    * Q&A
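    The offload idea behind MPP query acceleration can be reduced to a data-movement comparison: either stream every source table to the virtualization layer, or ship the small table and the query to the engine that already holds the big one. Below is a minimal, hypothetical Python sketch of that trade-off, with made-up table names and toy cost numbers; it is not Denodo VQL and not the actual Denodo optimizer.

      # Toy cost-based routing sketch (hypothetical; not Denodo's optimizer or API):
      # given estimated row counts, decide whether to execute a join/aggregation in
      # the virtualization layer or push it down to an MPP engine that already holds
      # the large fact table, moving only the small dimension table across the wire.

      from dataclasses import dataclass

      @dataclass
      class TableStats:
          name: str
          rows: int          # estimated cardinality
          row_bytes: int     # average row size

      def bytes_moved_federated(fact: TableStats, dim: TableStats) -> int:
          # Plan A: stream both tables to the virtualization layer and join there.
          return fact.rows * fact.row_bytes + dim.rows * dim.row_bytes

      def bytes_moved_mpp_pushdown(fact: TableStats, dim: TableStats,
                                   result_rows: int, result_row_bytes: int) -> int:
          # Plan B: ship only the dimension table (and the query) to the MPP cluster
          # that already stores the fact data, then pull back just the result set.
          return dim.rows * dim.row_bytes + result_rows * result_row_bytes

      def choose_plan(fact: TableStats, dim: TableStats,
                      result_rows: int, result_row_bytes: int = 64) -> str:
          a = bytes_moved_federated(fact, dim)
          b = bytes_moved_mpp_pushdown(fact, dim, result_rows, result_row_bytes)
          return "federate in virtual layer" if a <= b else "push down to MPP engine"

      if __name__ == "__main__":
          sales = TableStats("sales", rows=2_000_000_000, row_bytes=120)   # big fact table in the data lake
          stores = TableStats("stores", rows=5_000, row_bytes=200)         # small dimension in an RDBMS
          # An aggregate such as "revenue per store" returns roughly one row per store.
          print(choose_plan(sales, stores, result_rows=stores.rows))       # -> push down to MPP engine

    On these assumed sizes, the pushdown plan moves a few hundred kilobytes instead of hundreds of gigabytes, which is the effect MPP caching and query acceleration aim for.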
  • GDPR Noncompliance: Avoid the Risk with Data Virtualization
    Mark Pritchard, Sales Engineer, Denodo, and Lakshmi Randall, Director of Product Marketing, Denodo | Recorded: Jan 31 2018 | 58 mins
    In its recent report “Predictions 2018: A year of reckoning”, Forrester predicts that 80% of firms affected by GDPR will not comply with the regulation by May 2018, and that 50% of them will do so intentionally.

    Compliance doesn’t have to be this difficult. What if you could facilitate GDPR compliance with a mature technology and a significant cost reduction? Data virtualization is a mature, cost-effective technology that facilitates GDPR compliance and enables privacy by design.

    Attend this session to learn:

    - How data virtualization provides a GDPR compliance foundation with a data catalog, auditing, and data security.
    - How you can enable a single, enterprise-wide data access layer with guardrails (sketched below).
    - Why data virtualization is a must-have capability for compliance use cases.
    - How Denodo’s customers have facilitated compliance.
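    To make the “single access layer with guardrails” idea concrete, here is a minimal, hypothetical Python sketch: one gateway function applies row-level filtering and column-level pseudonymization before results ever reach a consumer. The column names, roles, and rules are invented for illustration and do not represent Denodo’s security model or API.

      # Hypothetical "guardrails" sketch: every query passes through one gateway
      # that masks personal data and filters rows by role and scope before the
      # results are returned. Names and rules are made up for illustration.

      import hashlib

      PII_COLUMNS = {"email", "phone"}            # columns treated as personal data
      ROLE_CAN_SEE_PII = {"dpo", "support"}       # roles allowed to see raw PII

      def pseudonymize(value: str) -> str:
          """Stable one-way token so analysts can still join and count without seeing PII."""
          return hashlib.sha256(value.encode("utf-8")).hexdigest()[:12]

      def query_customers(rows, role, allowed_countries):
          out = []
          for row in rows:
              if row["country"] not in allowed_countries:      # row-level guardrail
                  continue
              projected = {}
              for col, val in row.items():
                  if col in PII_COLUMNS and role not in ROLE_CAN_SEE_PII:
                      projected[col] = pseudonymize(str(val))  # column-level guardrail
                  else:
                      projected[col] = val
              out.append(projected)
          return out

      if __name__ == "__main__":
          customers = [
              {"id": 1, "email": "ana@example.com", "phone": "+34 600 000 000", "country": "ES"},
              {"id": 2, "email": "bob@example.com", "phone": "+1 555 0100", "country": "US"},
          ]
          # An analyst restricted to EU data sees pseudonymized PII for EU rows only.
          print(query_customers(customers, role="analyst", allowed_countries={"ES"}))

    Because every consumer goes through the same layer, the masking and filtering rules live in one place, which is what makes auditing and privacy by design tractable.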
  • Data Virtualization - Enabling Next Generation Analytics
    Boris Evelson, Vice President and Principal Analyst, Forrester Research, and Lakshmi Randall, Director, Product Marketing, Denodo | Recorded: Jan 25 2018 | 64 mins
    The majority of enterprises today are data-aware. Being data-aware, or even data-driven, however, is not enough. Are your data-driven applications providing contextual and actionable insight? Are your analytics applications driving tangible business outcomes? Are you deriving insights from all of your enterprise data? Enter Systems of Insight (SOI), Forrester's latest analytical framework for insights-driven businesses. In this webinar you will learn about the key principles that differentiate data-aware or data-driven businesses from their insights-driven peers and competitors. Specifically, the webinar will explore the roles data virtualization (aka Data Fabric) plays in modern SOI architectures, such as:

    * A single virtual catalog / view on all enterprise data sources including data lakes
    * A more agile and flexible virtual enterprise data warehouse
    * A common semantic layer for business intelligence (BI) and analytical applications (aka BI Fabric)
  • An Introduction to Data Virtualization
    Paul Moxon, VP Data Architecture and Chief Evangelist, Denodo, and Edwin Robbins, Sales Engineer, Denodo | Recorded: Jan 18 2018 | 62 mins
    According to Gartner, "Through 2020, 50% of enterprises will implement some form of data virtualization as one enterprise production option for data integration." This prediction underscores how data virtualization has become an undeniable driving force for companies implementing an agile, real-time, and flexible enterprise data architecture.

    Attend this session to learn:

    - Fundamentals of data virtualization and how it differs from traditional data integration approaches (a minimal sketch of the federated approach follows the agenda below)
    - The key patterns and use cases of Data Virtualization
    - What to expect in subsequent sessions in the Packed Lunch Webinar Series, which will take a deeper dive into the challenges solved by data virtualization in big data analytics, cloud migration, and other scenarios

    Agenda:
    - Introduction & benefits of DV
    - Summary & Next Steps
    - Q&A
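    As a companion to the fundamentals above, here is a minimal Python sketch of the federated idea that distinguishes data virtualization from copy-based integration: two in-memory SQLite databases stand in for two separate source systems, and a combined result is assembled at query time instead of being loaded into a central store. The table names and the helper function are invented for illustration; this is plain Python, not Denodo VQL.

      # Two independent in-memory databases play the role of two source systems.
      import sqlite3

      # Source 1: a CRM-like system with customers.
      crm = sqlite3.connect(":memory:")
      crm.execute("CREATE TABLE customers (id INTEGER, name TEXT, country TEXT)")
      crm.executemany("INSERT INTO customers VALUES (?, ?, ?)",
                      [(1, "Acme", "US"), (2, "Globex", "DE")])

      # Source 2: an order system, physically separate from the CRM.
      orders = sqlite3.connect(":memory:")
      orders.execute("CREATE TABLE orders (customer_id INTEGER, amount REAL)")
      orders.executemany("INSERT INTO orders VALUES (?, ?)",
                         [(1, 100.0), (1, 250.0), (2, 80.0)])

      def revenue_by_customer():
          """A 'virtual view': fetch from each source on demand and join in the middle tier."""
          totals = {}
          for customer_id, amount in orders.execute("SELECT customer_id, amount FROM orders"):
              totals[customer_id] = totals.get(customer_id, 0.0) + amount
          result = []
          for cid, name, country in crm.execute("SELECT id, name, country FROM customers"):
              result.append({"customer": name, "country": country, "revenue": totals.get(cid, 0.0)})
          return result

      if __name__ == "__main__":
          # No ETL job ran and no warehouse copy exists; the answer reflects
          # the sources as they are at the moment of the query.
          print(revenue_by_customer())

    A traditional integration pipeline would instead schedule a job to copy both tables into a warehouse and answer the question from the copy, which is exactly the latency and duplication that the virtual approach avoids.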
  • Data Virtualization: An Introduction (Packed Lunch Webinars)
    Paul Moxon, VP Data Architectures & Chief Evangelist, Denodo | Recorded: May 18 2017 | 56 mins
    According to Gartner, “By 2018, organizations with data virtualization capabilities will spend 40% less on building and managing data integration processes for connecting distributed data assets.” This solidifies Data Virtualization as a critical piece of technology for any flexible and agile modern data architecture.

    This session will:

    Introduce data virtualization and explain how it differs from traditional data integration approaches
    Discuss key patterns and use cases of Data Virtualization
    Set the scene for subsequent sessions in the Packed Lunch Webinar Series, which will take a deeper dive into various challenges solved by data virtualization.
    Agenda:

    Introduction & benefits of DV
    Summary & Next Steps
    Q&A
  • Big Data Fabric: A Recipe for Big Data Initiatives
    Paul Moxon, VP Data Architectures & Chief Evangelist, Denodo | Recorded: Mar 13 2017 | 55 mins
    Big data fabric combines essential big data capabilities in a single platform to automate the many facets of data discovery, preparation, curation, orchestration, and integration across a multitude of data sources.

    Attend this session to learn how Big Data Fabric enabled by data virtualization constitutes a recipe for:

    * Enabling new actionable insights with minimal effort
    * Securing big data end-to-end
    * Addressing big data skillset scarcity
    * Providing easy access to data without having to decipher various data formats

    Agenda:
    * Big Data with Data Virtualization
    * Product Demonstration
    * Summary & Next Steps
    * Q&A
  • Under the Hood of Denodo’s Query Optimizer
    Pablo Álvarez, Principal Technical Account Manager, Denodo | Recorded: Mar 10 2017 | 57 mins
    The optimizer is one of the most complex parts of any data engine; it must ensure that the execution engine performs at its best, so understanding how it works is crucial.

    This webinar deep dives into the various performance optimization techniques employed by Denodo’s Dynamic Query Optimizer and illustrates these techniques via a demonstration.

    View this session to ensure that you achieve the maximum value from your data processing infrastructure. It covers:

    - The various components of the query optimization engine
    - The different stages within the optimizer and the techniques applied at each
    - When, how, and why to use each option available in the query optimizer

    Don’t miss the chance to discover the capabilities of Denodo’s Query Optimizer. A toy illustration of one classic technique, aggregation pushdown, follows this entry.
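    The sketch below shows, in plain Python, why pushing a partial aggregation below a join pays off: only one row per group has to leave the source instead of every detail row. The data, sizes, and function names are invented for illustration; this is a generic demonstration of the technique, not Denodo’s Dynamic Query Optimizer.

      # Toy comparison of a naive plan vs. aggregation pushdown.
      from collections import defaultdict
      import random

      random.seed(0)
      SALES = [{"store_id": random.randrange(100), "amount": random.random() * 50}
               for _ in range(100_000)]                      # large "remote" fact table
      STORES = {i: f"Store {i}" for i in range(100)}         # small dimension table

      def naive_plan():
          # Move all detail rows, join with the dimension, then aggregate in the virtual layer.
          rows_moved = len(SALES)
          totals = defaultdict(float)
          for row in SALES:
              totals[STORES[row["store_id"]]] += row["amount"]
          return rows_moved, dict(totals)

      def pushdown_plan():
          # Ask the source for "SELECT store_id, SUM(amount) ... GROUP BY store_id",
          # then join the tiny partial result with the dimension table.
          partial = defaultdict(float)
          for row in SALES:                                  # this loop happens inside the source
              partial[row["store_id"]] += row["amount"]
          rows_moved = len(partial)                          # only one row per store crosses the wire
          totals = {STORES[sid]: amt for sid, amt in partial.items()}
          return rows_moved, totals

      if __name__ == "__main__":
          moved_naive, totals_a = naive_plan()
          moved_push, totals_b = pushdown_plan()
          assert totals_a == totals_b                        # same answer either way
          print(f"rows moved: naive={moved_naive:,} pushdown={moved_push:,}")  # e.g. 100,000 vs 100

    The answer is identical under both plans; the only difference is how many rows travel between the source and the virtualization layer, which is usually where most of the elapsed time in a federated query goes.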
  • Powering Self Service Business Intelligence with Hadoop and Data Virtualization
    Mark Pritchard, Denodo, and Sean Roberts, Hortonworks | Recorded: Jan 17 2017 | 64 mins
    Vizient needed a unified view of their accounting and financial data marts so that business users could discover the information they need in a self-service manner and the company could provide excellent service to its members. Vizient selected the Hortonworks Big Data Platform and the Denodo Data Virtualization Platform to unify their distributed data sets in a data lake while providing an abstraction layer that gives end users easy, self-service access to information.

    During this webinar, you will learn:

    1) The role, use, and benefits of the Hortonworks Data Platform in the Modern Data Architecture.
    2) How Hadoop and data virtualization simplify data management and self-service data discovery.
    3) What data virtualization is, how it can simplify big data projects, and best practices for using Hadoop with data virtualization.

    About Vizient
    Vizient, Inc. is the largest nationwide network of community-owned health care systems and their physicians in the US. Vizient™ combines the strengths of VHA, University HealthSystem Consortium (UHC), Novation, and MedAssets SCM and Sg2, trusted leaders focused on solving health care's most pressing challenges. Vizient delivers brilliant resources and powerful data-driven insights to healthcare organizations.
  • Initiate Logical Data Lake Strategy for IoT
    Ana Yong, Director of Product Marketing at Hortonworks, and Lakshmi Randall, Director of Product Marketing at Denodo | Recorded: Jan 5 2017 | 61 mins
    The next industrial revolution is on the horizon, driven by the application of big data, IoT and Cloud technologies.

    Join Hortonworks and Denodo for a discussion of the synergy between Hadoop Data Lake and the Internet of Things. This webinar will include a live demonstration illustrating the ease of leveraging your IoT data lake for Enterprise Analytics.

    Attend & Learn:

    * How to leverage Hadoop Data Lakes to support Internet of Things use cases
    * How to query Hadoop Data Lakes combined with any other structured, semi-structured and unstructured data sources using a single logical data lake
    * How to avoid Data Swamps via a lightweight data governance approach that helps enterprises maximize the value of their Data Lake
    * How to use a logical data lake/data warehouse to prevent a physical data lake from becoming a silo

    Don’t miss this opportunity to cut through the hype to understand the critical technologies that form the backbone for IoT solutions.
  • The Big BI Dilemma - Bimodal Logical Data Warehouse to the Rescue!
    Rick van der Lans, Independent Industry Analyst, and Lakshmi Randall, Head of Product Marketing, Denodo | Recorded: Dec 6 2016 | 59 mins
    The classic unimodal data warehouse architecture has run its course: it primarily supports structured data and cannot accommodate newer data types such as social, streaming, and IoT data. A new BI architecture, such as the “logical data warehouse”, is required to augment traditional, rigid unimodal data warehouse systems with a bimodal data warehouse architecture that supports requirements that are experimental, flexible, explorative, and self-service oriented.

    Learn from logical data warehousing expert Rick van der Lans how you can implement an agile data strategy using a bimodal Logical Data Warehouse architecture.
    In this webinar, you will learn:

    · Why unimodal data warehouse architectures are not suitable for newer data types
    · Why an agile data strategy is necessary to support a bimodal architecture
    · The concept of Bimodal Logical Data Warehouse architecture and why it is the future
    · How Data Virtualization enables the Bimodal Logical Data Warehouse
    · Customer case study depicting successful implementation of this architecture
