Data Virtualization

  • Data Virtualization: An Introduction (Packed Lunch Webinars)
    Paul Moxon, VP Data Architectures & Chief Evangelist, Denodo. Recorded: May 18 2017, 56 mins
    According to Gartner, “By 2018, organizations with data virtualization capabilities will spend 40% less on building and managing data integration processes for connecting distributed data assets.” This solidifies Data Virtualization as a critical piece of technology for any flexible and agile modern data architecture.

    This session will:

    * Introduce data virtualization and explain how it differs from traditional data integration approaches
    * Discuss key patterns and use cases of data virtualization
    * Set the scene for subsequent sessions in the Packed Lunch Webinar Series, which will take a deeper dive into various challenges solved by data virtualization

    Agenda:

    * Introduction & benefits of DV
    * Summary & Next Steps
    * Q&A
  • Big Data Fabric: A Recipe for Big Data Initiatives
    Paul Moxon, VP Data Architectures & Chief Evangelist, Denodo. Recorded: Mar 13 2017, 55 mins
    Big data fabric combines essential big data capabilities in a single platform to automate the many facets of data discovery, preparation, curation, orchestration, and integration across a multitude of data sources.

    Attend this session to learn how Big Data Fabric, enabled by data virtualization, constitutes a recipe for:

    * Enabling new actionable insights with minimal effort
    * Securing big data end-to-end
    * Addressing big data skillset scarcity
    * Providing easy access to data without having to decipher various data formats

    Agenda:
    * Big Data with Data Virtualization
    * Product Demonstration
    * Summary & Next Steps
    * Q&A
  • Under the Hood of Denodo’s Query Optimizer
    Pablo Álvarez, Principal Technical Account Manager, Denodo. Recorded: Mar 10 2017, 57 mins
    The optimizer is one of the most complex parts of any data engine; it must ensure that the execution engine performs at its best, so understanding how it works is crucial.

    This webinar takes a deep dive into the various performance optimization techniques employed by Denodo’s Dynamic Query Optimizer and illustrates these techniques via a demonstration.
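
    To make that concrete, below is a rough, generic illustration of the kind of rewrite a federation optimizer performs: instead of pulling every detail row out of a source and aggregating in the integration layer, the aggregation is pushed down to the source. It is a minimal Python/SQL sketch under assumed names (a 'sales' table with 'region' and 'amount' columns, and an illustrative connection string); it is not Denodo's VQL or its actual optimizer internals.

        # A minimal sketch of aggregation pushdown; all names are hypothetical.
        import pandas as pd
        from sqlalchemy import create_engine

        engine = create_engine("postgresql://user:pass@warehouse-host/sales_db")

        # Naive federated plan: fetch every detail row into the integration
        # layer, then aggregate locally -- network transfer grows with row count.
        detail = pd.read_sql("SELECT region, amount FROM sales", engine)
        naive_totals = detail.groupby("region", as_index=False)["amount"].sum()

        # Optimizer-style rewrite: push the GROUP BY down to the source so only
        # one pre-aggregated row per region crosses the network.
        pushed_totals = pd.read_sql(
            "SELECT region, SUM(amount) AS amount FROM sales GROUP BY region",
            engine,
        )

        # Both plans return the same totals; the second moves far less data.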

    View this session to learn how to get the maximum value from your data processing infrastructure:

    * Various components of the query optimization engine
    * Different stages leveraged within the optimizer and the techniques applied
    * When, how, and why to use each option available in the query optimizer
    Don’t miss the chance to discover Denodo’s Query Optimizer capabilities.
  • Powering Self Service Business Intelligence with Hadoop and Data Virtualization
    Mark Pritchard, Denodo, and Sean Roberts, Hortonworks. Recorded: Jan 17 2017, 64 mins
    Vizient needed a unified view of their accounting and financial data marts to enable business users to discover the information they need in a self-service manner and to provide excellent service to their members. Vizient selected the Hortonworks Data Platform and the Denodo Data Virtualization Platform to unify their distributed data sets in a data lake while providing an abstraction layer that gives end users easy, self-service access to information.

    During this webinar, you will learn:

    1) The role, use, and benefits of Hortonworks Data Platform in the Modern Data Architecture
    2) How Hadoop and data virtualization simplify data management and self-service data discovery
    3) What data virtualization is and how it can simplify big data projects
    4) Best practices for using Hadoop with data virtualization

    About Vizient
    Vizient, Inc. is the largest nationwide network of community-owned health care systems and their physicians in the US. Vizient™ combines the strengths of VHA, University HealthSystem Consortium (UHC), Novation and MedAssets SCM and Sg2, trusted leaders focused on solving health care's most pressing challenges. Vizient delivers brilliant resources and powerful data-driven insights to healthcare organizations.
  • Initiate Logical Data Lake Strategy for IoT
    Ana Yong, Director of Product Marketing at Hortonworks, and Lakshmi Randall, Director of Product Marketing at Denodo. Recorded: Jan 5 2017, 61 mins
    The next industrial revolution is on the horizon, driven by the application of big data, IoT and Cloud technologies.

    Join Hortonworks and Denodo for a discussion of the synergy between Hadoop Data Lake and the Internet of Things. This webinar will include a live demonstration illustrating the ease of leveraging your IoT data lake for Enterprise Analytics.

    Attend & Learn:

    * How to leverage Hadoop Data Lakes to support Internet of Things use cases
    * How to query Hadoop Data Lakes combined with any other structured, semi-structured and unstructured data sources using a single logical data lake
    * How to avoid Data Swamps via a lightweight data governance approach that helps enterprises maximize the value of their Data Lake
    * How to use a logical data lake/data warehouse to prevent a physical data lake from becoming a silo

    Don’t miss this opportunity to cut through the hype to understand the critical technologies that form the backbone for IoT solutions.
  • The Big BI Dilemma - Bimodal Logical Data Warehouse to the Rescue!
    Rick van der Lans, Independent Industry Analyst, and Lakshmi Randall, Head of Product Marketing, Denodo. Recorded: Dec 6 2016, 59 mins
    The classic unimodal data warehouse architecture has reached its limits: it primarily supports structured data and struggles with newer data types such as social, streaming, and IoT data. A new BI architecture, such as the “logical data warehouse”, is required to augment traditional, rigid unimodal data warehouse systems with a bimodal data warehouse architecture that supports requirements that are experimental, flexible, exploratory, and self-service oriented.

    Learn from logical data warehousing expert Rick van der Lans how you can implement an agile data strategy using a Bimodal Logical Data Warehouse architecture.
    In this webinar, you will learn:

    · Why unimodal data warehouse architectures are not suitable for newer data types
    · Why an agile data strategy is necessary to support a bimodal architecture
    · The concept of Bimodal Logical Data Warehouse architecture and why it is the future
    · How Data Virtualization enables the Bimodal Logical Data Warehouse
    · Customer case study depicting successful implementation of this architecture
  • Design Fast Data Architecture for Big Data with Logical Data Warehouse & Lakes
    Kurt Jackson, Platform Lead, Autodesk. Recorded: Aug 24 2016, 56 mins
    A Case Study presented by Kurt Jackson, Platform Lead, Autodesk

    Companies such as Autodesk are fast replacing the tried-and-true physical data warehouse with logical data warehouses/data lakes. Why? Because they can accomplish the same results in one-sixth of the time and with one-quarter of the resources.

    In this webinar, Autodesk’s Platform Lead, Kurt Jackson, will describe how they designed a modern fast data architecture as a single unified logical data warehouse/data lake using data virtualization and contemporary big data analytics technologies such as Spark.

    A logical data warehouse/data lake is a virtual abstraction layer over the physical data warehouse, big data repositories, cloud sources, and other enterprise applications. It unifies structured and unstructured data in real time to power analytical and operational use cases.
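
    As a rough illustration of that abstraction layer, the sketch below combines a table from a relational warehouse with records sitting in a data lake at query time, without an ETL job copying either side first. It is a minimal Python sketch with hypothetical connection details, table, file, and column names; an actual data virtualization platform would expose this as a virtual view and push processing down to each source rather than pulling rows into a client.

        # A minimal sketch of a "logical" combined view; all names are hypothetical.
        import pandas as pd
        from sqlalchemy import create_engine

        warehouse = create_engine("postgresql://user:pass@warehouse-host/analytics")

        # Structured side: dimension data from the physical data warehouse.
        customers = pd.read_sql("SELECT customer_id, segment FROM customers", warehouse)

        # Data lake side: usage events landed as Parquet (e.g. on HDFS or S3).
        events = pd.read_parquet("s3://data-lake/telemetry/events.parquet")

        # The unified view, resolved at query time instead of being materialized.
        combined = events.merge(customers, on="customer_id", how="left")
        usage_by_segment = combined.groupby("segment", as_index=False)["duration"].sum()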

    Attend and Learn:

    - Why logical data warehouses/data lakes are the bedrock of modern data architecture
    - How you can build a logical data warehouse using data virtualization
    - How to create a single, unified enterprise-wide access and governance point for any data used within the company

    Become a fast data strategy expert in one hour!
  • Democratizing Big Data Using Data Virtualization
    Noel Yuhanna, Forrester Research; Matthew Morgan, Hortonworks; Ravi Shankar, Denodo. Recorded: Feb 3 2016, 67 mins
    Whatever use case you select, you cannot deliver business value if you implement your big data project in a silo. Tying in your big data with your data warehouse, CRM, and ERP applications is paramount to democratizing enterprise data and empowering business users with holistic answers. Data virtualization leader Denodo will demonstrate how you can virtually integrate your big data with other enterprise data at a fraction of the time and cost of physical ETL.

    This compelling content is presented by experts with deep experience in analytics, big data, and data integration. Expect to walk away with actionable insights in 60 minutes:

    * The latest innovations in big data, and which ones should matter most to you
    * Three practical use cases for which you can use big data now
    * Examples of customers who are successfully combining Hadoop with enterprise data using data virtualization

    Featured Presenters:

    Forrester: Noel Yuhanna – Principal Analyst
    Hortonworks: Matthew Morgan – VP, Product and Alliance
    Denodo: Ravi Shankar – Chief Marketing Officer

    Agenda:

    * Forrester: How to keep up with the major innovations in big data – 20 min
    * Hortonworks: Top three big data use cases – 15 min
    * Denodo: Democratizing big data with data virtualization – 15 min
    * Q&A – 10 min
  • Business Agility Must Be Based on a New Flexible and Agile Data Approach
    Holger Kisker, Ph.D., Vice President, Research Director. Recorded: Oct 5 2015, 73 mins
    Join guest speaker Holger Kisker, Ph.D., as he discusses what companies need today: a flexible data management architecture that copes with both traditional and emerging sources of data (in any structure), advanced data analytics to extract deeper business insights, and efficient ways to deliver those insights as information or data services for better business decisions, all embedded in an efficient data virtualization layer that makes data available when, where, and in whatever format it is needed.
  • Build a Contextual Marketing Engine and Fuel It with Data
    Cory Munchbach, Analyst, Forrester Research. Recorded: Sep 17 2015, 55 mins
    In this webinar, join guest speaker and analyst at Forrester Research Inc., Cory Munchbach, as she presents the contextual marketing engine, how to build one, and the role of big data and data virtualization in making it go.
