Design Fast Data Architecture for Big Data with Logical Data Warehouse & Lakes

A Case Study presented by Kurt Jackson, Platform Lead, Autodesk

Companies such as Autodesk are fast replacing the once tried-and-true physical data warehouse with logical data warehouses/data lakes. Why? Because they can accomplish the same results in 1/6th of the time and with 1/4th of the resources.

In this webinar, Autodesk’s Platform Lead, Kurt Jackson, will describe how Autodesk designed a modern fast data architecture as a single, unified logical data warehouse/data lake using data virtualization and contemporary big data analytics technologies such as Spark.

A logical data warehouse/data lake is a virtual abstraction layer over the physical data warehouse, big data repositories, cloud sources, and other enterprise applications. It unifies both structured and unstructured data in real time to power analytical and operational use cases.
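
To make this abstraction concrete from a consumer's point of view, here is a minimal sketch, assuming a generic ODBC connection to the virtual layer: the client issues one query against a single unified view and lets the virtualization layer federate the warehouse, lake, and application sources behind it. The DSN (fast_data_dsn), the view (sales_unified), and the column names are hypothetical, and pyodbc merely stands in for whatever client driver is actually used; this illustrates the access pattern, not Autodesk's or Denodo's implementation.

```python
# Minimal sketch: the consumer queries ONE virtual view; the logical data
# warehouse / data lake layer federates the physical sources behind it.
# DSN, view, and column names are hypothetical.
from datetime import date, timedelta

import pyodbc


def top_products_last_quarter(limit: int = 10):
    start_date = date.today() - timedelta(days=90)
    # A single connection to the virtual access layer, regardless of how many
    # physical systems (warehouse, lake, SaaS applications) sit behind it.
    conn = pyodbc.connect("DSN=fast_data_dsn", autocommit=True)
    try:
        cursor = conn.cursor()
        # One query against a unified view; joins across physical sources are
        # resolved by the virtualization layer, not by the client.
        cursor.execute(
            """
            SELECT product_id, SUM(revenue) AS total_revenue
            FROM sales_unified
            WHERE order_date >= ?
            GROUP BY product_id
            ORDER BY total_revenue DESC
            """,
            start_date,
        )
        return cursor.fetchmany(limit)
    finally:
        conn.close()


if __name__ == "__main__":
    for product_id, total_revenue in top_products_last_quarter():
        print(product_id, total_revenue)
```

The point of the pattern is that the client never names a physical system: swapping a warehouse table for a lake file behind sales_unified requires no change on the consuming side.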

Attend and Learn:

-Why logical data warehouses/data lakes are the bedrock of modern data architecture
-How you can build a logical data warehouse using data virtualization
-How to create a single, unified enterprise-wide access and governance point for any data used within the company

Become a fast data strategy expert in one hour!
Recorded Aug 24 2016 56 mins

  • Data Virtualization - An Introduction (packed lunch July 2018) Jul 19 2018 6:00 pm UTC 60 mins
    Paul Moxon, VP Data Architecture and Chief Evangelist, Denodo
    Data virtualization started out as the most agile, real-time enterprise data fabric. It is now proving to go beyond that initial promise and is becoming one of the most important enterprise big data fabrics.

    Attend this session to learn:

    * What data virtualization really is
    * How it differs from other enterprise data integration technologies
    * Why data virtualization is finding enterprise-wide deployment inside some of the largest organizations
  • Are you killing the benefits of your data lake? (North America) Jun 27 2018 6:00 pm UTC 45 mins
    Rick van der Lans, Independent Business Intelligence Analyst and Lakshmi Randall, Director of Product Marketing, Denodo
    Data lakes are centralized data repositories. Data needed by data scientists is physically copied into a data lake, which serves as a single storage environment. This way, data scientists can access all the data from one entry point – a one-stop shop for the right data. However, such an approach is not always feasible for all the data, and it limits the lake’s use to data scientists alone, making it a single-purpose system.
    So, what’s the solution?

    A multi-purpose data lake allows a broader and deeper use of the data lake without minimizing the potential value for data science and without making it an inflexible environment.

    Attend this session to learn:

    • Disadvantages and limitations that are weakening or even killing the potential benefits of a data lake.
    • Why a multi-purpose data lake is essential in building a universal data delivery system.
    • How to build a logical multi-purpose data lake using data virtualization (a minimal sketch follows this listing).

    Do not miss this opportunity to make your data lake project successful and beneficial.
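
    To make the contrast above concrete, here is a minimal, generic sketch of the logical, multi-purpose approach: a thin Python access layer exposes one entry point over data that stays where it lives (raw files in the lake, curated tables in the warehouse) instead of physically copying everything into a single store. The paths, dataset names, and the pandas/sqlite3 choices are illustrative assumptions, not a description of any vendor's product.

    ```python
    # Illustrative sketch of a "logical" multi-purpose data lake: one access
    # function over data that stays in place (lake files + warehouse tables),
    # instead of physically copying everything into a single repository.
    # Paths, dataset names, and the pandas/sqlite3 choices are stand-ins.
    import sqlite3

    import pandas as pd

    LAKE_FILES = {
        "clickstream": "/lake/raw/clickstream.parquet",  # hypothetical lake path
    }
    WAREHOUSE_DB = "/warehouse/analytics.db"             # hypothetical warehouse
    WAREHOUSE_TABLES = {"orders", "customers"}


    def read_dataset(name: str) -> pd.DataFrame:
        """Single entry point for self-service BI users and data scientists alike."""
        if name in LAKE_FILES:
            # Raw / semi-structured data is read directly from the lake.
            return pd.read_parquet(LAKE_FILES[name])
        if name in WAREHOUSE_TABLES:
            # Curated, structured data is read from the warehouse in place.
            conn = sqlite3.connect(WAREHOUSE_DB)
            try:
                return pd.read_sql_query(f"SELECT * FROM {name}", conn)
            finally:
                conn.close()
        raise KeyError(f"Unknown dataset: {name}")


    # Both audiences use the same interface:
    # orders = read_dataset("orders")        # self-service BI user
    # clicks = read_dataset("clickstream")   # data scientist
    ```

    Because consumers only ever see read_dataset, new sources can be added behind the interface without turning the lake into a single-purpose, physically consolidated silo.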
  • Self-Service Analytics with Guard Rails Recorded: Jun 21 2018 61 mins
    Saptarshi Sengupta and Ed Robbins
    Self-Service BI promises to remove the bottleneck that exists between IT and business users. The truth is, if data is handed over to a wide range of data consumers without proper guardrails in place, it can result in data anarchy.

    Attend this session to learn why data virtualization:

    * Is a must for implementing the right self-service BI
    * Makes self-service BI useful for every business user
    * Accelerates any self-service BI initiative
  • An Introduction to Data Virtualization Recorded: May 22 2018 59 mins
    Ravi Shankar, Chief Marketing Officer, Denodo and Danny Thien, Senior Data Architect, Denodo
    According to Gartner, “Through 2020, 50% of enterprises will implement some form of data virtualization as one enterprise production option for data integration.” That underscores how data virtualization has become an undeniable driving force for companies implementing an agile, real-time, and flexible enterprise data architecture.

    Attend this session to learn:

    -Fundamentals of data virtualization and how it differs from traditional data integration approaches
    -The key patterns and use cases of Data Virtualization
    -What to expect in subsequent sessions in the Packed Lunch Webinar Series, which will take a deeper dive into various challenges solved by data virtualization in big data analytics, cloud migration and various other scenarios

    Agenda:
    -Introduction & benefits of DV
    -Summary & Next Steps
    -Q&A
  • Big Data Fabric: A Necessity For Any Successful Big Data Initiative Recorded: May 17 2018 44 mins
    Paul Moxon, VP Data Architecture and Chief Evangelist, Denodo and Naren, Sales Engineer
    While big data initiatives have become necessary for any business to generate actionable insights, a big data fabric has become a necessity for any successful big data initiative. A best-of-breed big data fabric should deliver actionable insights to business users with minimal effort, provide end-to-end security across the entire enterprise data platform, and provide real-time data integration, all while delivering a self-service data platform to business users. Attend this session to learn how a big data fabric enabled by data virtualization:

    * Provides lightning fast self-service data access to business users
    * Centralizes data security, governance and data privacy
    * Fulfills the promise of data lakes to provide actionable insights
  • A Successful Journey to the Cloud Recorded: Apr 19 2018 60 mins
    Paul Moxon, VP Data Architecture and Chief Evangelist, Denodo
    By 2020, a corporate "no-cloud" policy will be as rare as a "no-internet" policy is today, according to Gartner, Inc. While the cloud makes enterprises more flexible and agile, various cloud adoption scenarios such as hybrid cloud, infrastructure modernization, and cloud-based analytics present challenges of their own.

    Attend this session to learn:

    • Challenges involved with various cloud adoption scenarios
    • How data virtualization solves cloud adoption challenges while centralizing data governance and security mechanisms
    • How companies are using data virtualization to tackle complex modern customer segmentation problems such as “customer genomics”

    Agenda:

    • Challenges of cloud adoption
    • Data virtualization to the rescue
    • Product demonstration
    • Q&A
  • Building a multi-purpose Data Lake for Increased Business Agility Recorded: Mar 27 2018 39 mins
    Alba Fernández-Arias, Sales Engineering at Denodo.
    The data contained in a data lake is too valuable to restrict its use to just data scientists. The investment in a data lake becomes more worthwhile if the target audience can be enlarged without hindering the original users. However, this is not the case today: most data lakes are single-purpose. Also, the physical nature of data lakes has potential disadvantages and limitations that weaken the benefits and can even kill a data lake project entirely.

    A multi-purpose data lake allows a broader and deeper use of the data lake investment without minimizing the potential value for data science and without making it a less flexible environment. Multi-purpose data lakes are data delivery environments architected to support a broad range of users, from traditional self-service BI users to sophisticated data scientists.

    Attend this session to learn:

    * The challenges of a physical data lake
    * How to create an architecture that makes a physical data lake more flexible
    * How to drive the adoption of the data lake by a larger audience
  • Self-Service Information Consumption Using Data Catalog Recorded: Mar 15 2018 54 mins
    Paul Moxon, VP Data Architecture and Chief Evangelist, Denodo and Phoebe Bakanas, Sales Engineer, Denodo
    Market research shows that around 70% of self-service initiatives fare “average” or below. Denodo 7.0’s information self-service tool will offer data analysts, business users, and app developers the ability to search and browse data and metadata in a business-friendly manner for self-service exploration and analytics.

    Attend this session to learn:

    • How business users will be able to use the Denodo Platform’s integrated, Google-like search across both content and catalog
    • How business users can refine queries without SQL knowledge using the web-based query UI
    • How tags and business categorization help standardize business/canonical views while decoupling development artifacts from business users

    Agenda:

    • The role of information self-service tool
    • Product demonstration
    • Summary & Next Steps
    • Q&A
  • Data Virtualization - Enabling Next Generation Analytics (EMEA) Recorded: Feb 27 2018 61 mins
    Boris Evelson, Vice President & Principal Analyst, Forrester Research | Lakshmi Randall, Director, Product Marketing, Denodo
    Although most of today’s enterprises are data-aware, this may not be sufficient to drive tangible business outcomes.

    - Are your data-driven applications providing contextual and actionable insight?
    - Are you deriving insights from all the enterprise data?

    Embrace Forrester’s latest analytical framework for insights-driven businesses: Systems of Insight (SOI).

    Join this session to discover the key principles that differentiate data-aware or data-driven businesses from their insights-driven peers and competitors. The session will explore the roles that data virtualization (aka Data Fabric) plays in modern SOI architectures, such as:

    - A single virtual catalog / view on all enterprise data sources including data lakes.
    - A more agile and flexible virtual enterprise data warehouse.
    - A common semantic layer for business intelligence (BI) and analytical applications (aka BI Fabric).
  • In Memory Parallel Processing for Big Data Scenarios Recorded: Feb 15 2018 48 mins
    Paul Moxon, VP Data Architecture and Chief Evangelist, Denodo
    The Denodo Platform offers one of the most sought-after data fabric capabilities through data discovery, preparation, curation, and integration across the broadest range of data sources. As data volume and variety grow exponentially, Denodo Platform 7.0 will offer an in-memory massively parallel processing (MPP) capability for the most advanced query optimization in the market (a generic illustration of MPP offload follows this listing).

    Attend this session to learn:
    * How Denodo Platform 7.0’s native built-in integration with MPP systems will provide query acceleration and MPP caching
    * How to successfully approach highly complex big data scenarios, leveraging inexpensive MPP solutions
    * How, with the MPP capability in place, data-driven insights can be generated in real time with the Denodo Platform

    Agenda:
    * Challenges with traditional architectures
    * Denodo Platform MPP capabilities and applications
    * Product demonstration
    * Q&A
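
    As a generic illustration of the MPP offload idea mentioned above (not Denodo's actual MPP integration; the paths and column names are hypothetical), the PySpark sketch below pushes a large join and aggregation down to a parallel engine so that only the small aggregated result travels back to the caller.

    ```python
    # Generic MPP-offload sketch: the heavy join/aggregation runs in a parallel
    # engine (Spark here), and only the small aggregate is returned to the caller.
    # Paths and column names are hypothetical.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("mpp-offload-sketch").getOrCreate()

    # Large fact table in the lake, plus a dimension table.
    sales = spark.read.parquet("/lake/sales/")          # potentially billions of rows
    customers = spark.read.parquet("/lake/customers/")  # dimension data

    # The join and group-by execute in parallel across the cluster; only the
    # per-region totals (a handful of rows) are collected by the caller.
    revenue_by_region = (
        sales.join(customers, on="customer_id", how="inner")
             .groupBy("region")
             .agg(F.sum("revenue").alias("total_revenue"))
    )

    for row in revenue_by_region.collect():
        print(row["region"], row["total_revenue"])

    spark.stop()
    ```

    The design point is data locality: the expensive work happens where the parallel engine and the data live, and the consuming application only ever sees the small result set.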
  • GDPR Noncompliance: Avoid the Risk with Data Virtualization Recorded: Jan 31 2018 58 mins
    Mark Pritchard, Sales Engineer, Denodo. Lakshmi Randall, Director of Product Marketing, Denodo
    In its recent report “Predictions 2018: A year of reckoning”, Forrester predicts that 80% of firms affected by GDPR will not comply with the regulation by May 2018, and that 50% of them will be noncompliant intentionally.

    Compliance doesn’t have to be this difficult! What if you had an opportunity to facilitate GDPR compliance with a mature technology and a significant cost reduction? Data virtualization is a mature, cost-effective technology that facilitates GDPR compliance by enabling privacy by design.

    Attend this session to learn:

    - How data virtualization provides a GDPR compliance foundation with data catalog, auditing, and data security.
    - How you can enable single enterprise-wide data access layer with guardrails.
    - Why data virtualization is a must-have capability for compliance use cases.
    - How Denodo’s customers have facilitated compliance.
  • Data Virtualization - Enabling Next Generation Analytics Recorded: Jan 25 2018 64 mins
    Boris Evelson, Vice President and Principal Analyst, Forrester Research and Lakshmi Randall, Director, Product Marketing
    The majority of enterprises today are data-aware. Being data-aware, or even data-driven, however, is not enough. Are your data-driven applications providing contextual and actionable insight? Are your analytics applications driving tangible business outcomes? Are you deriving insights from all the enterprise data? Enter Systems of Insight (SOI), Forrester's latest analytical framework for insights-driven businesses. In this webinar you will learn about the key principles that differentiate data-aware or data-driven businesses from their insights-driven peers and competitors. Specifically, the webinar will explore the roles data virtualization (aka Data Fabric) plays in modern SOI architectures, such as:

    * A single virtual catalog / view on all enterprise data sources including data lakes
    * A more agile and flexible virtual enterprise data warehouse
    * A common semantic layer for business intelligence (BI) and analytical applications (aka BI Fabric)
  • An Introduction to Data Virtualization Recorded: Jan 18 2018 62 mins
    Paul Moxon, VP Data Architecture and Chief Evangelist, Denodo and Edwin Robbins, Sales Engineer, Denodo
    According to Gartner, “Through 2020, 50% of enterprises will implement some form of data virtualization as one enterprise production option for data integration.” That underscores how data virtualization has become an undeniable driving force for companies implementing an agile, real-time, and flexible enterprise data architecture.

    Attend this session to learn:

    -Fundamentals of data virtualization and how it differs from traditional data integration approaches
    -The key patterns and use cases of Data Virtualization
    -What to expect in subsequent sessions in the Packed Lunch Webinar Series, which will take a deeper dive into various challenges solved by data virtualization in big data analytics, cloud migration and various other scenarios

    Agenda:
    -Introduction & benefits of DV
    -Summary & Next Steps
    -Q&A
  • Data Virtualization: An Introduction (Packed Lunch Webinars) Recorded: May 18 2017 56 mins
    Paul Moxon, VP Data Architectures & Chief Evangelist, Denodo
    According to Gartner, “By 2018, organizations with data virtualization capabilities will spend 40% less on building and managing data integration processes for connecting distributed data assets.” This solidifies Data Virtualization as a critical piece of technology for any flexible and agile modern data architecture.

    This session will:

    -Introduce data virtualization and explain how it differs from traditional data integration approaches
    -Discuss key patterns and use cases of Data Virtualization
    -Set the scene for subsequent sessions in the Packed Lunch Webinar Series, which will take a deeper dive into various challenges solved by data virtualization.

    Agenda:

    -Introduction & benefits of DV
    -Summary & Next Steps
    -Q&A
  • Big Data Fabric: A Recipe for Big Data Initiatives Recorded: Mar 13 2017 55 mins
    Paul Moxon, VP Data Architectures & Chief Evangelist, Denodo
    Big data fabric combines essential big data capabilities in a single platform to automate the many facets of data discovery, preparation, curation, orchestration, and integration across a multitude of data sources.

    Attend this session to learn how Big Data Fabric enabled by data virtualization constitutes a recipe for:

    * Enabling new actionable insights with minimal effort
    * Securing big data end-to-end
    * Addressing big data skillset scarcity
    * Providing easy access to data without having to decipher various data formats

    Agenda:
    * Big Data with Data Virtualization
    * Product Demonstration
    * Summary & Next Steps
    * Q&A
  • Under the Hood of Denodo’s Query Optimizer Recorded: Mar 10 2017 57 mins
    Pablo Álvarez, Principal Technical Account Manager, Denodo
    The optimizer is one of the most complex parts of any data engine; it must ensure that the execution engine performs at its best, so understanding how it works is crucial.

    This webinar deep dives into the various performance optimization techniques employed by Denodo’s Dynamic Query Optimizer and illustrates these techniques via a demonstration.

    View this session to ensure that you achieve the maximum value from your data processing infrastructure:

    * The various components of the query optimization engine
    * The different stages within the optimizer and the techniques applied at each
    * When, how, and why to use each option available in the query optimizer

    Don’t miss the chance to discover Denodo’s Query Optimizer capabilities.
  • Powering Self Service Business Intelligence with Hadoop and Data Virtualization Recorded: Jan 17 2017 64 mins
    Mark Pritchard, Denodo and Sean Roberts, Hortonworks
    Vizient needed a unified view of their accounting and financial data marts to enable business users to discover the information they need in a self-service manner and to provide excellent service to their members. Vizient selected the Hortonworks Big Data Platform and the Denodo Data Virtualization Platform so that they could unify their distributed data sets in a data lake while providing an abstraction layer that gives end users easy, self-service access to information.

    During this webinar, you will learn:

    1) The role, use, and benefits of Hortonworks Data Platform in the Modern Data Architecture.
    2) How Hadoop and data virtualisation simplify data management and self-service data discovery.
    3) What data virtualisation is, how it can simplify big data projects, and best practices for using Hadoop with data virtualisation.

    About Vizient
    Vizient, Inc. is the largest nationwide network of community-owned health care systems and their physicians in the US. Vizient™ combines the strengths of VHA, University HealthSystem Consortium (UHC), Novation, and MedAssets SCM and Sg2, trusted leaders focused on solving health care's most pressing challenges. Vizient delivers brilliant resources and powerful data-driven insights to healthcare organizations.
  • Initiate Logical Data Lake Strategy for IoT Recorded: Jan 5 2017 61 mins
    Ana Yong, Director of Product Marketing at Hortonworks, and Lakshmi Randall, Director of Product Marketing at Denodo
    The next industrial revolution is on the horizon, driven by the application of big data, IoT and Cloud technologies.

    Join Hortonworks and Denodo for a discussion of the synergy between Hadoop Data Lake and the Internet of Things. This webinar will include a live demonstration illustrating the ease of leveraging your IoT data lake for Enterprise Analytics.

    Attend & Learn:

    * How to leverage Hadoop Data Lakes to support Internet of Things use cases
    * How to query Hadoop Data Lakes combined with any other structured, semi-structured and unstructured data sources using a single logical data lake
    * How to avoid Data Swamps via a light weight data governance approach that helps enterprises maximize the value of their Data Lake
    * How to use a logical data lake/data warehouse to prevent a physical data lake from becoming a silo

    Don’t miss this opportunity to cut through the hype to understand the critical technologies that form the backbone for IoT solutions.
  • The Big BI Dilemma - Bimodal Logical Data Warehouse to the Rescue! Recorded: Dec 6 2016 59 mins
    Rick van der Lans, Independent Industry Analyst, and Lakshmi Randall, Head of Product Marketing, Denodo
    The classic unimodal data warehouse architecture has expired because it primarily supports structured data, not newer data types such as social, streaming, and IoT data. A new BI architecture, such as the “logical data warehouse”, is required to augment traditional, rigid unimodal data warehouse systems with a bimodal data warehouse architecture that supports requirements that are experimental, flexible, explorative, and self-service oriented.

    Learn from the Logical Data Warehousing expert, Rick van der Lans, about how you can implement an agile data strategy using a bimodal Logical Data Warehouse architecture.
    In this webinar, you will learn:

    · Why unimodal data warehouse architectures are not suitable for newer data types
    · Why an agile data strategy is necessary to support a bimodal architecture
    · The concept of Bimodal Logical Data Warehouse architecture and why it is the future
    · How Data Virtualization enables the Bimodal Logical Data Warehouse
    · Customer case study depicting successful implementation of this architecture
Achieving Business Agility with Data Virtualization
For IT professionals who are focused on data integration and enterprise data management and are overwhelmed by the growing volume and variety of data, data virtualization provides real-time integration with the agility to access and integrate disparate sources with ease. For business professionals, data virtualization brings agile information access that in turn drives business agility. The webcasts in this channel, provided by Denodo, the leader in data virtualization, cover the latest common usage patterns, use cases, best practices, and strategies for driving business value with data virtualization.
