
Managing Big Data in the Bedrock Data Lake Management Platform

In this webinar, Adam Diaz, Director of Field Engineering at Zaloni, will demonstrate how the Bedrock platform can run as a data federation layer that makes it easy to organize and manage data, regardless of volume, across your enterprise data architecture.

Data federation describes any approach to data management that allows an application to retrieve and manipulate data because it understands the associated metadata. To do this and obtain business value, an enterprise-grade data lake management and governance platform like Bedrock is required. Bedrock workflows enable the use of data across multiple clusters, databases, and data streams.
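
Bedrock's own interfaces are not spelled out in this description, but the metadata-driven idea behind data federation can be sketched generically: a catalog maps a logical dataset name to the system, location, and format where the data physically lives, and an application resolves datasets through the catalog instead of hard-coding connections. The sketch below is a minimal, hypothetical Python illustration; the names (Catalog, DatasetEntry, resolve) are invented for this example and are not Bedrock APIs.

```python
# Minimal, hypothetical sketch of metadata-driven data access (not Bedrock's
# API): a catalog records where each logical dataset physically lives, and
# applications ask the catalog instead of hard-coding connections.
from dataclasses import dataclass

@dataclass
class DatasetEntry:
    name: str      # logical dataset name
    system: str    # e.g. "hdfs", "s3", "jdbc", "kafka"
    location: str  # path, table, or topic within that system
    format: str    # e.g. "parquet", "csv", "avro"
    owner: str     # stewardship/business metadata

class Catalog:
    """Toy metadata catalog mapping logical names to physical details."""
    def __init__(self):
        self._entries = {}

    def register(self, entry):
        self._entries[entry.name] = entry

    def resolve(self, name):
        return self._entries[name]

catalog = Catalog()
catalog.register(DatasetEntry(
    name="sales.orders",
    system="s3",
    location="s3://lake/raw/sales/orders/",
    format="parquet",
    owner="sales-ops",
))
# The application only knows the logical name; the catalog supplies the
# system, location, and format at run time.
entry = catalog.resolve("sales.orders")
print(f"read {entry.format} from {entry.system}: {entry.location}")
```

In a real federation layer, the resolve step would also choose the engine or connection appropriate to the target system, which is how a single workflow can span clusters, databases, and streams.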

In this context, topics covered will include:

- Classic Enterprise Architectures and Data Siloing

- Data Lake 360 Solution: a Holistic Approach to Modern Data Management

- Industry Trends - Movement to the Cloud and Hybrid Environments

- Metadata Management - and Handling Non-Persistent Environments

- Introduction to Bedrock

- Demonstration of Bedrock - How do I manage multiple systems?
Recorded Jan 18 2017 43 mins
Presented by
Adam Diaz, Director of Field Engineering
  • Channel profile
  • Everyone is a Stakeholder in a Data-Driven Enterprise Mar 2 2017 6:00 pm UTC 60 mins
    Dave Wells & Kelly Schupp
    Almost everyone is concerned with the tooling to manage the big data lifecycle. From business people engaged with self-service analytics, to data scientists, data analysts, and data professionals from BI and IT organizations, it seems that nearly everyone is both a consumer and a provider of data.

    Big data management software spans the data lifecycle supporting data profiling, transformation, enrichment, cleansing, matching and other functions. It is the glue that binds a big data environment together, fostering continuous alignment of data with dynamic and changing business needs.
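
    As a rough illustration of two of the lifecycle functions named above, the sketch below shows a toy profiling step (null and distinct-value counts per field) and a toy cleansing and matching step (normalization plus de-duplication) in plain Python. It is illustrative only; the record fields and helper names are invented for the example.

    ```python
    # Illustrative only: two lifecycle steps, profiling (summarize what is in
    # a dataset) and cleansing/matching (standardize fields and drop duplicate
    # records), written in plain Python with invented example records.
    records = [
        {"customer_id": "001", "email": "a@example.com ", "age": "34"},
        {"customer_id": "002", "email": "B@EXAMPLE.COM", "age": ""},
        {"customer_id": "002", "email": "b@example.com", "age": "41"},  # duplicate id
    ]

    def profile(rows):
        """Per-field null counts and distinct-value counts."""
        return {
            field: {
                "nulls": sum(1 for r in rows if not r[field].strip()),
                "distinct": len({r[field].strip().lower() for r in rows}),
            }
            for field in rows[0]
        }

    def cleanse(rows):
        """Normalize emails, coerce ages, and de-duplicate on customer_id."""
        seen, cleaned = set(), []
        for r in rows:
            if r["customer_id"] in seen:
                continue  # matching/de-duplication step
            seen.add(r["customer_id"])
            cleaned.append({
                "customer_id": r["customer_id"],
                "email": r["email"].strip().lower(),
                "age": int(r["age"]) if r["age"].strip() else None,
            })
        return cleaned

    print(profile(records))
    print(cleanse(records))
    ```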

    During this webinar, Dave Wells, Research Analyst at Eckerson Group, and Kelly Schupp, VP of Data-driven Marketing at Zaloni, will discuss these tools and how to use them for high-impact analytics, drawing on research from Dave's recent industry report, "Big Data Management Software for the Data-Driven Enterprise". Topics addressed:

    - The kinds of tools that are needed to meet the challenges of big data
    - The purpose, functions, and characteristics of data preparation tools
    - The purpose, functions, and characteristics of pipeline management tools
    - The purpose, functions, and characteristics of data cataloging tools
    - The role of big data management tools for high-impact analytics

    Speaker Bios:

    Dave Wells is an advisory consultant, educator, and industry analyst at Eckerson Group. He is dedicated to building meaningful connections throughout the path from data to business value. He works at the intersection of information management and business management, driving business impact through analytics, business intelligence, and active data management.

    Kelly Schupp is Vice President of Marketing for Zaloni. Kelly has 20 years of experience in the enterprise software and technology industry. She has held a variety of global marketing leadership roles, and previously worked at IBM, Micromuse and Porter Novelli.
  • Techniques to Establish Your Data Lake: How to Achieve Data Quality and Security Recorded: Feb 16 2017 63 mins
    Ben Sharma, CEO and Co-Founder, Zaloni
    The growing volume and variety of data makes it imperative for organizations to manage and govern their data in a way that's scalable and cost-effective. The data lake – once considered just a relatively inexpensive storage solution – can now be a tool for deriving true business value. By implementing a set of best practices for establishing and managing your data lake, you can achieve 360-degree control and visibility of your data.

    In this webcast, Ben Sharma, Zaloni's co-founder and CEO, discusses techniques for balancing the flexibility a data lake can provide with the requirements for privacy and security that are critical for enterprise data.

    Topics covered include:

    - How to establish a managed data ingestion process - that includes metadata management - in order to create a solid foundation for your data lake

    - Techniques for establishing data lineage and provenance (a brief sketch follows this topic list)

    - Tips for achieving data quality

    - Key considerations for data privacy and security

    - Unique stages along the data lake lifecycle and management concepts for each stage

    - Why a data catalog is important

    - Considerations for self-service data preparation
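
    As a rough, hypothetical illustration of the managed-ingestion and lineage topics above (not Bedrock's actual API), the sketch below lands a file and appends a provenance record for it, capturing source, target, transform, checksum, and timestamp so the data's origin can be traced later. All names and the example file are invented for the sketch.

    ```python
    # Hypothetical illustration of recording lineage during managed ingestion
    # (not Bedrock's API): every landed file gets a provenance record with its
    # source, target, transform, checksum, and timestamp.
    import hashlib
    import json
    import time

    def ingest(source_uri, raw_bytes, target_path, transform, lineage_log):
        """Land the data and append a provenance record describing it."""
        checksum = hashlib.sha256(raw_bytes).hexdigest()
        with open(target_path, "wb") as f:
            f.write(raw_bytes)
        lineage_log.append({
            "source": source_uri,
            "target": target_path,
            "transform": transform,
            "sha256": checksum,
            "ingested_at": time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime()),
        })

    log = []
    ingest("sftp://vendor/orders_2017-01-18.csv",
           b"order_id,amount\n1,9.99\n",
           "orders_landed.csv",
           transform="none (raw landing zone)",
           lineage_log=log)
    print(json.dumps(log, indent=2))
    ```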

    About the speaker:

    Ben Sharma is CEO and co-founder of Zaloni. He is a passionate technologist and thought leader in big data, analytics and enterprise infrastructure solutions. Having previously worked in technology leadership at NetApp, Fujitsu and others, Ben's expertise ranges from business development to production deployment in a wide array of technologies including Hadoop, HBase, databases, virtualization and storage. Ben is co-author of Architecting Data Lakes and Java in Telecommunications. He holds two patents.
  • Data Warehouse Augmentation: Cut Costs, Increase Power Recorded: Jan 26 2017 46 mins
    Pradeep Varadan and Scott Gidley
    The data warehouse (DW) is one of the most effective tools for complex data analytics. But the downside is… it can drain your budget. How do you maximize the value of your DW and save on storage costs?

    Don't clog your DW's analytics bandwidth with less valuable storage and ETL processing. Those in the know are migrating storage and large-scale or batch processing to data lakes built on scale-out architectures such as Hadoop and the Cloud to save on costs and get increased processing power for the DW.
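
    A minimal sketch of the offload pattern described above, assuming a simple cutoff date separates "cold" from "hot" data: here sqlite3 stands in for the warehouse and a gzipped CSV file for the lake, whereas in practice the export would typically go to Parquet or ORC on Hadoop or cloud object storage. Table and column names are invented for the example.

    ```python
    # Minimal sketch of offloading cold data from the warehouse to the lake.
    # sqlite3 stands in for the EDW and a gzipped CSV for the lake file; in
    # practice this would be an export to Parquet/ORC on Hadoop or the cloud.
    import csv
    import gzip
    import sqlite3

    CUTOFF = "2016-01-01"  # rows older than this are treated as "cold"

    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE fact_sales (sale_date TEXT, amount REAL)")
    con.executemany("INSERT INTO fact_sales VALUES (?, ?)",
                    [("2015-06-01", 10.0), ("2015-12-31", 5.5), ("2016-02-01", 7.0)])

    # 1. Export cold rows to the lake as a compressed file.
    cold = con.execute(
        "SELECT sale_date, amount FROM fact_sales WHERE sale_date < ?",
        (CUTOFF,)).fetchall()
    with gzip.open("fact_sales_cold.csv.gz", "wt", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["sale_date", "amount"])
        writer.writerows(cold)

    # 2. Trim the warehouse so it keeps only hot, frequently queried rows.
    con.execute("DELETE FROM fact_sales WHERE sale_date < ?", (CUTOFF,))
    con.commit()
    remaining = con.execute("SELECT COUNT(*) FROM fact_sales").fetchone()[0]
    print(f"offloaded {len(cold)} cold rows; warehouse now holds {remaining} rows")
    ```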

    Pradeep Varadan, Verizon's Wireline OSS Data Science Lead, and Scott Gidley, Zaloni's VP of Product Management, discuss the benefits of augmenting your DW with a data lake. They also address how migrating to a data lake allows you to efficiently exploit original raw data of all types for data exploration and new use cases.

    Topics covered include:

    - Reducing storage costs
    - Increasing process speeds
    - Maximizing EDW for business intelligence
    - Extending data retention and extracting more value from all data

    About the Speaker:

    Pradeep Varadan is a data scientist and enterprise architect who specializes in data challenges within telecommunications. As the Wireline OSS Data Science Lead at Verizon Data Services, Pradeep is tasked with providing a competitive edge focused on utilizing data analytics to drive effective decision-making. He is skilled in creating systems that can be used to understand and make better decisions involving rapid technology shifts, customer lifestyle and behavior trends and relevant changes that impact the Verizon Network.
  • Data Lake 360: Extending Storage and Lifecycle of Data Recorded: Oct 5 2016 40 mins
    Gus Horn and Scott Gidley
    Highly competitive enterprises are continually looking for ways to maximize and accelerate value that can be derived from their data. The influx of diverse data is causing data lakes to emerge as a powerful architectural approach. Data lakes allow you to analyze the most relevant and valuable data stored in the most efficient and high performing way possible. The question is: Are you managing the life of your data in your data lake correctly?

    Gus Horn, NetApp's Senior Global Consulting Engineer, and Scott Gidley, Zaloni's VP of Product Management, discuss effective data lake lifecycle management and data architecture modernization. This webinar addresses the best ways to achieve new levels of data insight and how to get superior value from your data.

    Topics covered include:

    - Improving infrastructure efficiency
    - Building smaller, more efficient analytics platforms
    - Capabilities of Zaloni's Bedrock Data Lake Management Platform
    - Defining logical data lakes on-premises, off-premises, or a combination of both with NetApp E-Series and NetApp StorageGRID® Webscale
    - Extending storage and lifecycle of the data

    About the Speaker:

    Gus Horn has over 25 years of experience in product and software development, with a proven track record of quality product delivery. Horn has been immersed in the big data analytics world, designing and building some of the largest enterprise-class analytics platforms in the world. Horn is currently Senior Global Consulting Engineer at NetApp. He is responsible for both strategic and tactical advancement of analytics, data management and integration solutions.
  • The 360-Degree Approach to Managing a Data Lake Recorded: Aug 31 2016 54 mins
    Adam Diaz and Kelly Schupp
    In order to achieve the agility, shorter time-to-insight, and scalability that a big data lake promises today's enterprise, customers need a unified, holistic approach that addresses data visibility, reliability, security, and privacy, and provides democratized access to useful data. Zaloni's Data Lake 360° solution addresses the need for this holistic approach, giving complete control and full visibility into data lakes by providing a 360-degree view.

    Join Adam Diaz, Director of Field Engineering, for a discussion about how Zaloni's Data Lake 360° Solution reduces the complexity of creating a modern, integrated big data architecture for advanced analytics and data science. A demo of the latest versions of Bedrock and Mica is provided.

    About the speaker:

    Adam Diaz is a long-time technologist who has worked in the software industry for roughly twenty years. With roots in bioinformatics, statistics, and CAD/CAE, Adam has spent many years with both HPC and Hadoop, enabling high-performance and parallel-computing customer data solutions at scale. His background includes companies like SAS, Teradata, Hortonworks, and MapR. Adam is currently Director of Field Engineering at Zaloni, enabling Hadoop data lake solutions built on Bedrock.
  • Risky Business: How to Balance Innovation & Risk in Big Data Recorded: Aug 25 2016 50 mins
    Nik Rouda and Scott Gidley
    Big data is a game-changer for organizations that use it right. However, a dynamic tension always exists between rapid innovation using big data and the high level of production maturity required for an enterprise implementation. Is it possible to find the right mix?

    We say yes. Nik Rouda, senior big data analyst for Enterprise Strategy Group, reveals insights from his research and best practices for success. Join Nik and Zaloni’s Vice President of Product, Scott Gidley, for a discussion on how to find that balance between the lofty promises of big data and the mundane necessities of building a data lake environment that delivers business value.

    Topics covered include:

    - Big data business priorities and real-life use cases

    - The range of people and organizations involved in big data projects

    - A look at the time-to-business value that most organizations experience

    - An overview of the qualities and capabilities desired in data lakes

    - A typical data lake adoption lifecycle

    - Zaloni's Data Lake 360 solution as a holistic approach to building and leveraging a big data lake

    About the speaker:

    ESG Senior Analyst Nik Rouda covers big data, analytics, business intelligence, databases, and data management. With 20 years of experience in IT around the world, he understands the challenges of both vendors and buyers, and how to find the true value of innovative technologies. Using the knowledge of strategic leadership that he gathered previously helping to accelerate growth for fast-paced startups and Fortune 100 enterprises, Nik’s goal is to strengthen messaging, embolden market strategy, and ultimately, maximize his client’s gain.
  • Understanding Metadata: Why it's essential to your big data solution Recorded: Jun 21 2016 63 mins
    Ben Sharma and Vikram Sreekanti
    Metadata is essential for managing, migrating, accessing, and deploying a big data solution. Without it, enterprises have limited visibility into the data itself and cannot trust its quality, which negates the value of the data in the first place. Creating end-to-end data visibility allows you to keep track of data, enables search and query across big data systems, safeguards your data, and reduces risk.

    In this O'Reilly webcast replay, Ben Sharma (co-founder and CEO of Zaloni) and Vikram Sreekanti (software engineer in the AMPLab at UC Berkeley) discuss the value of collecting and analyzing metadata, and its potential to impact your big data solution and your business.

    Learning how to access your data's lineage allows you to know where data has come from, where it is, and how it is being used. This webinar takes a deep dive into Ground, a new open-source project under development at UC Berkeley. Ground is a data context system that enables users to uncover what data they have, where the data is flowing to and from, who is using the data, and when and how it changes. We explore how data context stretches the bounds of what we have traditionally considered metadata.
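
    Ground's actual API is not shown here. As a rough illustration of what a data context system keeps track of, the toy sketch below records which datasets exist, how data flows between them, and who has used each one, and can answer a simple "what is downstream of X?" question. Class and dataset names are invented for the example.

    ```python
    # Toy data-context store, for illustration only (this is not Ground's
    # actual API): it tracks which datasets exist, how data flows between
    # them, and who has used each one.
    from collections import defaultdict

    class DataContext:
        def __init__(self):
            self.datasets = {}              # name -> description
            self.flows = defaultdict(set)   # source -> set of downstream targets
            self.usage = defaultdict(list)  # dataset -> list of (user, purpose)

        def add_dataset(self, name, description):
            self.datasets[name] = description

        def add_flow(self, source, target):
            self.flows[source].add(target)

        def record_usage(self, dataset, user, purpose):
            self.usage[dataset].append((user, purpose))

        def downstream(self, name, seen=None):
            """Everything derived, directly or indirectly, from `name`."""
            seen = set() if seen is None else seen
            for target in self.flows.get(name, ()):
                if target not in seen:
                    seen.add(target)
                    self.downstream(target, seen)
            return seen

    ctx = DataContext()
    ctx.add_dataset("clicks_raw", "raw web clickstream")
    ctx.add_dataset("clicks_clean", "cleansed clickstream")
    ctx.add_dataset("daily_report", "aggregated daily metrics")
    ctx.add_flow("clicks_raw", "clicks_clean")
    ctx.add_flow("clicks_clean", "daily_report")
    ctx.record_usage("daily_report", "analyst_1", "weekly business review")
    print(ctx.downstream("clicks_raw"))  # {'clicks_clean', 'daily_report'}
    ```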

    Topics covered include:

    - The role of metadata in data analysis

    - Key considerations for managing metadata

    - How to establish data lineage and provenance, in order to create a repeatable process

    - Initial work on Ground, and how this data context system is making an impact on a wide range of data tasks, including data inventory, usage tracking, model-specific interpretation, reproducibility, interoperability, and collective governance
The Data Lake Company
Zaloni, the data lake company, is an award-winning provider of enterprise data lake management solutions. Our software platforms, Bedrock and Mica, enable customers to gain their own competitive advantage through organized, actionable big data lakes. Zaloni serves the Fortune 500 and has helped customers build production data lake implementations at many of the world's leading companies.
