
Managing Big Data in the Bedrock Data Lake Management Platform

In this webinar, Adam Diaz, Director of Field Engineering at Zaloni, will demonstrate how the Bedrock platform can run as a data federation layer that makes it easy to organize and manage data, regardless of volume, across your enterprise data architecture.

Data Federation describes any approach to data management that allows an application to retrieve and manipulate data because it understands the associated metadata. Doing this in a way that delivers business value requires an enterprise-grade data lake management and governance platform like Bedrock. Bedrock workflows enable the use of data across multiple clusters, databases, and streams.
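As a rough illustration of the federation idea (this is not Bedrock's actual API; every name below is hypothetical), a federation layer can be sketched as a thin router that consults a metadata catalog to decide which backing system should serve a request:

```python
# Hypothetical sketch of a metadata-driven federation layer.
# The application talks to one interface; the registered metadata
# decides which cluster, database, or stream actually holds the data.

CATALOG = {
    # dataset name -> metadata describing where and how it is stored
    "orders": {"system": "hive", "location": "prod-cluster/warehouse/orders"},
    "clickstream": {"system": "kafka", "location": "events.clickstream"},
    "customers": {"system": "rdbms", "location": "crm.public.customers"},
}

def resolve(dataset: str) -> dict:
    """Look up a dataset's metadata; the caller never hard-codes a source."""
    meta = CATALOG.get(dataset)
    if meta is None:
        raise KeyError(f"no metadata registered for dataset {dataset!r}")
    return meta

def fetch(dataset: str) -> str:
    """Route the request to the right backend based on catalog metadata."""
    meta = resolve(dataset)
    return f"reading {dataset} from {meta['system']} at {meta['location']}"

print(fetch("orders"))
print(fetch("clickstream"))
```

The point of the sketch is only that applications name datasets, not systems; adding a new cluster or stream means registering metadata, not rewriting consumers.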

In this context, topics covered will include:

- Classic Enterprise Architectures and Data Siloing

- Data Lake 360 Solution: a Holistic Approach to Modern Data Management

- Industry Trends - Movement to the Cloud and Hybrid Environments

- Metadata Management - and Handling Non-Persistent Environments

- Introduction to Bedrock

- Demonstration of Bedrock - How do I manage multiple systems?
Recorded Jan 18 2017 43 mins
Presented by
Adam Diaz, Director of Field Engineering

  • Channel profile
  • Data Warehouse Augmentation: Cut Costs, Increase Power Jan 26 2017 9:00 pm UTC 45 mins
    Pradeep Varadan and Scott Gidley
    The data warehouse (DW) is one of the most effective tools for complex data analytics. But the downside is… it can drain your budget. How do you maximize the value of your DW and save on storage costs?

    Don't clog your DW's analytics bandwidth with less valuable storage and ETL processing. Those in the know are migrating storage and large-scale or batch processing to data lakes built on scale-out architectures such as Hadoop and the Cloud to save on costs and get increased processing power for the DW.
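As a hypothetical sketch of the offload decision described above (the table names, sizes, and thresholds are invented for illustration), the policy often reduces to: large tables that have not been queried recently migrate to the lake, while hot tables stay in the warehouse:

```python
from datetime import date

# Hypothetical warehouse inventory: (table name, size in GB, last-queried date).
tables = [
    ("daily_sales_2012", 800, date(2015, 3, 1)),
    ("current_orders", 40, date(2017, 1, 17)),
    ("sensor_archive", 2200, date(2015, 11, 10)),
]

def offload_candidates(tables, as_of, min_gb=100, stale_days=365):
    """Flag large tables not queried within `stale_days` for lake storage."""
    return [
        name
        for name, size_gb, last_used in tables
        if size_gb >= min_gb and (as_of - last_used).days > stale_days
    ]

print(offload_candidates(tables, as_of=date(2017, 1, 18)))
# the two large, stale tables qualify; current_orders stays in the DW
```

Real deployments weigh more signals (query cost, SLAs, compliance), but the shape of the decision is the same: use access metadata to keep expensive DW capacity for the workloads that need it.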

    Pradeep Varadan, Verizon's Wireline OSS Data Science Lead and Scott Gidley, Zaloni's VP, Product Management, discuss the benefits of augmenting your DW with a data lake. They also address how migrating to a data lake allows you to efficiently exploit original raw data of all types for data exploration and new use cases.

    Topics covered include:

    - Reducing storage costs
    - Increasing process speeds
    - Maximizing EDW for business intelligence
    - Extending data retention and extracting more value from all data

    About the Speaker:

    Pradeep Varadan is a data scientist and enterprise architect who specializes in data challenges within telecommunications. As the Wireline OSS Data Science Lead at Verizon Data Services, Pradeep is tasked with providing a competitive edge focused on utilizing data analytics to drive effective decision-making. He is skilled in creating systems that can be used to understand and make better decisions involving rapid technology shifts, customer lifestyle and behavior trends and relevant changes that impact the Verizon Network.
  • Techniques to Establish Your Data Lake: How to Achieve Data Quality and Security Recorded: Oct 13 2016 63 mins
    Ben Sharma
    The growing volume and variety of data makes it imperative for organizations to manage and govern their data in a way that's scalable and cost-effective. The data lake – once considered just a relatively inexpensive storage solution – can now be a tool for deriving true business value. By implementing a set of best practices for establishing and managing your data lake, you can achieve 360-degree control and visibility of your data.

    In this webcast, Ben Sharma, Zaloni's co-founder and CEO, discusses techniques for balancing the flexibility a data lake can provide with the requirements for privacy and security that are critical for enterprise data.

    Topics covered include:

    - How to establish a managed data ingestion process - that includes metadata management - in order to create a solid foundation for your data lake

    - Techniques for establishing data lineage and provenance

    - Tips for achieving data quality

    - Key considerations for data privacy and security

    - Unique stages along the data lake lifecycle and management concepts for each stage

    - Why a data catalog is important

    - Considerations for self-service data preparation
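A minimal sketch of the first topic above, managed ingestion with metadata management (pure illustration; none of these names come from Bedrock): the key property is that data cannot land in the lake without its metadata being registered in the same step.

```python
import hashlib

metadata_registry = {}   # dataset name -> metadata captured at ingest time
landing_zone = {}        # dataset name -> raw bytes

def ingest(name: str, payload: bytes, source: str, owner: str) -> dict:
    """Land data and record its metadata in one managed step, so nothing
    enters the lake without provenance, ownership, and a content fingerprint."""
    meta = {
        "source": source,
        "owner": owner,
        "size_bytes": len(payload),
        "sha256": hashlib.sha256(payload).hexdigest(),
    }
    metadata_registry[name] = meta   # metadata first: the lake's foundation
    landing_zone[name] = payload
    return meta

meta = ingest("clickstream_2017_01_18", b'{"user": 1}',
              source="web-logs", owner="marketing")
print(meta["size_bytes"], meta["sha256"][:8])
```

Coupling the two writes is what makes ingestion "managed": every later question about lineage, quality, or access starts from a registry entry that is guaranteed to exist.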

    About the speaker:

    Ben Sharma is CEO and co-founder of Zaloni. He is a passionate technologist and thought leader in big data, analytics and enterprise infrastructure solutions. Having previously worked in technology leadership at NetApp, Fujitsu and others, Ben's expertise ranges from business development to production deployment in a wide array of technologies including Hadoop, HBase, databases, virtualization and storage. Ben is co-author of Architecting Data Lakes and Java in Telecommunications. He holds two patents.
  • Data Lake 360: Extending Storage and Lifecycle of Data Recorded: Oct 5 2016 40 mins
    Gus Horn and Scott Gidley
    Gus Horn, NetApp’s Senior Global Consulting Engineer, and Scott Gidley, Zaloni’s VP, Product Management, discuss effective data lake lifecycle management and data architecture modernization. This webinar addresses the best ways to achieve new levels of data insight and how to get superior value from your data.
  • The 360-Degree Approach to Managing a Data Lake Recorded: Aug 31 2016 54 mins
    Adam Diaz and Kelly Schupp
    In order to achieve the agility, shorter time-to-insight, and scalability that a big data lake promises today’s enterprise, customers need a unified, holistic approach that addresses data visibility, reliability, security and privacy and provides democratized access to useful data. Zaloni’s Data Lake 360° solution addresses the need for this holistic approach, giving complete control and full visibility into data lakes by providing a 360-degree view.

    Join Adam Diaz, Director of Field Engineering, for a discussion about how Zaloni's Data Lake 360° Solution reduces the complexity of creating a modern, integrated big data architecture for advanced analytics and data science. A demo of the latest versions of Bedrock and Mica is provided.

    About the speaker:

    Adam Diaz is a long-time technologist who has worked in the software industry for roughly twenty years. With roots in bioinformatics, statistics, and CAD/CAE, Adam has spent many years with both HPC and Hadoop, enabling high-performance and parallel computing customer data solutions at scale. His background includes companies like SAS, Teradata, Hortonworks and MapR. Adam currently works at Zaloni as the Director of Field Engineering, enabling Hadoop data lake solutions built on Bedrock.
  • Risky Business: How to Balance Innovation & Risk in Big Data Recorded: Aug 25 2016 50 mins
    Nik Rouda and Scott Gidley
    Big data is a game-changer for organizations that use it right. However, a dynamic tension always exists between rapid innovation using big data and the high level of production maturity required for an enterprise implementation. Is it possible to find the right mix?

    We say yes. Nik Rouda, senior big data analyst for Enterprise Strategy Group, reveals insights from his research and best practices for success. Join Nik and Zaloni’s Vice President of Product, Scott Gidley, for a discussion on how to find that balance between the lofty promises of big data and the mundane necessities of building a data lake environment that delivers business value.

    Topics covered include:

    - Big data business priorities and real-life use cases

    - The range of people and organizations involved in big data projects

    - A look at the time-to-business value that most organizations experience

    - An overview of the qualities and capabilities desired in data lakes

    - A typical data lake adoption lifecycle

    - Zaloni’s Data Lake 360 solution as a holistic approach to building and leveraging a big data lake

    About the speaker:

    ESG Senior Analyst Nik Rouda covers big data, analytics, business intelligence, databases, and data management. With 20 years of experience in IT around the world, he understands the challenges of both vendors and buyers, and how to find the true value of innovative technologies. Using the knowledge of strategic leadership that he gathered previously helping to accelerate growth for fast-paced startups and Fortune 100 enterprises, Nik’s goal is to strengthen messaging, embolden market strategy, and ultimately, maximize his clients’ gain.
  • Understanding Metadata: Why it's essential to your big data solution Recorded: Jun 21 2016 63 mins
    Ben Sharma and Vikram Sreekanti
    Metadata is essential for managing, migrating, accessing, and deploying a big data solution. Without it, enterprises have limited visibility into the data itself and cannot trust its quality, negating the value of the data in the first place. Creating end-to-end data visibility allows you to keep track of data, enable search and query across big data systems, safeguard your data, and reduce risk.

    In this O'Reilly webcast replay, Ben Sharma (co-founder and CEO of Zaloni) and Vikram Sreekanti (software engineer in the AMPLab at UC Berkeley) discuss the value of collecting and analyzing metadata, and its potential to impact your big data solution and your business.

    Learning how to access your data's lineage allows you to know where data has come from, where it is, and how it is being used. This webinar takes a deep dive into Ground, a new open-source project under development at UC Berkeley. Ground is a data context system that enables users to uncover what data they have, where the data is flowing to and from, who is using the data, and when and how it changes. We explore how data context stretches the bounds of what we have traditionally considered metadata.
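The Ground project itself lives at UC Berkeley; the snippet below is only a hand-rolled illustration of the kind of context such a system records (what a dataset is, what it was derived from, who uses it), not Ground's real API:

```python
from dataclasses import dataclass, field

@dataclass
class ContextEntry:
    """One hypothetical data-context record: identity, lineage, and usage."""
    name: str
    derived_from: list = field(default_factory=list)  # upstream datasets
    used_by: list = field(default_factory=list)       # consumers of the data

def upstream(catalog, name, seen=None):
    """Walk `derived_from` links to recover a dataset's full lineage."""
    seen = set() if seen is None else seen
    for parent in catalog[name].derived_from:
        if parent not in seen:
            seen.add(parent)
            upstream(catalog, parent, seen)
    return seen

catalog = {
    "raw_events": ContextEntry("raw_events"),
    "sessions": ContextEntry("sessions", derived_from=["raw_events"]),
    "churn_model_input": ContextEntry("churn_model_input",
                                      derived_from=["sessions"],
                                      used_by=["data-science-team"]),
}

print(sorted(upstream(catalog, "churn_model_input")))
# lineage resolves through sessions all the way back to raw_events
```

Even this toy graph shows why context matters: given any dataset, a recursive walk over its lineage links answers "where did this come from?" without consulting the people who built each pipeline stage.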

    Topics covered include:

    - The role of metadata in data analysis

    - Key considerations for managing metadata

    - How to establish data lineage and provenance, in order to create a repeatable process

    - Initial work on Ground, and how this data context system is making an impact on a wide range of data tasks, including data inventory, usage tracking, model-specific interpretation, reproducibility, interoperability, and collective governance
The Data Lake Company
Zaloni, the data lake company, is an award-winning provider of enterprise data lake management solutions. Our software, Bedrock and Mica, enables customers to gain their own competitive advantage through organized, actionable big data lakes. Serving the Fortune 500, Zaloni has helped its customers build production implementations at many of the world’s leading companies.

  • Live at: Jan 18 2017 6:00 pm