Best Practices for Modernizing Your Hadoop Workloads to AWS with Dremio

Presented by

Jorge A. Lopez, Product Specialist for Analytics, AWS
Jeremiah Morrow, Director of Product Marketing, Dremio

About this talk

Many organizations turned to HDFS to address the challenge of storing growing volumes of semi-structured and unstructured data. However, Hadoop never managed to replace the data warehouse for enterprise-grade Business Intelligence and Reporting, and most teams ended up with separate monolithic architectures, including data lakes and data warehouses, with siloed data and analytic workloads. That is why data teams are increasingly considering a data lakehouse architecture that combines the flexibility and scalability of data lake storage with the data management, data governance, and enterprise-grade analytic performance of the data warehouse. In this video, Jorge A. Lopez, Product Specialist for Analytics at AWS, and Dremio's Jeremiah Morrow will discuss best practices for modernizing analytic workloads from Hadoop to an open data lakehouse architecture, including:
- Choosing the right storage solution for your data lakehouse, and which features and functionality, such as performance, scalability, reliability, and more, you should be evaluating.
- Specific steps and best practices for gradually shifting on-premises workloads to a cloud data lakehouse while ensuring business continuity (a minimal illustration of one such step follows this list).
- Consolidating data silos to achieve a complete view of your customer and operational data before, during, and after migration.
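
A common early step in this kind of migration is copying an existing HDFS dataset into Amazon S3 so it can be served as data lake storage. The snippet below is a minimal sketch of that step using PyArrow; the namenode address, bucket name, and dataset paths are hypothetical placeholders rather than anything from the talk, and a production migration would more typically use a bulk-transfer tool such as Hadoop DistCp or AWS DataSync.

# A minimal sketch, assuming PyArrow is installed with HDFS (libhdfs) and S3 support.
# All hosts, buckets, and paths below are illustrative placeholders.
import pyarrow.parquet as pq
from pyarrow import fs

# Connect to the on-premises HDFS cluster (hypothetical namenode address).
hdfs = fs.HadoopFileSystem(host="namenode.example.internal", port=8020)

# Connect to Amazon S3, which serves as the data lakehouse storage layer.
s3 = fs.S3FileSystem(region="us-east-1")

# Read an existing Parquet dataset from HDFS.
table = pq.read_table("/warehouse/sales/orders", filesystem=hdfs)

# Write it to an S3 bucket (hypothetical name); row counts can then be compared
# to validate the copy before cutting any workloads over.
pq.write_table(table, "example-lakehouse-bucket/sales/orders/orders.parquet", filesystem=s3)

print(f"Copied {table.num_rows} rows from HDFS to S3")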

Dremio is the easy and open data lakehouse, providing self-service analytics with data warehouse functionality and data lake flexibility across all of your data. Dremio increases agility with a revolutionary data-as-code approach that enables Git-like data experimentation, version control, and governance.