Hadoop to Data Lakehouse Migration: Use Cases, Benefits, & Success Stories

Presented by

Kamran Hussain, Field Solution Architect, Dremio
Tony Truong, Sr. Product Marketing Manager, Dremio

About this talk

Most companies use Hadoop for big data analytical workloads. The problem is that on-premises Hadoop deployments often fail to deliver business value after they are implemented. Over time, high operational costs and poor performance limit an organization's ability to be agile. As a result, data platform teams are looking to modernize their Hadoop workloads on the data lakehouse. In this video, you'll learn about:

- Use cases for modernizing Hadoop workloads
- How the data lakehouse solves the inefficiencies of on-premises Hadoop
- Success stories from organizations that have modernized Hadoop with the data lakehouse on Dremio
Dremio is the easy and open data lakehouse, providing self-service analytics with data warehouse functionality and data lake flexibility across all of your data. Dremio increases agility with a revolutionary data-as-code approach that enables Git-like data experimentation, version control, and governance.