Apache Kafka®, Confluent, and the Data Mesh

Presented by

Chun Sing Chan, Solutions Engineer at Confluent

About this talk

Data mesh is a new approach to designing data architectures that embraces organisational and data-centric constructs, such as data management and governance. The idea is that data should be easily accessible and interconnected across the entire business. This talk will cover:

- The basics of building a streaming data mesh with Kafka, and how Confluent enables this
- The four principles of the data mesh: domain-driven decentralisation, data as a product, self-service data platform, and federated governance
- The differences between working with event streams and with centralised approaches
- How to onboard data from existing systems into a mesh, and how to model the communication within the mesh (see the first sketch after this list)
- How to deal with changes to your domain’s “public” data, with examples of global standards for governance (see the second sketch after this list)
- The importance of taking a product-centric view of data sources and the data sets they share
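As a rough illustration of “data as a product” and of onboarding data into the mesh, the first sketch shows a domain team publishing an event to a stream it owns, using the plain Apache Kafka producer API. The topic name orders.public.order-placed, the broker address, and the JSON payload are illustrative assumptions, not details taken from the talk.

    import java.util.Properties;

    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;
    import org.apache.kafka.common.serialization.StringSerializer;

    public class OrderEventPublisher {
        public static void main(String[] args) {
            Properties props = new Properties();
            // Assumption: a broker reachable locally; in practice this would be
            // the bootstrap endpoint provided by the self-service platform.
            props.put("bootstrap.servers", "localhost:9092");
            props.put("key.serializer", StringSerializer.class.getName());
            props.put("value.serializer", StringSerializer.class.getName());

            try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
                // The orders domain exposes its "public" data product as an
                // event stream on a topic it owns and is accountable for.
                producer.send(new ProducerRecord<>(
                        "orders.public.order-placed",   // hypothetical topic name
                        "order-42",                      // key: the order id
                        "{\"orderId\":\"order-42\",\"amount\":99.95}"));
                producer.flush();
            }
        }
    }

For changes to a domain’s “public” data, one common baseline (again an assumption about how this might be demonstrated, not a quote from the talk) is to enforce a per-subject compatibility mode in Confluent Schema Registry, so that a schema change that would break downstream consumers is rejected at registration time. The second sketch sets BACKWARD compatibility via the Schema Registry REST API using Java’s built-in HTTP client; the subject name reuses the hypothetical topic above.

    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;

    public class SetSubjectCompatibility {
        public static void main(String[] args) throws Exception {
            // Assumption: Schema Registry on its default local port.
            HttpRequest request = HttpRequest.newBuilder()
                    .uri(URI.create("http://localhost:8081/config/orders.public.order-placed-value"))
                    .header("Content-Type", "application/vnd.schemaregistry.v1+json")
                    // BACKWARD: consumers using the previous schema can still
                    // read data written with the new one.
                    .PUT(HttpRequest.BodyPublishers.ofString("{\"compatibility\": \"BACKWARD\"}"))
                    .build();

            HttpResponse<String> response = HttpClient.newHttpClient()
                    .send(request, HttpResponse.BodyHandlers.ofString());
            System.out.println(response.statusCode() + " " + response.body());
        }
    }

A compatibility rule like BACKWARD is the kind of global standard that federated governance can mandate across every domain, while each team still evolves its own schemas independently.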

About Confluent

Confluent is building the foundational platform for data in motion. Our cloud-native offering is designed to be the intelligent connective tissue enabling real-time data from multiple sources to stream continuously across the organisation. With Confluent, organisations can create a central nervous system to innovate and win in a digital-first world.