From digital banking to Industry 4.0, the nature of business is changing. Increasingly, businesses are becoming software businesses. And the lifeblood of software is data. Dealing with data at the enterprise level is tough, and there have been some missteps along the way.
This session will consider the increasingly popular idea of a 'data mesh': the problems it solves and, perhaps most importantly, how a data-in-motion (event streaming) platform forms the bedrock of this new paradigm.
Data mesh is a relatively recent term that describes a set of principles that good modern data systems uphold: a kind of "microservices" for the data-centric world. While data mesh is not a technology-specific pattern, systems that adopt and implement its principles have a relatively long history under different guises.
In this talk, we'll cover:
- building a streaming data mesh with Kafka
- the four principles of the data mesh: domain-driven decentralisation, data as a product, self-service data platform, and federated governance
- the differences between working with event streams and centralised approaches, highlighting the key characteristics that make streams a great fit for implementing a mesh, such as their ability to capture both real-time and historical data
- how to onboard data from existing systems into a mesh, and how to model the communication within the mesh
- how to deal with changes to your domain's "public" data, with examples of global standards for governance
- the importance of taking a product-centric view of data sources and the data sets they share
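To make the "data as a product" and schema-change points above a little more concrete, here is a minimal sketch of a versioned domain event as it might be published to a Kafka topic. The `OrderPlaced` event, its fields, and the `schema_version` convention are all illustrative assumptions for a hypothetical orders domain, not something prescribed by Kafka or by the talk itself.

```python
import json
from dataclasses import dataclass, asdict

# Hypothetical "orders" domain event, shared as that domain's data product.
# The schema_version field is one common convention for signalling changes
# to a domain's "public" data; consumers can branch on it when decoding.
@dataclass
class OrderPlaced:
    schema_version: int
    order_id: str
    customer_id: str
    amount_cents: int

def serialise(event: OrderPlaced) -> bytes:
    """Encode the event as JSON bytes, as it might be written to a topic."""
    return json.dumps(asdict(event)).encode("utf-8")

def deserialise(payload: bytes) -> OrderPlaced:
    """Decode a payload back into the domain event."""
    return OrderPlaced(**json.loads(payload.decode("utf-8")))

event = OrderPlaced(schema_version=1, order_id="o-42",
                    customer_id="c-7", amount_cents=1999)
round_tripped = deserialise(serialise(event))
```

Because the event carries its schema version explicitly, a breaking change to the public shape of the data can be rolled out as a new version while older consumers keep reading the version they understand.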
Mumbai 9am / Jakarta 10:30am / Singapore 11:30am / Sydney 1:30pm / Auckland 3:30pm