Building an Event-Driven Global Data Fabric with Apache Kafka

Presented by

Will LaForest, CTO, Confluent Public Sector

About this talk

Government agencies face a growing challenge: distributing data across a geographically diverse set of locations, both within the US and around the globe. To ensure mission success, data needs to flow to all of these locations rapidly, yet the latency, bandwidth, and reliability of communication links can vary widely. A global data fabric is an emerging approach that connects mission to data across multiple locations and delivers uniformity and consistency at scale. This on-demand webinar covers:

- An overview of Apache Kafka and how an event streaming platform can support your agency's mission
- Considerations for handling communication links of varying quality
- Synchronous vs. asynchronous data replication
- New multi-region capabilities in Confluent Platform for a global data fabric
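As a flavor of the synchronous vs. asynchronous replication trade-off discussed in the talk, the sketch below contrasts two hypothetical Kafka producer configurations. The `acks` setting is the standard Kafka knob that controls how synchronously the leader waits for follower replication before acknowledging a write; the broker addresses and tuning values shown are illustrative assumptions, not settings from the talk.

```python
# Hypothetical producer settings illustrating the replication trade-off:
#   acks = "1"   -> leader-only acknowledgment; followers replicate
#                   asynchronously (lower latency, small window of loss)
#   acks = "all" -> wait for all in-sync replicas (synchronous durability,
#                   higher latency over slow links)

# A link-constrained edge site favoring throughput (assumed values):
async_style = {
    "bootstrap.servers": "edge-broker:9092",  # hypothetical address
    "acks": "1",
    "linger.ms": "50",            # batch longer to use a thin link efficiently
    "compression.type": "zstd",   # shrink payloads over low-bandwidth links
}

# A site where no acknowledged write may be lost (assumed values):
sync_style = {
    "bootstrap.servers": "core-broker:9092",  # hypothetical address
    "acks": "all",
    "enable.idempotence": "true",  # avoid duplicates on retry
}
```

Either dict could be passed to a Kafka client such as `confluent_kafka.Producer`; the point is that the choice of `acks` is per-producer, so one cluster can serve both latency-sensitive and durability-sensitive flows.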
