Reliability Guarantees in Apache Kafka®

Presented by

Gwen Shapira, Product Manager, Confluent

About this talk

In the financial industry, losing data is unacceptable, and financial firms are adopting Kafka for their critical applications. Kafka provides the low latency, high throughput, high availability, and scale that these applications require. But can it also provide complete reliability? As a system architect, when asked, “Can you guarantee that we will always get every transaction?” you want to be able to say “yes” with total confidence. In this session, we go over everything that happens to a message, from producer to consumer, and pinpoint all the places where data can be lost if you are not careful. You will learn how developers and operations teams can work together to build a bulletproof data pipeline with Kafka. And if you need proof that you built a reliable system, we’ll show you how to build the system so it can prove that too. This is part 2 of 5 in the Best Practices for Apache Kafka in Production Confluent Online Talk Series.
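The session itself is a video, but to illustrate the producer side of the “can we lose data?” question it addresses, here is a minimal Java sketch of the reliability-oriented client settings Kafka exposes (acks=all, unbounded retries, idempotence). The broker address localhost:9092 and the topic name "transactions" are hypothetical placeholders, not from the talk; broker-side settings such as min.insync.replicas and the consumer’s offset-commit discipline matter just as much, which is what the full session walks through.

    import java.util.Properties;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerConfig;
    import org.apache.kafka.clients.producer.ProducerRecord;
    import org.apache.kafka.common.serialization.StringSerializer;

    public class ReliableProducerSketch {
        public static void main(String[] args) {
            Properties props = new Properties();
            // Hypothetical broker address; replace with your cluster.
            props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
            props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
            props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());

            // Wait for all in-sync replicas to acknowledge each write.
            props.put(ProducerConfig.ACKS_CONFIG, "all");
            // Retry transient failures instead of dropping the record.
            props.put(ProducerConfig.RETRIES_CONFIG, Integer.toString(Integer.MAX_VALUE));
            // Prevent retries from introducing duplicates or reordering.
            props.put(ProducerConfig.ENABLE_IDEMPOTENCE_CONFIG, "true");

            try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
                // Hypothetical topic, key, and value for illustration only.
                ProducerRecord<String, String> record =
                        new ProducerRecord<>("transactions", "txn-1", "debit:100.00");
                // Block on the returned future so a send failure surfaces as an
                // exception here rather than being lost silently in the background.
                producer.send(record).get();
            } catch (Exception e) {
                // In a real pipeline this would trigger alerting or a retry path;
                // swallowing it is one of the places data quietly disappears.
                e.printStackTrace();
            }
        }
    }

This is a sketch under the stated assumptions, not the talk’s reference implementation; even with these settings, end-to-end reliability still depends on topic replication factor and on consumers committing offsets only after processing.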
