How to Build an Apache Kafka® Connector

Presented by Jeff Bean, Partner Solution Architect, Confluent

About this talk

Apache Kafka® is the technology behind event streaming, which is fast becoming the central nervous system of flexible, scalable, modern data architectures. Customers want to connect their databases, data warehouses, applications, microservices, and more to power the event streaming platform. To connect them to Apache Kafka, you need a connector! This online talk dives into the new Verified Integrations Program and its integration requirements, the Connect API, and the sources and sinks built on Kafka Connect. We cover the verification steps, share code samples created by popular application and database companies, and discuss the resources available to support you through the connector development process. This is Part 2 of 2 in Building Kafka Connectors - The Why and How.
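For context on what the Connect API asks of you, a source connector typically pairs a SourceConnector class, which declares configuration and divides the work, with one or more SourceTask instances that pull records from the external system. The sketch below is a minimal, illustrative skeleton of that shape only; the names ExampleSourceConnector and ExampleSourceTask and the "topic" config key are assumptions for illustration, not the code samples referenced in the talk.

import org.apache.kafka.common.config.ConfigDef;
import org.apache.kafka.connect.connector.Task;
import org.apache.kafka.connect.data.Schema;
import org.apache.kafka.connect.source.SourceConnector;
import org.apache.kafka.connect.source.SourceRecord;
import org.apache.kafka.connect.source.SourceTask;

import java.util.Collections;
import java.util.List;
import java.util.Map;

public class ExampleSourceConnector extends SourceConnector {

    private Map<String, String> config;

    @Override
    public void start(Map<String, String> props) {
        // Keep the connector-level configuration supplied by the Connect worker.
        config = props;
    }

    @Override
    public Class<? extends Task> taskClass() {
        return ExampleSourceTask.class;
    }

    @Override
    public List<Map<String, String>> taskConfigs(int maxTasks) {
        // One task is enough for this sketch; a real connector partitions work here.
        return Collections.singletonList(config);
    }

    @Override
    public void stop() {
        // Nothing to clean up in this sketch.
    }

    @Override
    public ConfigDef config() {
        // Declare the configuration the connector accepts; Connect validates it.
        return new ConfigDef()
                .define("topic", ConfigDef.Type.STRING, ConfigDef.Importance.HIGH,
                        "Kafka topic the records are written to");
    }

    @Override
    public String version() {
        return "0.1.0";
    }

    // The task does the actual work of pulling data from the external system.
    public static class ExampleSourceTask extends SourceTask {

        private String topic;

        @Override
        public void start(Map<String, String> props) {
            topic = props.get("topic");
        }

        @Override
        public List<SourceRecord> poll() throws InterruptedException {
            // A real task would read from the external system; this one emits a
            // single static record per second so the skeleton stays runnable.
            Thread.sleep(1000);
            SourceRecord record = new SourceRecord(
                    Collections.singletonMap("source", "example"), // source partition
                    Collections.singletonMap("offset", 0L),        // source offset
                    topic,
                    Schema.STRING_SCHEMA,
                    "hello from the connector");
            return Collections.singletonList(record);
        }

        @Override
        public void stop() {
            // Nothing to clean up in this sketch.
        }

        @Override
        public String version() {
            return "0.1.0";
        }
    }
}

Packaged as a plugin jar and placed on a Connect worker's plugin path, a connector like this is configured and started through the Connect REST API; a sink connector follows the same pattern with SinkConnector and SinkTask.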

About Confluent

Confluent is building the foundational platform for data in motion. Our cloud-native offering is designed to be the intelligent connective tissue enabling real-time data from multiple sources to stream continuously across the organisation. With Confluent, organisations can create a central nervous system to innovate and win in a digital-first world.