On Track with Apache Kafka®: Building a Streaming ETL Solution with Rail Data

As data engineers, we frequently need to build scalable systems that work with data from a variety of sources, arriving at different rates, in different sizes, and in different formats. This talk takes an in-depth look at how Apache Kafka can provide a common platform on which to build data infrastructure driving both real-time analytics and event-driven applications.

Using a public feed of railway data, the talk shows how to ingest data from message queues such as ActiveMQ with Kafka Connect, as well as from static sources such as S3 and REST endpoints. We'll then see how to use stream processing to transform the data into a form suitable for streaming analytics in tools such as Elasticsearch and Neo4j (a minimal sketch of the transformation step follows below). The same data will be used to drive a real-time notification service through Telegram.
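Here is a minimal Kafka Streams sketch of the kind of transformation involved: it reads raw movement messages and routes cancellations to their own topic for the notification service. The topic names and the crude JSON check are illustrative assumptions, not the talk's actual pipeline.

```java
import java.util.Properties;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;

public class RailMovementsApp {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "rail-etl-demo");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();
        // "train-movements-raw" is a hypothetical topic fed by the ActiveMQ source connector
        KStream<String, String> movements = builder.stream("train-movements-raw");

        // Route cancellation events to their own topic for the notification service;
        // the substring check is a crude stand-in for proper schema-based processing
        movements
            .filter((key, value) -> value != null && value.contains("\"CANCELLATION\""))
            .to("train-cancellations");

        new KafkaStreams(builder.build(), props).start();
    }
}
```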

If you're wondering how to build your next scalable data platform, how to reconcile the impedance mismatch between stream and batch, and how to wrangle streams of data—this talk is for you!
Recorded: May 29 2020, 58 mins
Presented by
Robin Moffatt, Developer Advocate, Confluent

Channel profile
  • Stream me to the Cloud (and back) with Confluent & MongoDB Dec 17 2020 2:15 pm UTC 65 mins
    Gianluca Natali, Confluent & Felix Reichenbach, MongoDB
    Companies collect and store their data in various data stores and use a number of business applications and services to access, analyze, and act on their data. Pulling all the data from disparate sources is difficult to manage, inefficient, and ineffective in producing results. Event streaming and stream processing change this paradigm. By enabling robust and reactive data pipelines between all your data stores, apps, and services, you can make real-time decisions that are critical to your business.

    In this online talk, we’ll explore how and why companies are leveraging Confluent and MongoDB to modernize their architecture and take advantage of the scalability of the cloud and the velocity of streaming. Based on a sample retail business scenario, we will explain how changes in an on-premises database are streamed via Confluent Cloud to MongoDB Atlas and back (see the sketch after the key learnings below).

    Key Learnings:
    • Modernize your architecture without revolutionizing it
    • Stream your data from multiple applications and data centers into the cloud and back
    • Confluent as the central nervous system of your architecture
    • MongoDB Atlas as a flexible, scalable modern data platform combining data from different sources and powering your frontend applications
    • Why MongoDB and Confluent are such a great combination

    This architectural approach will allow you to dynamically scale the customer-facing frontend, avoid over-provisioning, and enable the development team to rapidly implement new functionality that will differentiate you from your competition.
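    In practice the Kafka-to-Atlas leg is handled declaratively by the MongoDB Kafka sink connector; the hand-rolled consumer below is only a sketch to make the data flow concrete. The connection string, topic, database, and collection names are assumptions for illustration.

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;
import com.mongodb.client.MongoClient;
import com.mongodb.client.MongoClients;
import com.mongodb.client.MongoCollection;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.bson.Document;

public class OrdersToAtlas {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "orders-to-atlas");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props);
             MongoClient mongo = MongoClients.create("mongodb+srv://<user>:<password>@<cluster>/")) {
            consumer.subscribe(List.of("orders"));
            MongoCollection<Document> orders = mongo.getDatabase("retail").getCollection("orders");
            while (true) {
                for (ConsumerRecord<String, String> rec : consumer.poll(Duration.ofSeconds(1))) {
                    // Each Kafka message is assumed to be a JSON order document
                    orders.insertOne(Document.parse(rec.value()));
                }
            }
        }
    }
}
```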
  • Confluent Control Centre & ksqlDB Dec 8 2020 10:15 pm UTC 32 mins
    Ala Alsharif, Confluent
    Join Ala Alsharif of Confluent for this jam-packed technology-in-practice session, in which you will experience:
    • A demo of Confluent Control Centre
    • A demo of ksqlDB
    • An insight into the ease with which you can build event streaming applications
    • An overview of Confluent’s stream processing capability (see the sketch below)
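    The session is a demo rather than code, but for a sense of how ksqlDB is used programmatically, here is a minimal sketch using the ksqlDB Java client to run a push query. The server address and the ORDERS stream are assumptions; queries can equally be issued from the ksqlDB CLI or the Control Centre UI.

```java
import io.confluent.ksql.api.client.Client;
import io.confluent.ksql.api.client.ClientOptions;
import io.confluent.ksql.api.client.Row;
import io.confluent.ksql.api.client.StreamedQueryResult;

public class KsqlDbPushQuery {
    public static void main(String[] args) throws Exception {
        // Assumes a ksqlDB server reachable on localhost:8088 and a stream
        // named ORDERS already declared over a Kafka topic
        ClientOptions options = ClientOptions.create().setHost("localhost").setPort(8088);
        Client client = Client.create(options);

        // A push query: results stream continuously as new events arrive
        StreamedQueryResult result =
            client.streamQuery("SELECT * FROM ORDERS EMIT CHANGES;").get();

        for (int i = 0; i < 10; i++) {
            Row row = result.poll();   // blocks until the next row arrives
            System.out.println(row.values());
        }
        client.close();
    }
}
```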
  • Using Confluent & Google Apigee to Enable your Events on Hybrid Deployment Model Dec 2 2020 3:00 pm UTC 58 mins
    Dan Croft, Confluent & Mathilde Fabre, Google Cloud Apigee
    Today’s customers expect channels, whether physical or digital, to blend together into a cohesive connected experience. APIs help enterprises not only to meet customer expectations, but also to participate in software ecosystems and provide unprecedented opportunities for reach and economies of scale. In this online talk hosted by Google Apigee and Confluent, you will learn how APIs allow you to streamline, secure, and monetize access to your data and services to deliver a cohesive experience. You will also see just how easy it is to integrate the Confluent platform with Google Apigee (a small sketch follows the list below).

    Register now to learn more about:
    • The challenges CIOs face when it comes to getting value out of data
    • How to simplify the capture, generation, and consumption of data in a secure and monitored way
    • How to deal with the exponentially growing volume of data, devices, and systems
    • Why Apigee offers a reliable and scalable API management platform
    • How APIs touch every stage of the digital journey
    • The value of having a layer of microservices that allows for agile development
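    The talk itself is architecture-level, but one common integration pattern is to put an API management layer such as Apigee in front of Kafka's HTTP surface. The sketch below produces a record over HTTP via Confluent REST Proxy; the proxy URL and topic name are assumptions, and in an Apigee deployment the request would target the Apigee-managed endpoint instead.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class RestProxyProduce {
    public static void main(String[] args) throws Exception {
        // Confluent REST Proxy v2 produce request; in an Apigee setup the URI
        // would be the managed API endpoint fronting this proxy
        String body = "{\"records\":[{\"value\":{\"orderId\":42,\"amount\":9.99}}]}";

        HttpRequest request = HttpRequest.newBuilder()
            .uri(URI.create("http://localhost:8082/topics/orders"))
            .header("Content-Type", "application/vnd.kafka.json.v2+json")
            .POST(HttpRequest.BodyPublishers.ofString(body))
            .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
            .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode() + " " + response.body());
    }
}
```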
  • Confluent Control Centre & ksqlDB Nov 25 2020 12:15 pm UTC 32 mins
    Ala Alsharif, Confluent
    Repeat session; see the Dec 8 listing above for the full description.
  • Stream me to the Cloud (and back) with Confluent & MongoDB Nov 19 2020 1:15 pm UTC 65 mins
    Gianluca Natali, Confluent & Felix Reichenbach, MongoDB
    Repeat session; see the Dec 17 listing above for the full abstract.
  • Using Confluent & Google Apigee to Enable your Events on Hybrid Deployment Model Nov 10 2020 4:00 pm UTC 58 mins
    Dan Croft, Confluent & Mathilde Fabre, Google Cloud Apigee
    Repeat session; see the Dec 2 listing above for the full description.
  • Stream me to the Cloud (and back) with Confluent & MongoDB Oct 18 2020 8:00 am UTC 65 mins
    Gianluca Natali, Confluent & Felix Reichenbach, MongoDB
    Repeat session; see the Dec 17 listing above for the full abstract.
  • Confluent Control Centre & ksqlDB Oct 13 2020 1:00 pm UTC 32 mins
    Ala Alsharif, Confluent
    Repeat session; see the Dec 8 listing above for the full description.
  • Using Confluent & Google Apigee to Enable your Events on Hybrid Deployment Model Oct 13 2020 9:00 am UTC 58 mins
    Dan Croft, Confluent & Mathilde Fabre, Google Cloud Apigee
    Repeat session; see the Dec 2 listing above for the full description.
  • Stream me to the Cloud (and back) with Confluent & MongoDB Oct 1 2020 1:30 pm UTC 65 mins
    Gianluca Natali, Confluent & Felix Reichenbach, MongoDB
    Repeat session; see the Dec 17 listing above for the full abstract.
  • Stream me to the Cloud (and back) with Confluent & MongoDB Sep 24 2020 3:00 pm UTC 65 mins
    Gianluca Natali, Confluent & Felix Reichenbach, MongoDB
    Repeat session; see the Dec 17 listing above for the full abstract.
  • An Introduction to KSQL & Kafka Streams Processing with Ticketmaster Sep 24 2020 5:00 am UTC 63 mins
    Dani Traphagen, Sr. Systems Engineer, Confluent + Chris Smith, VP Engineering Data Science, Ticketmaster
    In this all-too-fabulous talk with Ticketmaster, we will be addressing the wonderful new world of KSQL vs. KStreams (a side-by-side sketch follows the list below).

    If you are new-ish to Apache Kafka® you may ask yourself, “What is a large Apache Kafka deployment?” And you may tell yourself, “This is not my beautiful KSQL use case!” And you may tell yourself, “This is not my beautiful KStreams use case!” And you may ask yourself, “What is a beautiful Apache Kafka use case?” And you may ask yourself, “Am I right about this architecture? Am I wrong?” And you may say to yourself, “My God! What have I done?”

    In this session, we’re going to delve into all these issues and more with Chris Smith, VP of Engineering Data Science at Ticketmaster.

    Watch now to learn:
    -Ticketmaster Apache Kafka Architecture
    -KSQL Architecture and Use Cases
    -KSQL Performance Considerations
    -When to KSQL and When to Live the KStream
    -How Ticketmaster uses KSQL and KStreams in production to reduce development friction in machine learning products
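    To make the KSQL-or-KStreams choice concrete, here is a small sketch of the same aggregation both ways: the ksqlDB form in a comment, and the Kafka Streams form in Java. Topic names and the venue key are hypothetical, not Ticketmaster's actual pipeline.

```java
import java.util.Properties;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.Consumed;
import org.apache.kafka.streams.kstream.KTable;
import org.apache.kafka.streams.kstream.Produced;

public class SalesByVenue {
    public static void main(String[] args) {
        // The same logic in ksqlDB, run server-side with no app to deploy:
        //   CREATE TABLE SALES_BY_VENUE AS
        //     SELECT VENUE, COUNT(*) AS SALES
        //     FROM TICKET_SALES GROUP BY VENUE EMIT CHANGES;
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "sales-by-venue");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");

        StreamsBuilder builder = new StreamsBuilder();
        // Assumes messages are keyed by venue; Kafka Streams gives you full
        // JVM control (custom serdes, arbitrary logic, external lookups)
        KTable<String, Long> salesByVenue = builder
            .stream("ticket-sales", Consumed.with(Serdes.String(), Serdes.String()))
            .groupByKey()
            .count();
        salesByVenue.toStream()
            .to("sales-by-venue", Produced.with(Serdes.String(), Serdes.Long()));

        new KafkaStreams(builder.build(), props).start();
    }
}
```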
  • Confluent Control Centre & ksqlDB Sep 20 2020 12:00 pm UTC 32 mins
    Ala Alsharif, Confluent
    Repeat session; see the Dec 8 listing above for the full description.
  • End-to-End Integration from the IoT Edge to Confluent Cloud Recorded: Sep 16 2020 27 mins
    Kai Waehner, Technology Evangelist, Confluent + Konstantin Karantasis, Software Engineer, Confluent
    This interactive whiteboard presentation discusses use cases leveraging the Apache Kafka® open source ecosystem as a streaming platform to process IoT data. The session shows architectural alternatives of how devices like cars, machines or mobile devices connect to Apache Kafka via IoT standards like MQTT or OPC-UA.

    Learn how to analyze the IoT data either natively on Apache Kafka with Kafka Streams / KSQL, or with other tools leveraging Kafka Connect. Kai Waehner will also discuss the benefits of Confluent Cloud and other tools like Confluent Replicator or MQTT Proxy to build bidirectional real-time integration from the edge to the cloud (a small sketch of the on-Kafka analysis step follows the list below).

    Watch now to:
    -Understand end-to-end use cases from different industries where you integrate IoT devices with enterprise IT using open source technologies and standards
    -See how Apache Kafka enables bidirectional end-to-end integration processing from IoT data to various backend applications in the cloud
    -Compare different architectural alternatives and see their benefits and caveats
    -Learn about various standards, APIs and tools of integrating and processing IoT data with different open source components of the Apache Kafka ecosystem
    -Understand the benefits of Confluent Cloud, which provides a highly available and scalable Apache Kafka ecosystem as a managed service
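    For a taste of analyzing IoT data natively with Kafka Streams, here is a minimal windowed aggregation counting readings per sensor over five-minute windows. It assumes device data already lands in a hypothetical sensor-readings topic (for example via the MQTT connector or MQTT Proxy), keyed by sensor ID.

```java
import java.time.Duration;
import java.util.Properties;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.Consumed;
import org.apache.kafka.streams.kstream.TimeWindows;

public class SensorWindowCounts {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "sensor-window-counts");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");

        StreamsBuilder builder = new StreamsBuilder();
        builder.stream("sensor-readings", Consumed.with(Serdes.String(), Serdes.String()))
            .groupByKey()
            .windowedBy(TimeWindows.ofSizeWithNoGrace(Duration.ofMinutes(5)))
            .count()
            .toStream()
            // Print instead of producing, to keep the sketch free of windowed serdes
            .foreach((windowedSensorId, count) ->
                System.out.printf("%s @ %s -> %d readings%n",
                    windowedSensorId.key(), windowedSensorId.window().startTime(), count));

        new KafkaStreams(builder.build(), props).start();
    }
}
```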
  • Stream me to the Cloud (and back) with Confluent & MongoDB Recorded: Sep 16 2020 65 mins
    Gianluca Natali, Confluent & Felix Reichenbach, MongoDB
    Repeat session; see the Dec 17 listing above for the full abstract.
  • Using Confluent & Google Apigee to Enable your Events on Hybrid Deployment Model Recorded: Sep 15 2020 58 mins
    Dan Croft, Confluent & Mathilde Fabre, Google Cloud Apigee
    Repeat session; see the Dec 2 listing above for the full description.
  • Confluent Control Centre & ksqlDB Recorded: Sep 10 2020 32 mins
    Ala Alsharif, Confluent
    Repeat session; see the Dec 8 listing above for the full description.
  • Unleashing Apache Kafka and TensorFlow in the Cloud Recorded: Sep 10 2020 59 mins
    Kai Waehner, Technology Evangelist, Confluent
    In this online talk, Technology Evangelist Kai Waehner will discuss and demo how you can leverage technologies such as TensorFlow with your Kafka deployments to build a scalable, mission-critical machine learning infrastructure for ingesting, preprocessing, training, deploying and monitoring analytic models.

    He will explain challenges and best practices for building a scalable infrastructure for machine learning using Confluent Cloud on Google Cloud Platform (GCP), Confluent Cloud on AWS and on-premise deployments.

    The discussed architecture will include capabilities like scalable data preprocessing for training and predictions, combination of different deep learning frameworks, data replication between data centers, intelligent real-time microservices running on Kubernetes, and local deployment of analytic models for offline predictions (a minimal model-scoring sketch follows the list below).

    Join us to learn about the following:
    -Extreme scalability and unique features of Confluent Cloud
    -Building and deploying analytic models using TensorFlow, Confluent Cloud and GCP components such as Google Storage, Google ML Engine, Google Cloud AutoML and Google Kubernetes Engine in a hybrid cloud environment
    -Leveraging the Kafka ecosystem and Confluent Platform in hybrid infrastructures
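    As a minimal illustration of model scoring inside a streaming pipeline, the sketch below applies a prediction function to each event in a Kafka Streams topology. The score() method is a hypothetical stand-in for invoking a pre-trained TensorFlow model (for example, one loaded with TensorFlow Java's SavedModelBundle); the topic names are assumptions.

```java
import java.util.Properties;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.Consumed;
import org.apache.kafka.streams.kstream.Produced;

public class StreamingModelScorer {

    // Hypothetical stand-in for TensorFlow inference; a real implementation
    // would feed the event into a loaded model and read back the prediction
    static double score(String event) {
        return 0.5; // placeholder prediction
    }

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "streaming-model-scorer");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");

        StreamsBuilder builder = new StreamsBuilder();
        builder.stream("payments", Consumed.with(Serdes.String(), Serdes.String()))
            // Enrich each event with the model's prediction, e.g. a fraud score
            .mapValues(value -> value + ",score=" + score(value))
            .to("payments-scored", Produced.with(Serdes.String(), Serdes.String()));

        new KafkaStreams(builder.build(), props).start();
    }
}
```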
  • How Apache Kafka® Works Recorded: Sep 9 2020 62 mins
    Michael Bingham, Technical Trainer, Confluent
    Pick up best practices for developing applications that use Apache Kafka, beginning with a high level code overview for a basic producer and consumer. From there we’ll cover strategies for building powerful stream processing applications, including high availability through replication, data retention policies, producer design and producer guarantees.

    We’ll delve into the details of delivery guarantees, including exactly-once semantics, partition strategies and consumer group rebalances. The talk will finish with a discussion of compacted topics, troubleshooting strategies and a security overview (a minimal producer sketch follows below).

    This session is part 3 of 4 in our Fundamentals for Apache Kafka series.
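    To ground the producer-guarantees discussion, here is a minimal Java producer configured for strong delivery guarantees: acks=all waits for all in-sync replicas, and idempotence deduplicates retries broker-side. The broker address and topic name are assumptions.

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class GuaranteedProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        // Producer guarantees discussed in the session: wait for all in-sync
        // replicas to acknowledge, and deduplicate retries broker-side
        props.put(ProducerConfig.ACKS_CONFIG, "all");
        props.put(ProducerConfig.ENABLE_IDEMPOTENCE_CONFIG, true);

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("demo-topic", "key", "value"),
                (metadata, exception) -> {
                    if (exception != null) {
                        exception.printStackTrace();
                    } else {
                        System.out.printf("Written to %s-%d@%d%n",
                            metadata.topic(), metadata.partition(), metadata.offset());
                    }
                });
        }
    }
}
```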
  • Stream me to the Cloud (and back) with Confluent & MongoDB Recorded: Sep 8 2020 65 mins
    Gianluca Natali, Confluent & Felix Reichenbach, MongoDB
    Repeat session; see the Dec 17 listing above for the full abstract.
We provide a central nervous system for streaming real-time data.
Confluent, founded by the creators of open source Apache Kafka®, provides the leading streaming platform that enables enterprises to maximize the value of data. Confluent Platform empowers leaders in industries such as retail, logistics, manufacturing, financial services, technology and media, to move data from isolated systems into a real-time data pipeline where they can act on it immediately.

Backed by Benchmark, Index Ventures and Sequoia, Confluent is based in Palo Alto, California. To learn more, please visit www.confluent.io.

  • Title: On Track with Apache Kafka®: Building a Streaming ETL Solution with Rail Data
  • Live at: May 29 2020 10:00 pm
  • Presented by: Robin Moffatt, Developer Advocate, Confluent