
Spring Kafka Beyond the Basics - Lessons learned on our Kafka Journey at ING Bank

In this talk, Tim van Baarsen will take you on a journey beyond the basics of Spring Kafka and will share his knowledge, pitfalls and lessons learned from real-life Kafka projects that have been running in production for many years at ING Bank in the Netherlands.
Recorded Jun 9 2020 57 mins
Presented by
Tim van Baarsen

  • Apache Kafka Architecture & Fundamentals Explained Aug 26 2020 9:00 pm UTC 57 mins
    Joe Desmond, Technical Trainer, Confluent
    This session explains Apache Kafka’s internal design and architecture. Companies like LinkedIn are now sending more than 1 trillion messages per day to Apache Kafka. Learn about the underlying design in Kafka that leads to such high throughput.

    This talk provides a comprehensive overview of Kafka architecture and internal functions, including:
    -Topics, partitions and segments
    -The commit log and streams
    -Brokers and broker replication
    -Producer basics
    -Consumers, consumer groups and offsets

    This session is part 2 of 4 in our Fundamentals for Apache Kafka series.
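As a companion to the partitioning concepts in the outline above, here is a minimal Python sketch of how a keyed record is mapped to a partition. It is illustrative only: Kafka's actual default partitioner hashes keys with murmur2, while this sketch substitutes a CRC32 hash for clarity.

```python
# Illustrative sketch of how Kafka maps a keyed record to a partition.
# Kafka's real default partitioner uses murmur2 hashing; zlib.crc32 is
# used here only as a stand-in deterministic hash.
import zlib

def partition_for(key: bytes, num_partitions: int) -> int:
    """Deterministically map a record key to one of the topic's partitions."""
    return zlib.crc32(key) % num_partitions

# The same key always maps to the same partition.
p1 = partition_for(b"customer-42", 6)
p2 = partition_for(b"customer-42", 6)
assert p1 == p2 and 0 <= p1 < 6
```

Because the mapping is a pure function of the key, all records with the same key land on the same partition, which is what gives Kafka its per-key ordering guarantee.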
  • Bosch Power Tools Enables Real-time Analytics on IoT Event Streams Aug 25 2020 11:30 pm UTC 43 mins
    Ralph Debusmann, IoT Solution Architect, Bosch Power Tools + Jeff Bean, Confluent + Igor Canadi, Rockset
    Bosch Power Tools, a division of The Bosch Group, is among the world’s leading providers of power tools, power tool accessories, and measuring tools. To accelerate its IoT initiatives, Bosch uses event streaming with manufacturing data from IoT sensor devices to monitor and respond to inefficiencies in the production system, changes in the supply chain, and product quality reports in the field. In this online talk, Bosch’s Ralph Debusmann outlines their architectural vision for bringing many data streams into a single platform, surrounded by databases that can power complex real-time analytics.
  • How Apache Kafka® Works Aug 24 2020 10:00 pm UTC 62 mins
    Michael Bingham, Technical Trainer, Confluent
Pick up best practices for developing applications that use Apache Kafka, beginning with a high-level code overview of a basic producer and consumer. From there we’ll cover strategies for building powerful stream processing applications, including high availability through replication, data retention policies, producer design and producer guarantees.

    We’ll delve into the details of delivery guarantees, including exactly-once semantics, partition strategies and consumer group rebalances. The talk will finish with a discussion of compacted topics, troubleshooting strategies and a security overview.

    This session is part 3 of 4 in our Fundamentals for Apache Kafka series.
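To make the consumer-group mechanics mentioned above concrete, here is a simplified Python sketch of a range-style partition assignment. The real group coordinator and assignors handle much more (rebalances, subscriptions, stickiness), so treat this purely as an illustration of the idea.

```python
# Minimal sketch of range-style partition assignment in a consumer group.
# Each partition is owned by exactly one consumer in the group; any extra
# partitions go to the first consumers in sorted member order.
def range_assign(partitions: list[int], members: list[str]) -> dict[str, list[int]]:
    members = sorted(members)
    per_member, extra = divmod(len(partitions), len(members))
    assignment, start = {}, 0
    for i, member in enumerate(members):
        count = per_member + (1 if i < extra else 0)
        assignment[member] = partitions[start:start + count]
        start += count
    return assignment

# 6 partitions shared by 2 consumers -> 3 each.
print(range_assign(list(range(6)), ["consumer-a", "consumer-b"]))
# -> {'consumer-a': [0, 1, 2], 'consumer-b': [3, 4, 5]}
```

Adding a third consumer to this group would trigger a rebalance and shrink each member's share, which is why consumer groups are Kafka's unit of horizontal scaling on the read side.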
  • Being an Apache Kafka® Expert in a Multi-Cloud World Aug 20 2020 7:00 pm UTC 30 mins
    Ricardo Ferreira, Developer Advocate, Confluent
Apache Kafka is an amazing piece of technology that has been rapidly adopted by companies all around the world to implement event-driven architectures. While its adoption continues to increase, the reality is that most developers often complain about the complexity of managing the clusters themselves, which seriously decreases their ability to be agile.

    This 30-minute demo will introduce Confluent Cloud, a service that offers Apache Kafka and the Confluent Platform so developers can focus on what they do best: the coding part. We will show you how to quickly reuse code written for standard Kafka APIs to connect to Confluent Cloud and how an event-streaming application is built and deployed.
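Reusing code written for standard Kafka APIs against a hosted cluster is mostly a matter of client configuration. Below is a hedged sketch of the settings typically involved: the bootstrap server and API key/secret are placeholders, and the property names shown are those used by librdkafka-based clients such as confluent-kafka-python.

```python
# Sketch of the client configuration typically needed to point a standard
# Kafka client at a hosted cluster such as Confluent Cloud.
# Endpoint and credentials below are placeholders, not real values.
cloud_config = {
    # Hosted endpoints generally require TLS plus SASL/PLAIN with API keys.
    "bootstrap.servers": "pkc-XXXXX.region.provider.confluent.cloud:9092",
    "security.protocol": "SASL_SSL",
    "sasl.mechanisms": "PLAIN",
    "sasl.username": "<API_KEY>",
    "sasl.password": "<API_SECRET>",
}

# With the confluent-kafka package installed, the same producer code you
# wrote for a self-managed cluster works unchanged:
#   from confluent_kafka import Producer
#   producer = Producer(cloud_config)
#   producer.produce("my-topic", key=b"k", value=b"v")
```

The point of the demo is exactly this: the application code stays the same, and only the connection and authentication properties change.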
  • Enabling Event Streaming at AO.com Aug 20 2020 10:00 am UTC 50 mins
    Jon Vines, AO.com
    Learn how AO.com are enabling real-time event-driven applications to improve customer experience using Confluent Platform.
    The introduction of Apache Kafka and the Confluent Platform is supporting AO.com in modernizing its technical approach to delighting its customers. A key part of this enablement is the introduction of an event-streaming ecosystem enabling event-driven applications and architecture.
    Throughout this session, we’ll look at the challenges AO.com faced when looking to adopt Kafka, their use of Confluent Platform including Kafka Connect and KSQL and the adoption of Confluent Cloud. We’ll look at the first steps, where the team are at now and what the future looks like.
  • Journey to the event-driven business with Apache Kafka Aug 18 2020 12:00 pm UTC 60 mins
    Ala Al-sharif
    According to Gartner, “By 2020 event sourced, real-time situational awareness will be a required characteristic for 80% of digital business solutions. And 80% of new business ecosystems will require support for event processing.”

    In this webinar Whishworks and Confluent discuss why event streaming has become so important for business success and what it takes to become an event-driven organisation:

    Why companies are adopting Real Time and Event Streaming
    - Business Digitisation Trends - modernising data architecture
    - Enabling new outcomes

    The journey to become an event-driven business with Kafka
    - 5 steps to the event-driven business
    - Enterprise scale event streaming
    - Use cases across industries
  • Enabling Smarter Cities and Connected Vehicles with Apache Kafka Aug 14 2020 10:00 pm UTC 57 mins
    Kai Waehner, Technology Evangelist, Confluent + Rob Cowert, Systems Engineer, Confluent
    Many cities are investing in technologies that transform them into smart city environments in which data collection and analysis are used to manage assets and resources efficiently. Modern technology can help connect the right data, at the right time, to the right people, processes and systems. Innovations around smart cities and the Internet of Things give cities the ability to improve motor safety, unify and manage transportation systems and traffic, save energy and provide a better experience for residents.

    By utilizing an event streaming platform, like Confluent, cities are able to process data in real-time from thousands of sources, such as sensors. By aggregating that data and analyzing real-time data streams, more informed decisions can be made and fine-tuned operations developed for a positive impact on everyday challenges faced by cities.

    Watch this webinar to learn how to:
    -Overcome challenges for building a smarter city
    -Build a real time infrastructure to correlate relevant events
    -Connect thousands of devices, machines, and people
    -Leverage open source and fully-managed solutions from the Apache Kafka ecosystem
  • Bosch Power Tools Enables Real-time Analytics on IoT Event Streams Aug 12 2020 6:15 pm UTC 43 mins
    Ralph Debusmann, IoT Solution Architect, Bosch Power Tools + Jeff Bean, Confluent + Igor Canadi, Rockset
    Bosch Power Tools, a division of The Bosch Group, is among the world’s leading providers of power tools, power tool accessories, and measuring tools. To accelerate its IoT initiatives, Bosch uses event streaming with manufacturing data from IoT sensor devices to monitor and respond to inefficiencies in the production system, changes in the supply chain, and product quality reports in the field. In this online talk, Bosch’s Ralph Debusmann outlines their architectural vision for bringing many data streams into a single platform, surrounded by databases that can power complex real-time analytics.
  • How Apache Kafka® Works Aug 10 2020 8:00 pm UTC 62 mins
    Michael Bingham, Technical Trainer, Confluent
    Pick up best practices for developing applications that use Apache Kafka, beginning with a high-level code overview of a basic producer and consumer. From there we’ll cover strategies for building powerful stream processing applications, including high availability through replication, data retention policies, producer design and producer guarantees.

    We’ll delve into the details of delivery guarantees, including exactly-once semantics, partition strategies and consumer group rebalances. The talk will finish with a discussion of compacted topics, troubleshooting strategies and a security overview.

    This session is part 3 of 4 in our Fundamentals for Apache Kafka series.
  • Elastically Scaling Kafka Using Confluent Aug 7 2020 5:00 pm UTC 60 mins
    Josh Rosenberg, Group Product Marketing, Confluent + Ricardo Ferreira, Developer Advocate, Confluent
    The number of mission-critical apps and the amount of data underpinning them have grown exponentially in the increasingly digital world - with no sign of slowing down. But rigid data architectures slow organizations down, forcing them to spend too much up front on resources they don’t use and causing lag or downtime across their apps. In today’s hypercompetitive digital world, where customer loyalty hinges on the best-performing applications, every millisecond of delay could mean another lost customer.

    Adjusting to the real-time needs of your mission-critical apps is only possible with an architecture that scales elastically. Confluent re-engineered Apache Kafka into an elastically scalable, next-gen event streaming platform that processes real-time data wherever it lives - making it accessible for any budget or use case.

    Register now to learn how to:
    -Quickly deploy Kafka in Confluent Cloud with just a few clicks and elastically scale your workloads
    -Effortlessly connect your critical data sources and sinks to Kafka to build a complete data pipeline for your real-time apps
    -Easily process streaming data with a simple interactive SQL interface using fully-managed KSQL
    -Accelerate the deployment of standardized, self-managed Kafka clusters as cloud-native systems on Kubernetes to achieve elastic scale with Confluent Platform
  • Journey to the event-driven business with Apache Kafka Aug 5 2020 1:00 pm UTC 60 mins
    Ala Al-sharif
    According to Gartner, “By 2020 event sourced, real-time situational awareness will be a required characteristic for 80% of digital business solutions. And 80% of new business ecosystems will require support for event processing.”

    In this webinar Whishworks and Confluent discuss why event streaming has become so important for business success and what it takes to become an event-driven organisation:

    Why companies are adopting Real Time and Event Streaming
    - Business Digitisation Trends - modernising data architecture
    - Enabling new outcomes

    The journey to become an event-driven business with Kafka
    - 5 steps to the event-driven business
    - Enterprise scale event streaming
    - Use cases across industries
  • Data-Centric Security, Governance and Encryption for Apache Kafka at Scale Recorded: Aug 3 2020 46 mins
    Ala Al-sharif, Sales Engineer, Confluent
    As businesses unite around Apache Kafka® as the single source of truth and Confluent Platform as the organisation's central nervous system, enabling data-centric security and governance features becomes crucial. Don’t miss ‘Data Security, Governance & Encryption at Scale’, an online talk in which speakers from Confluent, SecuPi and Marionete will discuss the things you must absolutely get right for data protection and privacy when using Apache Kafka and Confluent KSQL.

    Watch to learn more about:

    1. Must-have data-centric security and GDPR/CCPA privacy requirements of streaming large-scale, near-real-time (NRT) data
    2. How to classify, monitor, audit and encrypt (with Hold Your Own Key - HYOK), mask or filter sensitive data flows for Kafka and Confluent KSQL using the SecuPi Data-Centric Platform
    3. How Confluent Platform and SecuPi integrate with Marionete expert services for end-to-end data protection and privacy compliance for Kafka implementations
  • Event Streaming in the Telco Industry with Apache Kafka and Confluent Recorded: Jul 31 2020 60 mins
    Kai Waehner
    Real-time data streaming is a hot topic in the Telecommunications Industry. As telecommunications companies strive to offer high speed, integrated networks with reduced connection times, connect countless devices at reduced latency, and transform the digital experience worldwide, more and more companies are turning to Apache Kafka’s data stream processing solutions to deliver a scalable, real-time infrastructure for OSS and BSS scenarios. Enabling a combination of on-premise data centers, edge processing, and multi-cloud architectures is becoming the new normal in the Telco Industry. This combination is enabling accelerated growth from value-added services delivered over mobile networks.

    Join Kai Waehner, Technology Evangelist at Confluent, for this session which explores various telecommunications use cases, including data integration, infrastructure monitoring, data distribution, data processing and business applications. Different architectures and components from the Kafka ecosystem are also discussed.

    Join this online talk to learn how to:
    - Overcome challenges for building a modern hybrid telco infrastructure
    - Build a real time infrastructure to correlate relevant events
    - Connect thousands of devices, networks, infrastructures, and people
    - Work together with different companies, organisations and business models
    - Leverage open source and fully managed solutions from the Apache Kafka ecosystem, Confluent Platform and Confluent Cloud
  • Elastically Scaling Kafka Using Confluent Recorded: Jul 30 2020 60 mins
    Josh Rosenberg, Group Product Marketing, Confluent + Ricardo Ferreira, Developer Advocate, Confluent
    The number of mission-critical apps and the amount of data underpinning them have grown exponentially in the increasingly digital world - with no sign of slowing down. But rigid data architectures slow organizations down, forcing them to spend too much up front on resources they don’t use and causing lag or downtime across their apps. In today’s hypercompetitive digital world, where customer loyalty hinges on the best-performing applications, every millisecond of delay could mean another lost customer.

    Adjusting to the real-time needs of your mission-critical apps is only possible with an architecture that scales elastically. Confluent re-engineered Apache Kafka into an elastically scalable, next-gen event streaming platform that processes real-time data wherever it lives - making it accessible for any budget or use case.

    Register now to learn how to:
    -Quickly deploy Kafka in Confluent Cloud with just a few clicks and elastically scale your workloads
    -Effortlessly connect your critical data sources and sinks to Kafka to build a complete data pipeline for your real-time apps
    -Easily process streaming data with a simple interactive SQL interface using fully-managed KSQL
    -Accelerate the deployment of standardized, self-managed Kafka clusters as cloud-native systems on Kubernetes to achieve elastic scale with Confluent Platform
  • Introducing Events and Stream Processing into Nationwide Building Society Recorded: Jul 22 2020 49 mins
    Rob Jackson, Head of Application Architecture at Nationwide Building Society
    Open Banking regulations compel the UK’s largest banks and building societies to enable their customers to share personal information securely with other regulated companies. As a result, companies such as Nationwide Building Society are re-architecting their processes and infrastructure around customer needs to reduce the risk of losing relevance and the ability to innovate.

    In this online talk, you will learn why, when facing Open Banking regulation and rapidly increasing transaction volumes, Nationwide decided to take load off their back-end systems through real-time streaming of data changes into Apache Kafka®. You will hear how Nationwide started their journey with Apache Kafka®, beginning with the initial use case of creating a real-time data cache using Change Data Capture, Confluent Platform and Microservices. Rob Jackson, Head of Application Architecture, will also cover how Confluent enabled Nationwide to build the stream processing backbone that is being used to re-engineer the entire banking experience including online banking, payment processing and mortgage applications.
  • Event Streaming in the Telco Industry with Apache Kafka and Confluent Recorded: Jul 8 2020 60 mins
    Kai Waehner
    Real-time data streaming is a hot topic in the Telecommunications Industry. As telecommunications companies strive to offer high speed, integrated networks with reduced connection times, connect countless devices at reduced latency, and transform the digital experience worldwide, more and more companies are turning to Apache Kafka’s data stream processing solutions to deliver a scalable, real-time infrastructure for OSS and BSS scenarios. Enabling a combination of on-premise data centers, edge processing, and multi-cloud architectures is becoming the new normal in the Telco Industry. This combination is enabling accelerated growth from value-added services delivered over mobile networks.

    Join Kai Waehner, Technology Evangelist at Confluent, for this session which explores various telecommunications use cases, including data integration, infrastructure monitoring, data distribution, data processing and business applications. Different architectures and components from the Kafka ecosystem are also discussed.

    Join this online talk to learn how to:
    - Overcome challenges for building a modern hybrid telco infrastructure
    - Build a real time infrastructure to correlate relevant events
    - Connect thousands of devices, networks, infrastructures, and people
    - Work together with different companies, organisations and business models
    - Leverage open source and fully managed solutions from the Apache Kafka ecosystem, Confluent Platform and Confluent Cloud
  • Data-Centric Security, Governance and Encryption for Apache Kafka at Scale Recorded: Jun 25 2020 46 mins
    Ala Al-sharif, Sales Engineer, Confluent
    As businesses unite around Apache Kafka® as the single source of truth and Confluent Platform as the organisation's central nervous system, enabling data-centric security and governance features becomes crucial. Don’t miss ‘Data Security, Governance & Encryption at Scale’, an online talk in which speakers from Confluent, SecuPi and Marionete will discuss the things you must absolutely get right for data protection and privacy when using Apache Kafka and Confluent KSQL.

    Watch to learn more about:

    1. Must-have data-centric security and GDPR/CCPA privacy requirements of streaming large-scale, near-real-time (NRT) data
    2. How to classify, monitor, audit and encrypt (with Hold Your Own Key - HYOK), mask or filter sensitive data flows for Kafka and Confluent KSQL using the SecuPi Data-Centric Platform
    3. How Confluent Platform and SecuPi integrate with Marionete expert services for end-to-end data protection and privacy compliance for Kafka implementations
  • Journey to the event-driven business with Apache Kafka Recorded: Jun 17 2020 60 mins
    Ala Al-sharif
    According to Gartner, “By 2020 event sourced, real-time situational awareness will be a required characteristic for 80% of digital business solutions. And 80% of new business ecosystems will require support for event processing.”

    In this webinar Whishworks and Confluent discuss why event streaming has become so important for business success and what it takes to become an event-driven organisation:

    Why companies are adopting Real Time and Event Streaming
    - Business Digitisation Trends - modernising data architecture
    - Enabling new outcomes

    The journey to become an event-driven business with Kafka
    - 5 steps to the event-driven business
    - Enterprise scale event streaming
    - Use cases across industries
  • Spring Kafka Beyond the Basics - Lessons learned on our Kafka Journey at ING Bank Recorded: Jun 9 2020 57 mins
    Tim van Baarsen
    In this talk, Tim van Baarsen will take you on a journey beyond the basics of Spring Kafka and will share his knowledge, pitfalls and lessons learned from real-life Kafka projects that have been running in production for many years at ING Bank in the Netherlands.
  • On Track with Apache Kafka®: Building a Streaming ETL Solution with Rail Data Recorded: May 29 2020 58 mins
    Robin Moffatt, Developer Advocate, Confluent
    As data engineers, we frequently need to build scalable systems working with data from a variety of sources and with various ingest rates, sizes, and formats. This talk takes an in-depth look at how Apache Kafka can be used to provide a common platform on which to build data infrastructure driving both real-time analytics as well as event-driven applications.

    Using a public feed of railway data it will show how to ingest data from message queues such as ActiveMQ with Kafka Connect, as well as from static sources such as S3 and REST endpoints. We'll then see how to use stream processing to transform the data into a form useful for streaming to analytics in tools such as Elasticsearch and Neo4j. The same data will be used to drive a real-time notifications service through Telegram.

    If you're wondering how to build your next scalable data platform, how to reconcile the impedance mismatch between stream and batch, and how to wrangle streams of data—this talk is for you!
We provide a central nervous system for streaming real-time data.
Confluent, founded by the creators of open source Apache Kafka®, provides the leading streaming platform that enables enterprises to maximize the value of data. Confluent Platform empowers leaders in industries such as retail, logistics, manufacturing, financial services, technology and media, to move data from isolated systems into a real-time data pipeline where they can act on it immediately.

Backed by Benchmark, Index Ventures and Sequoia, Confluent is based in Palo Alto, California. To learn more, please visit www.confluent.io.

  • Title: Spring Kafka Beyond the Basics - Lessons learned on our Kafka Journey at ING Bank
  • Live at: Jun 9 2020 10:00 pm
  • Presented by: Tim van Baarsen