
What’s New in Confluent Platform 5.5

Join the Confluent Product Marketing team as we provide an overview of Confluent Platform 5.5, which makes Apache Kafka and event streaming more broadly accessible to developers with enhancements to data compatibility, multi-language development, and ksqlDB.

Building an event-driven architecture with Apache Kafka allows you to transition from traditional silos and monolithic applications to modern microservices and event streaming applications. With these benefits has come an increased demand for Kafka developers from a wide range of industries. The Dice Tech Salary Report recently ranked Kafka as the highest-paid technology skill of 2019, a year after ranking it second.

With Confluent Platform 5.5, we are making it even simpler for developers to connect to Kafka and start building event streaming applications, regardless of their preferred programming languages or the underlying data formats used in their applications.

This session will cover the key features of this latest release, including:
- Support for Protobuf and JSON schemas in Confluent Schema Registry and throughout our entire platform
- Exactly-once semantics for non-Java clients
- Admin functions in REST Proxy (preview)
- ksqlDB 0.7 and ksqlDB Flow View in Confluent Control Center
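To make the first bullet concrete, here is a minimal sketch of the request body Schema Registry expects when registering a non-Avro schema: as of 5.5, the schemaType field selects "AVRO", "JSON", or "PROTOBUF". The order schema below is hypothetical, and the snippet only builds the payload rather than calling a live registry.

```python
import json

def registration_payload(schema: dict, schema_type: str = "JSON") -> str:
    """Build the body for POST /subjects/<subject>-value/versions.

    Prior to Confluent Platform 5.5 the registry accepted only Avro;
    the schemaType field now selects "AVRO", "JSON", or "PROTOBUF".
    """
    return json.dumps({"schema": json.dumps(schema), "schemaType": schema_type})

# Hypothetical JSON Schema for an order event
order_schema = {
    "type": "object",
    "properties": {"id": {"type": "string"}, "amount": {"type": "number"}},
    "required": ["id"],
}

print(registration_payload(order_schema))
```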
Recorded: May 7 2020 39 mins
Presented by
Nick Bryan, Product Marketing Manager, Confluent + David Araujo, Sr. Product Manager, Confluent

  • Building an Event Driven Global Data Fabric with Apache Kafka Apr 13 2021 5:00 pm UTC 58 mins
    Will LaForest, CTO Confluent Public Sector
    Agencies are grappling with a growing challenge of distributing data across a geographically diverse set of locations around the US and globally. In order to ensure mission success, data needs to flow to all of these locations rapidly. Additionally, latency, bandwidth and reliability of communication can prove to be a challenge for agencies. A global data fabric is an emerging approach to help connect mission to data across multiple locations and deliver uniformity and consistency at scale.

    This webinar will cover:

    An overview of Apache Kafka and how an event streaming platform can support your agency's mission

    Considerations around handling varying quality communication links
    Synchronous vs asynchronous data replication
    New multi-region capabilities in Confluent Platform for Global Data Fabric
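    The synchronous-versus-asynchronous trade-off listed above can be sketched in a few lines of Python. This is a toy model, not Confluent's replication code: synchronous replication blocks the producer until the remote site acknowledges, while asynchronous replication returns immediately and lets a background worker ship records, trading request latency for replication lag.

```python
import queue
import threading

acked = []                    # records confirmed by the "remote site"

def remote_site(record):
    acked.append(record)
    return True

def replicate_sync(record):
    # Block until the remote site acknowledges: durability first, but
    # request latency now includes the (possibly slow) WAN link.
    return remote_site(record)

outbox = queue.Queue()

def replicate_async(record):
    # Return immediately; a background worker ships the record later,
    # so a degraded link adds replication lag, not request latency.
    outbox.put(record)

def shipper():
    while True:
        rec = outbox.get()
        if rec is None:       # poison pill: stop the worker
            break
        remote_site(rec)

worker = threading.Thread(target=shipper)
worker.start()
for i in range(3):
    replicate_sync(("sync", i))
    replicate_async(("async", i))
outbox.put(None)
worker.join()
print(len(acked))             # all 6 records eventually acknowledged
```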
  • Bringing Industry Machine Learning to Government with Apache Kafka Mar 30 2021 5:00 pm UTC 49 mins
    Will LaForest, Public Sector CTO, Confluent
    Apache Kafka® & Machine Learning
    Connecting the Dots from Inception to Production

    Join this session to understand how and why Apache Kafka® has become the de facto standard for reliable and scalable streaming infrastructures. AI/machine learning and the Apache Kafka ecosystem are a great combination for training, deploying and monitoring analytic models at scale in real time. They are showing up in more and more projects, but can still feel like buzzwords and hype reserved for science projects.

    See how to connect the dots!

    How are Kafka and Machine Learning related?
    How can they be combined to productionize analytic models in mission-critical and scalable real time applications?
    We will discuss a step-by-step approach to building a scalable and reliable real-time infrastructure for anomaly detection in cyber data.
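    As an illustration of the kind of streaming anomaly detection discussed above (a toy sketch, not the webinar's actual pipeline), a rolling z-score over a stream of metric values flags points that deviate sharply from recent history:

```python
from collections import deque
from statistics import mean, pstdev

def detect(stream, window=20, threshold=3.0):
    """Flag points more than `threshold` std-devs from the rolling mean."""
    recent = deque(maxlen=window)
    anomalies = []
    for i, x in enumerate(stream):
        # Only score once we have a little history and non-zero spread.
        if len(recent) >= 5 and pstdev(recent) > 0:
            z = (x - mean(recent)) / pstdev(recent)
            if abs(z) > threshold:
                anomalies.append((i, x))
        recent.append(x)
    return anomalies

# Steady traffic with one injected spike at index 8
data = [10.0, 11, 9, 10, 10, 11, 9, 10, 95, 10, 11]
print(detect(data))  # → [(8, 95)]
```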
  • Applied Apache Kafka for Government: Introduction to Kafka Mar 16 2021 4:00 pm UTC 55 mins
    Ken McCaleb, Enterprise Data Streaming Solutions Advisor, Confluent
    This is an introduction to Kafka and how government is modernizing systems to meet critical mission requirements using a modern event streaming platform.

    We will explain the origin of Kafka, why it is relevant, and provide high-level use case examples of how it is used by the U.S. Government to greatly improve its ability to serve its citizens.
  • Confluent Real time event streaming - why it matters for Industry 4.0 Mar 10 2021 3:00 am UTC 60 mins
    James Gollan
    Companies now run businesses that span the globe and hop between clouds in real time, breaking down data silos to create seamless applications that connect the organisation internally and externally. This continuous state of change means that legacy architectures are insufficient or unsuitable to meet the needs of the modern organisation. Applications must be able to run 24×7 and be elastic, global, and cloud-native.

    Mining, resources, chemical, energy and industrial organisations must process billions of these events per day in real-time and ensure consistent and reliable data processing and correlation across machines, sensors and standard software.

    Enter event-driven architecture (EDA), a type of software architecture that ingests, processes, stores, and reacts to real-time data as it’s being generated, opening new capabilities in the way businesses run.

    Real-time streaming data enables you to modernise existing processes, streamline costs and extract more value out of your business data.

    Join us on Wednesday, March 10th at 2pm AEST to learn how event streaming with Apache Kafka, Confluent Platform and Confluent Cloud provides a scalable, reliable, and efficient infrastructure to ensure you can leverage the value of real-time data.

    In this session James Gollan, Senior Solutions Architect at Confluent, will discuss use cases and architectures for a range of scenarios.

    Agenda:


    10,000 Feet View – Event Streaming for Industry 4.0
    Events – What are they, and why do they matter?
    The three pillars of an event streaming platform
    Event driven microservices
    Event driven architecture and IoT and use cases
    Core data offload
    Machine learning for anomaly detection
    Monitoring telemetry on trucks - sensor detector
    Supply Chain Management
    Cybersecurity
    Q&A




    Advance registration is requested. We look forward to your participation!
  • Confluent “Enabling real time Digital Transformation in the public sector” Recorded: Mar 4 2021 48 mins
    Johnny Mirza
    Each one of us now has first-hand experience of living through a pandemic where situations change minute by minute, and critical information, updates and changes must be rolled out to citizens, businesses, intra-government agencies and other key stakeholders.

    Today, the need for access to real time data is undeniable and the Public Sector has never played more of a pivotal role in every citizen’s life.

    Where once real time may have been something to aim for in the future, today even near real time is not enough.

    Join Johnny Mirza, Senior Solutions Engineer at Confluent, for a 45-minute virtual discussion plus 15-minute Q&A, “Enabling real time Digital Transformation in the Public Sector”, on Thursday March 4th at 2pm AEST 11am SGT / 12pm AEST. You will learn how event streaming is a real-time infrastructure revolution that is fundamentally changing how public sector organisations think about data and build applications to resolve the real-time challenges that thousands of other organisations around us are already addressing.

    Beyond the current pandemic, event-driven architecture is the future of data infrastructure, and this talk is designed to demonstrate and share practical examples of how to leverage the power of your event streams in real-time to deliver on mission outcomes, better serve citizens, ensure security and compliance, enhance IT efficiency, and maximise productivity.


    This session is suitable for both non-technical and technical attendees.

    Attendees will gain:
    - an overview of key event streaming concepts, including “everything is an event”
    - a framework for solution architects and developers looking to learn a new skill
    - an understanding of common use cases in the Public Sector
    - an introduction to some of the technology behind event streaming, including Apache Kafka
  • The New Cyber: Faster, Better, Cheaper Recorded: Feb 25 2021 29 mins
    Johnny Varley, Central Government Solutions Engineer at Confluent
    Government agencies face the need to augment traditional SIEM systems and to do so in a manner that is better, faster and cheaper than before. This need for SIEM augmentation is driven by three factors: threat, scale, and cost. In addition, most agencies now realise that they want to be more independent of cloud and SIEM vendors, so having a way to bring on new analytic destinations, including modern SIEMs, is an emerging requirement.



    Confluent enables you to bridge the gap between old-school SIEM solutions and next-gen offerings by consolidating, categorising and enriching event logs, network data and log data generated by all relevant data sources for the purpose of real-time monitoring and security forensics.



    Join the Confluent team for this online talk in which you will learn:

    1. The top enterprise SIEM challenges facing government agencies
    2. The value of augmenting your SIEM with Confluent
    3. An overview of the Confluent solution and how it works
    4. Customer use cases and examples
    5. An integration example with Splunk
    6. The role Kafka plays in all of this
  • Transforming Financial Services with Event Stream Data Recorded: Feb 25 2021 44 mins
    Ananda Bose, Senior Solutions Engineering Manager, APAC, Confluent
    Recent research shows that nearly all financial services organizations (97%) consider it important to accelerate the flow of information and improve the responsiveness of the organization. With the advent of streaming data technologies that capture and process large volumes of data in real time, organizations can quickly turn events into valuable business outcomes in the form of new products and services or revenue.

    Banks such as Citigroup, Bank Rakyat Indonesia, RBC, DBS, Bank Mandiri and Euronext have partnered with Confluent to identify organizational or revenue producing improvements and develop new products or services with event streams.

    Attend this live forum as we delve deeper into the rise of streaming data and how it is transforming the financial services industry by providing the critical, timely information needed to:
    - Deliver world-class customer service
    - Detect and prevent fraud
    - Tie together disparate legacy systems
    - Power new services and sources of revenue

    Agenda
    - Welcome and Introductions
    - Keynote: Transforming Financial Services with Event Stream data
    - Top Use Cases in Financial Services
    - Q & A
  • The Top 5 Event Streaming Use Cases & Architectures in 2021 Recorded: Feb 17 2021 36 mins
    Kai Waehner, Technology Evangelist, Confluent
    With just a few weeks of 2020 left, it's time to make some predictions on the top event streaming use cases that the Confluent team expects to see in 2021. Given the unpredictability of 2020 this may seem brave or even a little foolhardy, but Kai Waehner, Senior Solutions Engineer at Confluent, is willing to strike out and offer his predictions for next year.


    Register now to access this online talk in which Kai will discuss the top five cutting-edge use cases and architectures that will be adopted by more and more enterprises in 2021. Learn how companies will leverage event streaming, Apache Kafka, and Confluent to meet the demands of a real-time market, rising regulations, customer expectations, and much more in 2021:

    1. Edge deployments outside the data center: It's time to challenge the normality of limited hardware and disconnected infrastructure. Event streaming can provide low latency and cost-efficient data integration and data processing in retail stores, restaurants, trains, and other remote locations.


    2. Hybrid architectures: Discover how these span multiple sites across regions, continents, data centers, and clouds with real-time information at scale to connect legacy and modern infrastructures.



    3. Service mesh-based microservice architectures: Learn what becomes possible when organisations can provide a cloud-native event-based infrastructure for elastic and scalable applications and integration scenarios.

    4. Streaming machine learning: In 2021, many companies will move to streaming machine learning in production without the need for a data lake that enables scalable real-time analytics.

    5. Cybersecurity: While security never goes out of style, in 2021 we will see cybersecurity in real-time at scale with openness and flexibility at its core. This protects computer systems and networks and prevents the theft of or damage to software and data.
  • The New Cyber: Faster, Better, Cheaper Recorded: Feb 4 2021 30 mins
    Johnny Varley, Central Government Solutions Engineer at Confluent
    Government agencies face the need to augment traditional SIEM systems and to do so in a manner that is better, faster and cheaper than before. This need for SIEM augmentation is driven by three factors: threat, scale, and cost. In addition, most agencies now realise that they want to be more independent of cloud and SIEM vendors, so having a way to bring on new analytic destinations, including modern SIEMs, is an emerging requirement.



    Confluent enables you to bridge the gap between old-school SIEM solutions and next-gen offerings by consolidating, categorising and enriching event logs, network data and log data generated by all relevant data sources for the purpose of real-time monitoring and security forensics.



    Join the Confluent team for this online talk in which you will learn:

    1. The top enterprise SIEM challenges facing government agencies
    2. The value of augmenting your SIEM with Confluent
    3. An overview of the Confluent solution and how it works
    4. Customer use cases and examples
    5. An integration example with Splunk
    6. The role Kafka plays in all of this
  • Kafka: The Bridge to ISS A/S One-Stop-Shop for Data Recorded: Feb 4 2021 61 mins
    Anders Møldrup, Technical Integration Architect
    Join Anders Møldrup, Technical Integration Architect, to discover how he and his colleagues built and deployed a one-stop-shop for enterprise data using Apache Kafka and Confluent Cloud for ISS A/S, an organisation that serves over 60k customers in 30-plus countries.



    This 60-minute online talk is packed with practical insights and by attending you will learn how Kafka fits into a data ecosystem that spans a global enterprise and supports use cases for both data ingestion and integration. Anders will share a strategic view of ISS A/S' global Kafka use cases and architecture, before delving into his unique learnings on what it takes to develop an Apache Kafka project from a concept on a slide all the way through to going into production in a global organisation.
  • Kafka: The Bridge to ISS A/S One-Stop-Shop for Data Recorded: Jan 28 2021 62 mins
    Anders Møldrup, Technical Integration Architect
    Join Anders Møldrup, Technical Integration Architect, to discover how he and his colleagues built and deployed a one-stop-shop for enterprise data using Apache Kafka and Confluent Cloud for ISS A/S, an organisation that serves over 60k customers in 30-plus countries.



    This 60-minute online talk is packed with practical insights and by attending you will learn how Kafka fits into a data ecosystem that spans a global enterprise and supports use cases for both data ingestion and integration. Anders will share a strategic view of ISS A/S' global Kafka use cases and architecture, before delving into his unique learnings on what it takes to develop an Apache Kafka project from a concept on a slide all the way through to going into production in a global organisation.
  • Driving Digital Innovation Recorded: Jan 28 2021 27 mins
    Mohammed Sleeq, Chief Digital Officer, Aramex
    Hear Mohammed Sleeq, Chief Digital Officer of Aramex discuss the role of Kafka and event streaming in digitizing Aramex's business.
  • Top 5 Event Streaming Architectures and Use Cases for 2021 Recorded: Jan 28 2021 59 mins
    Johnny Mirza, Senior Solutions Engineer, APAC, Confluent
    As we enter the new year, it's time to make some predictions on the top event streaming use cases that the Confluent team expects to see in 2021. Given the unpredictability of 2020 this may seem brave or even a little foolhardy, but the team is willing to strike out and offer predictions for next year.


    Register now to access this online talk in which Johnny Mirza, Senior Solutions Engineer, APAC at Confluent, will discuss the top five cutting-edge use cases and architectures that will be adopted by more and more enterprises in 2021. Learn how companies will leverage event streaming, Apache Kafka, and Confluent to meet the demands of a real-time market, rising regulations, customer expectations, and much more in 2021:

    1. Edge deployments outside the data center: It's time to challenge the normality of limited hardware and disconnected infrastructure. Event streaming can provide low latency and cost-efficient data integration and data processing in retail stores, restaurants, trains, and other remote locations.


    2. Hybrid architectures: Discover how these span multiple sites across regions, continents, data centers, and clouds with real-time information at scale to connect legacy and modern infrastructures.



    3. Service mesh-based microservice architectures: Learn what becomes possible when organisations can provide a cloud-native event-based infrastructure for elastic and scalable applications and integration scenarios.

    4. Streaming machine learning: In 2021, many companies will move to streaming machine learning in production without the need for a data lake that enables scalable real-time analytics.

    5. Cybersecurity: While security never goes out of style, in 2021 we will see cybersecurity in real-time at scale with openness and flexibility at its core. This protects computer systems and networks and prevents the theft of or damage to software and data.
  • Journey to Event-driven Architecture Recorded: Jan 21 2021 62 mins
    Naveen Nandan, Solutions Engineer, Confluent Asia Pacific
    The world is changing. New problems need to be solved. Companies now run businesses that span the globe and hop between clouds, breaking down silos to create seamless applications that connect the organisation. There is a continuous state of change that organisations must manage and innovate with.

    Traditional architectures simply cannot meet the challenges of real time and extreme scale. Today, we are addressing these new, rising needs through microservices, IoT, cloud, machine learning and more. At some point it becomes obvious that we need to go back to basics, back to first principles of system design, and start again.

    The common element of all these new world problems is that they revolve around the notion of events. These events drive actions and reactions, and transform between different streams, splitting, merging and evolving like the pathways of your brain.

    To understand the importance of being event driven, we’ll examine why events have become so pivotal in our thinking today. We will then evaluate how events have become a first-class concern for the modern organisation, as awareness of events underpins event-first thinking and design. In this discussion we will examine:

    - History of “events”: why do they matter?
    - Adoption journey of the “event”
    - Considerations of the event-driven architecture
    - Transitioning to event-first thinking
    - Event-first versus event-command patterns for event-driven design
    - The event-command pattern
    - Benefits of the event-first approach
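    The event-first versus event-command distinction in the list above can be sketched with a toy in-memory bus (hypothetical names, not a Kafka API): with a command, the sender tells one specific service what to do; with an event, the producer records a fact and any number of subscribers react independently.

```python
class EventBus:
    """Toy in-memory publish/subscribe bus."""
    def __init__(self):
        self.handlers = {}

    def subscribe(self, event_type, handler):
        self.handlers.setdefault(event_type, []).append(handler)

    def publish(self, event_type, payload):
        # The producer states a fact ("OrderPlaced") without knowing
        # who consumes it -- the event-first pattern.
        for handler in self.handlers.get(event_type, []):
            handler(payload)

bus = EventBus()
reactions = []
# Two independent consumers; neither is commanded by the producer.
bus.subscribe("OrderPlaced", lambda order: reactions.append(f"email sent for {order}"))
bus.subscribe("OrderPlaced", lambda order: reactions.append(f"analytics updated for {order}"))
bus.publish("OrderPlaced", "order-42")
print(reactions)
```

    Adding a third consumer later requires no change to the producer, which is the decoupling benefit the event-first approach is after.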
  • Delivering Technology Excellence in a Changing World Recorded: Jan 20 2021 20 mins
    Neil Drennan, CTO, 10x Future Technologies
    Listen to Neil Drennan of 10x Future Technologies discuss how 10x is transforming banking and the technology stack and strategy that he and his team are implementing.
  • Apache Kafka® Use Cases for Financial Services Recorded: Jan 12 2021 62 mins
    Tom Green, Senior Solutions Architect, Confluent.
    Traditional systems were designed in an era that predates large-scale distributed systems. These systems often lack the ability to scale to meet the needs of the modern data-driven organisation. Adding to this is the accumulation of technologies and the explosion of data which can result in complex point-to-point integrations where data becomes siloed or separated across the enterprise.



    The demand for fast results and rapid decision making has driven financial institutions to adopt real-time event streaming and processing in order to stay on the competitive edge. Apache Kafka and Confluent Platform are designed to solve the problems associated with traditional systems, providing a modern, distributed architecture and real-time data streaming capability. These technologies also open up a range of use cases for financial services organisations, many of which will be explored in this talk.



    By attending this talk you will develop a new understanding of:



    • How Apache Kafka enables a 360° view of the customer
    • How to provide a backbone for the distribution of trade data
    • How Kafka and Confluent Platform enable you to meet regulatory requirements for trade information, payments and liquidity
    • How to overcome security concerns with SIEM
    • How to integrate mainframe data with event streaming and the cloud
    • How to reduce fraud with real-time fraud processing, fraud analytics and fraud notifications
    • How to develop and enhance microservices
  • The Top 5 Event Streaming Use Cases & Architectures in 2021 Recorded: Dec 30 2020 36 mins
    Kai Waehner, Technology Evangelist, Confluent
    With just a few weeks of 2020 left, it's time to make some predictions on the top event streaming use cases that the Confluent team expects to see in 2021. Given the unpredictability of 2020 this may seem brave or even a little foolhardy, but Kai Waehner, Senior Solutions Engineer at Confluent, is willing to strike out and offer his predictions for next year.


    Register now to access this online talk in which Kai will discuss the top five cutting-edge use cases and architectures that will be adopted by more and more enterprises in 2021. Learn how companies will leverage event streaming, Apache Kafka, and Confluent to meet the demands of a real-time market, rising regulations, customer expectations, and much more in 2021:

    1. Edge deployments outside the data center: It's time to challenge the normality of limited hardware and disconnected infrastructure. Event streaming can provide low latency and cost-efficient data integration and data processing in retail stores, restaurants, trains, and other remote locations.


    2. Hybrid architectures: Discover how these span multiple sites across regions, continents, data centers, and clouds with real-time information at scale to connect legacy and modern infrastructures.



    3. Service mesh-based microservice architectures: Learn what becomes possible when organisations can provide a cloud-native event-based infrastructure for elastic and scalable applications and integration scenarios.

    4. Streaming machine learning: In 2021, many companies will move to streaming machine learning in production without the need for a data lake that enables scalable real-time analytics.

    5. Cybersecurity: While security never goes out of style, in 2021 we will see cybersecurity in real-time at scale with openness and flexibility at its core. This protects computer systems and networks and prevents the theft of or damage to software and data.
  • Driving Digital Innovation Recorded: Dec 22 2020 20 mins
    Neil Drennan, 10x Future Technologies
    Listen to Neil Drennan, CTO of 10x Future Technologies discuss how 10x is transforming banking and the technology stack and strategy that he and his team are implementing.
  • Stream me to the Cloud (and back) with Confluent & MongoDB Recorded: Dec 17 2020 65 mins
    Gianluca Natali, Confluent & Felix Reichenbach, MongoDB
    Companies collect and store their data in various data stores and use a number of business applications and services to access, analyze and act on their data. Pulling all the data from disparate sources is difficult to manage, inefficient and ineffective in producing results. Event streaming and stream processing changes this paradigm. By enabling robust and reactive data pipelines between all your data stores, apps and services, you can make real-time decisions that are critical to your business.


    In this online talk, we’ll explore how and why companies are leveraging Confluent and MongoDB to modernize their architecture and leverage the scalability of the cloud and the velocity of streaming. Based upon a sample retail business scenario, we will explain how changes in an on-premises database are streamed via Confluent Cloud to MongoDB Atlas and back.

    Key learnings:

    - Modernize your architecture without revolutionizing it
    - Stream your data from multiple applications and data centers into the cloud and back
    - Use Confluent as the central nervous system of your architecture
    - Use MongoDB Atlas as a flexible, scalable modern data platform that combines data from different sources and powers your frontend applications
    - Why MongoDB and Confluent make such a great combination

    This architectural approach will allow you to dynamically scale the customer-facing frontend, avoid over-provisioning, and enable your development team to rapidly implement new functionality that differentiates you from your competition.
  • Building a Secure, Tamper-Proof & Scalable Blockchain with AiB’s KafkaBlockchain Recorded: Dec 16 2020 54 mins
    Kai Waehner, Technology Evangelist, Confluent + Stephen Reed, CTO, Co-Founder, AiB
    Apache Kafka is an open-source event streaming platform used to complement or replace existing middleware, integrate applications, and build microservice architectures. Used at almost every large company today, it's well understood, battle-tested, highly scalable, and reliable.

    Blockchain is a different story. Being related to cryptocurrencies like Bitcoin, it's often in the news. But what is the value for software architectures? And how is it related to an integration architecture and event streaming platform?

    This session explores blockchain use cases and different alternatives such as Hyperledger, Ethereum, and a Kafka-native blockchain implementation. We discuss the value blockchain brings to different architectures and how it can be integrated with the Kafka ecosystem to build a highly scalable and reliable event streaming infrastructure.
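    The tamper-evidence property a Kafka-native blockchain relies on can be illustrated with a minimal hash chain (a generic sketch, not AiB's KafkaBlockchain implementation): each entry stores the SHA-256 hash of its predecessor, so altering any record invalidates every later link.

```python
import hashlib
import json

GENESIS = "0" * 64  # placeholder hash for the first entry's predecessor

def chain(records):
    """Build an append-only log where each entry carries the hash of
    its predecessor, so tampering breaks every subsequent link."""
    prev = GENESIS
    entries = []
    for rec in records:
        body = json.dumps({"prev": prev, "data": rec}, sort_keys=True)
        prev = hashlib.sha256(body.encode()).hexdigest()
        entries.append({"data": rec, "hash": prev})
    return entries

def verify(entries):
    """Recompute every link; any mismatch means the log was altered."""
    prev = GENESIS
    for e in entries:
        body = json.dumps({"prev": prev, "data": e["data"]}, sort_keys=True)
        if hashlib.sha256(body.encode()).hexdigest() != e["hash"]:
            return False
        prev = e["hash"]
    return True

log = chain(["tx1", "tx2", "tx3"])
print(verify(log))            # True
log[1]["data"] = "tx2-forged"
print(verify(log))            # False: the forged entry breaks the chain
```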
We provide a central nervous system for streaming real-time data.
Confluent, founded by the creators of open source Apache Kafka®, provides the leading streaming platform that enables enterprises to maximize the value of data. Confluent Platform empowers leaders in industries such as retail, logistics, manufacturing, financial services, technology and media, to move data from isolated systems into a real-time data pipeline where they can act on it immediately.

Backed by Benchmark, Index Ventures and Sequoia, Confluent is based in Palo Alto, California. To learn more, please visit www.confluent.io.

  • Title: What’s New in Confluent Platform 5.5
  • Live at: May 7 2020 9:35 pm
  • Presented by: Nick Bryan, Product Marketing Manager, Confluent + David Araujo, Sr. Product Manager, Confluent