
Using Confluent & Google Apigee to Enable your Events on Hybrid Deployment Model

Today’s customers expect channels, whether physical or digital, to blend together into a cohesive connected experience. APIs help enterprises not only to meet customer expectations, but also to participate in software ecosystems and gain unprecedented opportunities for reach and economies of scale. By attending this online talk, hosted by Google Apigee and Confluent, you will learn how APIs allow you to streamline, secure and monetize access to your data and services to deliver a cohesive experience. In addition, you will see just how easy it is to integrate the Confluent Platform with Google Apigee.


Register now to learn more about:



• The challenges CIOs face when it comes to getting value out of data
• How to simplify the capture, generation and consumption of data, in a secure and monitored way
• How to deal with the exponentially growing volume of data, devices and systems

• Why Apigee offers a reliable and scalable API management platform

• How APIs touch every stage of the digital journey

• The value of having a layer of microservices that allows for agile development
Recorded Oct 15 2020 58 mins
Presented by
Dan Croft, Confluent & Mathilde Fabre, Google Cloud Apigee

  • Driving Digital Innovation Jan 28 2021 10:15 am UTC 27 mins
    Mohammed Sleeq, Chief Digital Officer, Aramex
    Hear Mohammed Sleeq, Chief Digital Officer of Aramex discuss the role of Kafka and event streaming in digitizing Aramex's business.
  • Top 5 Event Streaming Architectures and Use Cases for 2021 Jan 28 2021 3:00 am UTC 45 mins
    Johnny Mirza, Senior Solutions Engineer, APAC, Confluent
    As we enter the new year, it's time to make some predictions on the top event streaming use cases that the Confluent Team expects to see in 2021. Given the unpredictability of 2020, this may seem brave or even a little foolhardy, but the team is willing to strike out and offer predictions for next year.


    Register now to access this fascinating online talk in which Johnny Mirza, Senior Solutions Engineer, APAC at Confluent will discuss his top five cutting-edge use cases and architectures that will be adopted by more and more enterprises in 2021. Learn how companies will leverage event streaming, Apache Kafka, and Confluent to meet the demand of a real-time market, rising regulations, and customer expectations, and much more in 2021:

    1. Edge deployments outside the data center: It's time to challenge the normality of limited hardware and disconnected infrastructure. Event streaming can provide low latency and cost-efficient data integration and data processing in retail stores, restaurants, trains, and other remote locations.


    2. Hybrid architectures: Discover how these span multiple sites across regions, continents, data centers, and clouds with real-time information at scale to connect legacy and modern infrastructures.



    3. Service mesh-based microservice architectures: Learn what becomes possible when organisations can provide a cloud-native event-based infrastructure for elastic and scalable applications and integration scenarios.

    4. Streaming machine learning: In 2021, many companies will move to streaming machine learning in production, enabling scalable real-time analytics without the need for a data lake.

    5. Cybersecurity: While security never goes out of style, in 2021 we will see cybersecurity in real-time at scale with openness and flexibility at its core. This protects computer systems and networks and prevents the theft of or damage to software and data.
  • Journey to Event-driven Architecture Recorded: Jan 21 2021 62 mins
    Naveen Nandan, Solutions Engineer, Confluent Asia Pacific
    The world is changing. New problems need to be solved. Companies now run businesses that span the globe and hop between clouds, breaking down silos to create seamless applications that connect the organisation. There is a continuous state of change that organisations must manage and innovate with.

    Traditional architectures simply cannot meet the challenges of real time and extreme scale. Today, we are addressing these new, rising needs through microservices, IoT, cloud, machine learning and more. At some point it becomes obvious that we need to go back to basics, back to first principles of system design, and start again.

    The common element of all these new world problems is that they revolve around the notion of events. These events drive actions and reactions, and transform between different streams, splitting, merging and evolving like the pathways of your brain.

    To understand the importance of being event driven, we’ll examine why events have become so pivotal in our thinking today. We will then evaluate how events have become a first-class concern for the modern organisation, as awareness of events underpins event-first thinking and design. In this discussion we will examine:

    -History of “events” – Why do they matter?
    -Adoption journey of the “event”
    -Considerations of the event-driven architecture
    -Transitioning to event-first thinking
    -Event-first versus event-command patterns for event-driven design
    -Event-command pattern
    -Benefits of the event-first approach
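
The event-first versus event-command distinction above can be sketched in a few lines of Python. This is an illustrative toy, not material from the talk: the EventLog class stands in for a Kafka topic, and the service and event names are hypothetical.

```python
from collections import defaultdict

# Event-command pattern: the producer addresses one specific consumer
# and tells it what to do, coupling the two services together.
class BillingService:
    def __init__(self):
        self.invoices = []

    def handle_send_invoice(self, order_id):
        # Reacts only when explicitly commanded by the caller.
        self.invoices.append(order_id)

# Event-first pattern: the producer records a fact ("OrderPlaced") on a
# log; any number of consumers subscribe and decide how to react.
class EventLog:
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, event_type, handler):
        self.subscribers[event_type].append(handler)

    def publish(self, event_type, payload):
        for handler in self.subscribers[event_type]:
            handler(payload)

billing = BillingService()
shipped_orders = []

log = EventLog()
log.subscribe("OrderPlaced", billing.handle_send_invoice)
log.subscribe("OrderPlaced", shipped_orders.append)

# One fact, two independent reactions.
log.publish("OrderPlaced", "order-42")
```

Note how the event-first side lets a second consumer react to the same fact without any change to the producer, which is the decoupling benefit event-first design is after.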

  • Delivering Technology Excellence in a Changing World Recorded: Jan 20 2021 20 mins
    Neil Drennan, CTO, 10x Future Technologies
    Listen to Neil Drennan of 10x Future Technologies discuss how 10x is transforming banking and the technology stack and strategy that he and his team are implementing.
  • Apache Kafka® Use Cases for Financial Services Recorded: Jan 12 2021 62 mins
    Tom Green, Senior Solutions Architect, Confluent.
    Traditional systems were designed in an era that predates large-scale distributed systems. These systems often lack the ability to scale to meet the needs of the modern data-driven organisation. Adding to this is the accumulation of technologies and the explosion of data which can result in complex point-to-point integrations where data becomes siloed or separated across the enterprise.



    The demand for fast results and rapid decision-making has driven financial institutions to adopt real-time event streaming and processing of data to stay on the competitive edge. Apache Kafka and the Confluent Platform are designed to solve the problems associated with traditional systems and provide a modern, distributed architecture and real-time data streaming capability. In addition, these technologies open up a range of use cases for financial services organisations, many of which will be explored in this talk.



    By attending this talk you will develop a new understanding of:



    • How Apache Kafka enables a 360° view of the customer
    • How to provide a backbone for distribution of trade data
    • How Kafka and Confluent Platform enable you to meet regulatory requirements for trade information, payments and liquidity
    • How to overcome security concerns with SIEM
    • How to integrate mainframe data with event streaming and the cloud
    • How to reduce fraud with real-time fraud processing, fraud analytics and fraud notifications
    • How to develop and enhance microservices
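
As a concrete illustration of the real-time fraud point above, here is a minimal Python sketch (hypothetical thresholds and account names, not from the talk) of the kind of windowed rule a streaming fraud check applies: flag an account when too many payment events land within a short time window, much as a Kafka Streams or ksqlDB windowed aggregation would over a payments topic.

```python
from collections import defaultdict, deque

# Hypothetical thresholds for this toy example.
WINDOW_SECONDS = 60
MAX_EVENTS_PER_WINDOW = 3

recent = defaultdict(deque)  # account id -> event timestamps in window

def on_payment(account, timestamp):
    """Consume one payment event; return True if it looks fraudulent."""
    window = recent[account]
    window.append(timestamp)
    # Evict timestamps that have fallen out of the sliding window.
    while window and timestamp - window[0] > WINDOW_SECONDS:
        window.popleft()
    return len(window) > MAX_EVENTS_PER_WINDOW

# Four payments in 30 seconds trips the rule; a later event does not,
# because the window has emptied by then.
alerts = [t for t in [0, 10, 20, 30, 500] if on_payment("acct-1", t)]
```

In a production pipeline the same rule would run continuously against the payments stream, with the alert published as a new event for downstream notification services.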
  • The Top 5 Event Streaming Use Cases & Architectures in 2021 Recorded: Dec 30 2020 36 mins
    Kai Waehner, Technology Evangelist, Confluent
    With just a few weeks of 2020 left, it's time to make some predictions on the top event streaming use cases that the Confluent Team expects to see in 2021. Given the unpredictability of 2020, this may seem brave or even a little foolhardy, but Kai Waehner, Senior Solutions Engineer at Confluent, is willing to strike out and offer his predictions for next year.


    Register now to access this fascinating online talk in which Kai will discuss his top five cutting-edge use cases and architectures that will be adopted by more and more enterprises in 2021. Learn how companies will leverage event streaming, Apache Kafka, and Confluent to meet the demand of a real-time market, rising regulations, and customer expectations, and much more in 2021:

    1. Edge deployments outside the data center: It's time to challenge the normality of limited hardware and disconnected infrastructure. Event streaming can provide low latency and cost-efficient data integration and data processing in retail stores, restaurants, trains, and other remote locations.


    2. Hybrid architectures: Discover how these span multiple sites across regions, continents, data centers, and clouds with real-time information at scale to connect legacy and modern infrastructures.



    3. Service mesh-based microservice architectures: Learn what becomes possible when organisations can provide a cloud-native event-based infrastructure for elastic and scalable applications and integration scenarios.

    4. Streaming machine learning: In 2021, many companies will move to streaming machine learning in production, enabling scalable real-time analytics without the need for a data lake.

    5. Cybersecurity: While security never goes out of style, in 2021 we will see cybersecurity in real-time at scale with openness and flexibility at its core. This protects computer systems and networks and prevents the theft of or damage to software and data.
  • Driving Digital Innovation Recorded: Dec 22 2020 20 mins
    Neil Drennan, 10x Future Technologies
    Listen to Neil Drennan, CTO of 10x Future Technologies discuss how 10x is transforming banking and the technology stack and strategy that he and his team are implementing.
  • Stream me to the Cloud (and back) with Confluent & MongoDB Recorded: Dec 17 2020 65 mins
    Gianluca Natali, Confluent & Felix Reichenbach, MongoDB
    Companies collect and store their data in various data stores and use a number of business applications and services to access, analyze and act on their data. Pulling all the data from disparate sources is difficult to manage, inefficient and ineffective in producing results. Event streaming and stream processing changes this paradigm. By enabling robust and reactive data pipelines between all your data stores, apps and services, you can make real-time decisions that are critical to your business.


    In this online talk, we’ll explore how and why companies are leveraging Confluent and MongoDB to modernize their architecture and leverage the scalability of the cloud and the velocity of streaming. Based upon a sample retail business scenario, we will explain how changes in an on-premises database are streamed via Confluent Cloud to MongoDB Atlas and back.
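
As a rough, self-contained illustration of that flow (a toy sketch, not the demo code from the talk), the Python below simulates the pipeline: a change-capture step appends events to a list standing in for a Confluent Cloud topic, and a sink loop upserts them by key into a dict standing in for a MongoDB Atlas collection. In a real deployment, the official MongoDB connectors for Apache Kafka perform these two roles.

```python
topic = []  # stands in for a Kafka topic in Confluent Cloud

def capture_change(row):
    """Source side: publish a change event for an updated database row."""
    topic.append({"key": row["id"], "value": row})

atlas_collection = {}  # stands in for a MongoDB Atlas collection

def sink_poll():
    """Sink side: upsert each event into the document store by key."""
    for event in topic:
        atlas_collection[event["key"]] = event["value"]

# Two successive changes to the same row: the sink ends up holding the
# latest state, while the topic retains the full history of changes.
capture_change({"id": "sku-1", "stock": 7})
capture_change({"id": "sku-1", "stock": 5})
sink_poll()
```

The same upsert-by-key behaviour is what lets the cloud collection track the latest on-premises state while the event log keeps every change for replay.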

    Key Learnings
    • Modernize your architecture without revolutionizing it
    • Stream your data from multiple applications and data centers into the cloud and back
    • Confluent as the central nervous system of your architecture
    • MongoDB Atlas as a flexible and scalable modern data platform, combining data from different sources and powering your frontend applications
    • Why MongoDB and Confluent make such a great combination

    This architectural approach will allow you to dynamically scale the customer-facing frontend, avoid over-provisioning and enable the development team to rapidly implement new functionality that will differentiate you from your competition.
  • Building a Secure, Tamper-Proof & Scalable Blockchain with AiB’s KafkaBlockchain Recorded: Dec 16 2020 54 mins
    Kai Waehner, Technology Evangelist, Confluent & Stephen Reed, CTO, Co-Founder, AiB
    Apache Kafka is an open-source event streaming platform used to complement or replace existing middleware, integrate applications, and build microservice architectures. Used at almost every large company today, it's well understood, battle-tested, highly scalable, and reliable.

    Blockchain is a different story. Being related to cryptocurrencies like Bitcoin, it's often in the news. But what is the value for software architectures? And how is it related to an integration architecture and event streaming platform?

    This session explores blockchain use cases and different alternatives such as Hyperledger, Ethereum, and a Kafka-native blockchain implementation. We discuss the value blockchain brings for different architectures, and how it can be integrated with the Kafka ecosystem to build a highly scalable and reliable event streaming infrastructure.
  • The Top 5 Event Streaming Use Cases & Architectures in 2021 Recorded: Dec 16 2020 37 mins
    Kai Waehner, Technology Evangelist, Confluent
    With just a few weeks of 2020 left, it's time to make some predictions on the top event streaming use cases that the Confluent Team expects to see in 2021. Given the unpredictability of 2020, this may seem brave or even a little foolhardy, but Kai Waehner, Senior Solutions Engineer at Confluent, is willing to strike out and offer his predictions for next year.


    Register now to access this fascinating online talk in which Kai will discuss his top five cutting-edge use cases and architectures that will be adopted by more and more enterprises in 2021. Learn how companies will leverage event streaming, Apache Kafka, and Confluent to meet the demand of a real-time market, rising regulations, and customer expectations, and much more in 2021:

    1. Edge deployments outside the data center: It's time to challenge the normality of limited hardware and disconnected infrastructure. Event streaming can provide low latency and cost-efficient data integration and data processing in retail stores, restaurants, trains, and other remote locations.


    2. Hybrid architectures: Discover how these span multiple sites across regions, continents, data centers, and clouds with real-time information at scale to connect legacy and modern infrastructures.



    3. Service mesh-based microservice architectures: Learn what becomes possible when organisations can provide a cloud-native event-based infrastructure for elastic and scalable applications and integration scenarios.

    4. Streaming machine learning: In 2021, many companies will move to streaming machine learning in production, enabling scalable real-time analytics without the need for a data lake.

    5. Cybersecurity: While security never goes out of style, in 2021 we will see cybersecurity in real-time at scale with openness and flexibility at its core. This protects computer systems and networks and prevents the theft of or damage to software and data.
  • Developing Our Cloud Native Future Recorded: Dec 9 2020 31 mins
    Tim Berglund, Senior Director of Developer Advocacy
    Everything that is not dead is developing. Plants and animals develop. Friendships develop. Organizations develop. And while data infrastructure is rightly considered not to be a living thing, event streaming platforms develop too.

    The lens of development as a phenomenon brings clarity to the future direction of Apache Kafka/Confluent Platform, as we see the elements of a completed streaming platform combining with the properties of a cloud-native system to produce a mature platform capable of things earlier versions of itself could not envision. Warm milk and repeated viewings of Frozen will no longer do. It is time for Kafka’s future.
  • Driving Digital Innovation Recorded: Dec 4 2020 28 mins
    Mohammed Sleeq, Chief Digital Officer, Aramex
    Hear Mohammed Sleeq, Chief Digital Officer of Aramex discuss the role of Kafka and event streaming in digitizing Aramex's business.
  • Delivering Technology Excellence in a Changing World Recorded: Dec 4 2020 21 mins
    Neil Drennan, CTO, 10x Future Technologies
    Listen to Neil Drennan of 10x Future Technologies discuss how 10x is transforming banking and the technology stack and strategy that he and his team are implementing.
  • Becoming an Event-Driven Business in the IoT Age: A Case Study Recorded: Dec 2 2020 44 mins
    Dan Croft, Solutions Architect, Confluent
    According to Gartner, “By the end of 2020 event-sourced, real-time situational awareness will be a required characteristic for 80% of digital business solutions. And 80% of new business ecosystems will require support for event processing.”

    Companies are adopting real-time and event streaming to keep up with business digitisation trends while modernising data architecture and enabling new outcomes for the IoT age.

    In this webinar, Confluent discusses:
    - Why event streaming has become so important for business success
    - What it takes to become an event-driven organisation
    - The journey to becoming an event-driven business with Kafka
    - Compelling customer use cases from a range of industries
    - How the streaming platform has become the backbone of IoT projects within the automotive industry
  • Using Confluent & Google Apigee to Enable your Events on Hybrid Deployment Model Recorded: Dec 2 2020 58 mins
    Dan Croft, Confluent & Mathilde Fabre, Google Cloud Apigee
    Today’s customers expect channels, whether physical or digital, to blend together into a cohesive connected experience. APIs help enterprises not only to meet customer expectations, but also to participate in software ecosystems and gain unprecedented opportunities for reach and economies of scale. By attending this online talk, hosted by Google Apigee and Confluent, you will learn how APIs allow you to streamline, secure and monetize access to your data and services to deliver a cohesive experience. In addition, you will see just how easy it is to integrate the Confluent Platform with Google Apigee.


    Register now to learn more about:



    • The challenges CIOs face when it comes to getting value out of data
    • How to simplify the capture, generation and consumption of data, in a secure and monitored way
    • How to deal with the exponentially growing volume of data, devices and systems

    • Why Apigee offers a reliable and scalable API management platform

    • How APIs touch every stage of the digital journey

    • The value of having a layer of microservices that allows for agile development
  • Developing Our Cloud Native Future Recorded: Nov 26 2020 32 mins
    Tim Berglund, Senior Director, Developer Advocacy
    Everything that is not dead is developing. Plants and animals develop. Friendships develop. Organizations develop. And while data infrastructure is rightly considered not to be a living thing, event streaming platforms develop too.

    The lens of development as a phenomenon brings clarity to the future direction of Apache Kafka/Confluent Platform, as we see the elements of a completed streaming platform combining with the properties of a cloud-native system to produce a mature platform capable of things earlier versions of itself could not envision. Warm milk and repeated viewings of Frozen will no longer do. It is time for Kafka’s future.
  • Confluent Control Centre & KSQLDB Recorded: Nov 25 2020 32 mins
    Ala Alsharif, Confluent
    Join Ala Alsharif of Confluent for this jam-packed technology in practice session in which you will experience:
    •A demo of Confluent Control Centre
    •A demo of KSQLDB
    •An insight into the ease with which you can build event streaming applications
    •An overview of Confluent’s stream processing capability
  • Apache Kafka® Use Cases for Financial Services Recorded: Nov 20 2020 62 mins
    Tom Green, Senior Solutions Architect, Confluent.
    Traditional systems were designed in an era that predates large-scale distributed systems. These systems often lack the ability to scale to meet the needs of the modern data-driven organisation. Adding to this is the accumulation of technologies and the explosion of data which can result in complex point-to-point integrations where data becomes siloed or separated across the enterprise.



    The demand for fast results and rapid decision-making has driven financial institutions to adopt real-time event streaming and processing of data to stay on the competitive edge. Apache Kafka and the Confluent Platform are designed to solve the problems associated with traditional systems and provide a modern, distributed architecture and real-time data streaming capability. In addition, these technologies open up a range of use cases for financial services organisations, many of which will be explored in this talk.



    By attending this talk you will develop a new understanding of:



    • How Apache Kafka enables a 360° view of the customer
    • How to provide a backbone for distribution of trade data
    • How Kafka and Confluent Platform enable you to meet regulatory requirements for trade information, payments and liquidity
    • How to overcome security concerns with SIEM
    • How to integrate mainframe data with event streaming and the cloud
    • How to reduce fraud with real-time fraud processing, fraud analytics and fraud notifications
    • How to develop and enhance microservices
  • Stream me to the Cloud (and back) with Confluent & MongoDB Recorded: Nov 19 2020 65 mins
    Gianluca Natali, Confluent & Felix Reichenbach, MongoDB
    Companies collect and store their data in various data stores and use a number of business applications and services to access, analyze and act on their data. Pulling all the data from disparate sources is difficult to manage, inefficient and ineffective in producing results. Event streaming and stream processing changes this paradigm. By enabling robust and reactive data pipelines between all your data stores, apps and services, you can make real-time decisions that are critical to your business.


    In this online talk, we’ll explore how and why companies are leveraging Confluent and MongoDB to modernize their architecture and leverage the scalability of the cloud and the velocity of streaming. Based upon a sample retail business scenario, we will explain how changes in an on-premises database are streamed via Confluent Cloud to MongoDB Atlas and back.

    Key Learnings
    • Modernize your architecture without revolutionizing it
    • Stream your data from multiple applications and data centers into the cloud and back
    • Confluent as the central nervous system of your architecture
    • MongoDB Atlas as a flexible and scalable modern data platform, combining data from different sources and powering your frontend applications
    • Why MongoDB and Confluent make such a great combination

    This architectural approach will allow you to dynamically scale the customer-facing frontend, avoid over-provisioning and enable the development team to rapidly implement new functionality that will differentiate you from your competition.
  • Becoming an Event-Driven Business in the IoT Age: A Case Study Recorded: Nov 17 2020 45 mins
    Dan Croft, Solutions Architect, Confluent
    According to Gartner, “By the end of 2020 event-sourced, real-time situational awareness will be a required characteristic for 80% of digital business solutions. And 80% of new business ecosystems will require support for event processing.”

    Companies are adopting real-time and event streaming to keep up with business digitisation trends while modernising data architecture and enabling new outcomes for the IoT age.

    In this webinar, Confluent discusses:
    - Why event streaming has become so important for business success
    - What it takes to become an event-driven organisation
    - The journey to becoming an event-driven business with Kafka
    - Compelling customer use cases from a range of industries
    - How the streaming platform has become the backbone of IoT projects within the automotive industry
We provide a central nervous system for streaming real-time data.
Confluent, founded by the creators of open source Apache Kafka®, provides the leading streaming platform that enables enterprises to maximize the value of data. Confluent Platform empowers leaders in industries such as retail, logistics, manufacturing, financial services, technology and media, to move data from isolated systems into a real-time data pipeline where they can act on it immediately.

Backed by Benchmark, Index Ventures and Sequoia, Confluent is based in Palo Alto, California. To learn more, please visit www.confluent.io.

  • Title: Using Confluent & Google Apigee to Enable your Events on Hybrid Deployment Model
  • Live at: Oct 15 2020 2:15 pm
  • Presented by: Dan Croft, Confluent & Mathilde Fabre, Google Cloud Apigee