
Streamsheets and Apache Kafka – Interactively build real-time Dashboards and Str

A powerful stream processing platform combined with an end-user-friendly spreadsheet interface: if this combination rings a bell, you should definitely attend our 'Streamsheets and Apache Kafka' webinar. While development happens interactively in a web user interface, Streamsheets applications can run as mission-critical applications that directly consume and produce event streams in Apache Kafka. One popular option is to run everything in the cloud, leveraging the fully managed Confluent Cloud service on AWS, GCP or Azure. Without any coding or scripting, end users leverage their existing spreadsheet skills to build customized streaming apps for analysis, dashboarding, condition monitoring or any kind of real-time pre- and post-processing of Kafka or ksqlDB streams and tables.
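
For orientation, here is a minimal sketch of producing an event stream to Confluent Cloud that a Streamsheets app could then consume. It is not part of the webinar material; the cluster endpoint, credentials and topic name are placeholder assumptions (Python with the confluent-kafka client):

    from confluent_kafka import Producer

    # Placeholder Confluent Cloud connection settings (assumed, not real values).
    producer = Producer({
        "bootstrap.servers": "<your-cluster>.confluent.cloud:9092",
        "security.protocol": "SASL_SSL",
        "sasl.mechanisms": "PLAIN",
        "sasl.username": "<api-key>",
        "sasl.password": "<api-secret>",
    })

    # Publish a single machine reading; a Streamsheets app subscribed to this
    # topic would see the event appear in its inbox in real time.
    producer.produce("machine-data", key="sensor-1", value='{"temp": 71.3}')
    producer.flush()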

Hear Kai Waehner of Confluent and Kristian Raue of Cedalo on these topics:

• Where Apache Kafka and Streamsheets fit in the data ecosystem (Industrial IoT, Smart Energy, Clinical Applications, Finance Applications)
• Customer Story: How the Freiburg University Hospital uses Kafka and Streamsheets for dashboarding the utilization of clinical assets
• 15-Minute Live Demonstration: Building a financial fraud detection dashboard based on Confluent Cloud, ksqlDB and Cedalo Cloud Streamsheets, using just spreadsheet formulas.
Recorded: Sep 24 2020 57 mins
Presented by
Kai Waehner, Technology Evangelist, Confluent & Kristian Raue, Founder & Chief Technologist, Cedalo
  • Apache Kafka® Use Cases for Financial Services Jan 12 2021 10:15 am UTC 62 mins
    Tom Green, Senior Solutions Architect, Confluent.
    Traditional systems were designed in an era that predates large-scale distributed systems. These systems often lack the ability to scale to meet the needs of the modern data-driven organisation. Adding to this is the accumulation of technologies and the explosion of data, which can result in complex point-to-point integrations where data becomes siloed or separated across the enterprise.

    The demand for fast results and rapid decision-making has driven financial institutions to adopt real-time event streaming and data processing in order to stay on the competitive edge. Apache Kafka and the Confluent Platform are designed to solve the problems associated with traditional systems, providing a modern, distributed architecture and real-time data streaming capability. In addition, these technologies open up a range of use cases for financial services organisations, many of which will be explored in this talk.

    By attending this talk you will develop a new understanding of:
    • How Apache Kafka enables a 360° view of the customer
    • How to provide a backbone for the distribution of trade data
    • How Kafka and Confluent Platform enable you to meet regulatory requirements for trade information, payments and liquidity
    • How to overcome security concerns with SIEM
    • How to integrate mainframe data with event streaming and the cloud
    • How to reduce fraud with real-time fraud processing, fraud analytics and fraud notifications (a minimal sketch of this kind of rule follows this entry)
    • How to develop and enhance microservices
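    As a rough illustration of the fraud bullet above, the following sketch applies a simple threshold rule to a stream of payment events. The topic names, JSON fields and threshold are assumptions for illustration, not the talk's actual implementation (Python with the confluent-kafka client):

    import json
    from confluent_kafka import Consumer, Producer

    conf = {"bootstrap.servers": "localhost:9092"}  # assumed local broker
    consumer = Consumer({**conf, "group.id": "fraud-check",
                         "auto.offset.reset": "earliest"})
    producer = Producer(conf)
    consumer.subscribe(["payments"])  # assumed input topic

    while True:
        msg = consumer.poll(1.0)
        if msg is None or msg.error():
            continue
        txn = json.loads(msg.value())
        # Flag unusually large transactions for review (assumed rule).
        if txn.get("amount", 0) > 10_000:
            producer.produce("fraud-alerts", value=json.dumps(txn))
            producer.flush()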
  • Apache Kafka® Use Cases for Financial Services Dec 22 2020 3:00 pm UTC 62 mins
    Tom Green, Senior Solutions Architect, Confluent.
    Traditional systems were designed in an era that predates large-scale distributed systems. These systems often lack the ability to scale to meet the needs of the modern data-driven organisation. Adding to this is the accumulation of technologies and the explosion of data, which can result in complex point-to-point integrations where data becomes siloed or separated across the enterprise.

    The demand for fast results and rapid decision-making has driven financial institutions to adopt real-time event streaming and data processing in order to stay on the competitive edge. Apache Kafka and the Confluent Platform are designed to solve the problems associated with traditional systems, providing a modern, distributed architecture and real-time data streaming capability. In addition, these technologies open up a range of use cases for financial services organisations, many of which will be explored in this talk.

    By attending this talk you will develop a new understanding of:
    • How Apache Kafka enables a 360° view of the customer
    • How to provide a backbone for the distribution of trade data
    • How Kafka and Confluent Platform enable you to meet regulatory requirements for trade information, payments and liquidity
    • How to overcome security concerns with SIEM
    • How to integrate mainframe data with event streaming and the cloud
    • How to reduce fraud with real-time fraud processing, fraud analytics and fraud notifications
    • How to develop and enhance microservices
  • Stream me to the Cloud (and back) with Confluent & MongoDB Dec 17 2020 2:15 pm UTC 65 mins
    Gianluca Natali, Confluent & Felix Reichenbach, MongoDB
    Companies collect and store their data in various data stores and use a number of business applications and services to access, analyze and act on that data. Pulling all the data from disparate sources is difficult to manage, inefficient and ineffective at producing results. Event streaming and stream processing change this paradigm. By enabling robust and reactive data pipelines between all your data stores, apps and services, you can make real-time decisions that are critical to your business.


    In this online talk, we’ll explore how and why companies are leveraging Confluent and MongoDB to modernize their architecture and take advantage of the scalability of the cloud and the velocity of streaming. Based upon a sample retail business scenario, we will explain how changes in an on-premises database are streamed via Confluent Cloud to MongoDB Atlas and back (one way to wire such a sink is sketched after this entry).

    Key Learnings:
    • Modernize your architecture without revolutionizing it
    • Stream your data from multiple applications and data centers into the cloud and back
    • Confluent as the central nervous system of your architecture
    • MongoDB Atlas as the flexible and scalable modern data platform, combining data from different sources and powering your frontend applications
    • Why MongoDB and Confluent make such a great combination

    This architectural approach will allow you to dynamically scale the customer-facing frontend, avoid over-provisioning, and enable the development team to rapidly implement new functionality that will differentiate you from your competition.
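    One plausible way to wire the pipeline described above is the MongoDB Kafka sink connector, registered through the Kafka Connect REST API. The Connect host, connection string, database, collection and topic below are assumptions for illustration (Python):

    import json
    import requests

    connector = {
        "name": "mongo-atlas-sink",  # assumed connector name
        "config": {
            "connector.class": "com.mongodb.kafka.connect.MongoSinkConnector",
            "connection.uri": "mongodb+srv://<user>:<pass>@<cluster>.mongodb.net",
            "database": "retail",    # assumed target database
            "collection": "orders",  # assumed target collection
            "topics": "orders",      # assumed Kafka source topic
        },
    }
    # Register the connector with an assumed local Kafka Connect worker.
    resp = requests.post("http://localhost:8083/connectors",
                         headers={"Content-Type": "application/json"},
                         data=json.dumps(connector))
    resp.raise_for_status()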
  • Building a Secure, Tamper-Proof & Scalable Blockchain with AiB’s KafkaBlockchain Dec 16 2020 2:15 pm UTC 54 mins
    Kai Waehner, Technology Evangelist, Confluent & Stephen Reed, CTO & Co-Founder, AiB
    Apache Kafka is an open-source event streaming platform used to complement or replace existing middleware, integrate applications, and build microservice architectures. Used at almost every large company today, it is well understood, battle-tested, highly scalable and reliable.

    Blockchain is a different story. Because of its association with cryptocurrencies like Bitcoin, it is often in the news. But what value does it offer for software architectures? And how does it relate to an integration architecture and an event streaming platform?

    This session explores blockchain use cases and different alternatives such as Hyperledger, Ethereum and a Kafka-native blockchain implementation. We discuss the value blockchain brings to different architectures, and how it can be integrated with the Kafka ecosystem to build a highly scalable and reliable event streaming infrastructure (the hash-chaining idea behind a tamper-evident log is sketched below).
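    To make the tamper-evidence idea concrete, here is a minimal sketch of hash-chaining records on a Kafka topic: each record embeds the SHA-256 hash of the previous record, so any later alteration breaks the chain. This illustrates the general technique only, not AiB's actual KafkaBlockchain implementation; the broker and topic are assumptions (Python):

    import hashlib
    import json
    from confluent_kafka import Producer

    producer = Producer({"bootstrap.servers": "localhost:9092"})
    prev_hash = "0" * 64  # genesis placeholder

    for payload in ({"event": "a"}, {"event": "b"}, {"event": "c"}):
        # Each record carries the hash of its predecessor.
        record = {"payload": payload, "prev_hash": prev_hash}
        encoded = json.dumps(record, sort_keys=True).encode()
        prev_hash = hashlib.sha256(encoded).hexdigest()
        producer.produce("chained-events", value=encoded)
    producer.flush()

    A verifier replays the topic from the beginning and recomputes each hash; the first mismatch pinpoints where the log was altered.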
  • Confluent Control Centre & ksqlDB Dec 8 2020 10:15 pm UTC 32 mins
    Ala Alsharif, Confluent
    Join Ala Alsharif of Confluent for this jam-packed technology-in-practice session, in which you will experience:
    • A demo of Confluent Control Centre
    • A demo of ksqlDB
    • An insight into the ease with which you can build event streaming applications (a minimal ksqlDB call is sketched after this list)
    • An overview of Confluent’s stream processing capability
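    For a flavour of the ksqlDB side, here is a hedged sketch of creating a stream over a Kafka topic through ksqlDB's REST endpoint; the server URL, stream name, schema and topic are assumptions for illustration (Python):

    import requests

    stmt = (
        "CREATE STREAM payments (id VARCHAR, amount DOUBLE) "
        "WITH (KAFKA_TOPIC='payments', VALUE_FORMAT='JSON');"
    )
    resp = requests.post(
        "http://localhost:8088/ksql",  # assumed ksqlDB server address
        headers={"Content-Type": "application/vnd.ksql.v1+json"},
        json={"ksql": stmt, "streamsProperties": {}},
    )
    resp.raise_for_status()
    print(resp.json())  # ksqlDB returns the statement's execution status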
  • Using Confluent & Google Apigee to Enable your Events on Hybrid Deployment Model Dec 2 2020 3:00 pm UTC 58 mins
    Dan Croft, Confluent & Mathilde Fabre, Google Cloud Apigee
    Today’s customers expect channels, whether physical or digital, to blend together into a cohesive connected experience. APIs help enterprises not only meet customer expectations, but also participate in software ecosystems and provide unprecedented opportunities for reach and economies of scale. By attending this online talk hosted by Google Apigee and Confluent, you will learn how APIs allow you to streamline, secure and monetize access to your data and services to deliver a cohesive experience. In addition, you will see just how easy it is to integrate the Confluent Platform with Google Apigee.


    Register now to learn more about:
    • The challenges CIOs face when it comes to getting value out of data
    • How to simplify the capture, generation and consumption of data in a secure and monitored way
    • How to deal with the exponentially growing volume of data, devices and systems
    • Why Apigee offers a reliable and scalable API management platform
    • How APIs touch every stage of the digital journey
    • The value of having a layer of microservices that allows for agile development
  • Confluent Control Centre & ksqlDB Nov 25 2020 12:15 pm UTC 32 mins
    Ala Alsharif, Confluent
    Join Ala Alsharif of Confluent for this jam-packed technology-in-practice session, in which you will experience:
    • A demo of Confluent Control Centre
    • A demo of ksqlDB
    • An insight into the ease with which you can build event streaming applications
    • An overview of Confluent’s stream processing capability
  • Apache Kafka® Use Cases for Financial Services Nov 20 2020 2:00 pm UTC 62 mins
    Tom Green, Senior Solutions Architect, Confluent.
    Traditional systems were designed in an era that predates large-scale distributed systems. These systems often lack the ability to scale to meet the needs of the modern data-driven organisation. Adding to this is the accumulation of technologies and the explosion of data, which can result in complex point-to-point integrations where data becomes siloed or separated across the enterprise.

    The demand for fast results and rapid decision-making has driven financial institutions to adopt real-time event streaming and data processing in order to stay on the competitive edge. Apache Kafka and the Confluent Platform are designed to solve the problems associated with traditional systems, providing a modern, distributed architecture and real-time data streaming capability. In addition, these technologies open up a range of use cases for financial services organisations, many of which will be explored in this talk.

    By attending this talk you will develop a new understanding of:
    • How Apache Kafka enables a 360° view of the customer
    • How to provide a backbone for the distribution of trade data
    • How Kafka and Confluent Platform enable you to meet regulatory requirements for trade information, payments and liquidity
    • How to overcome security concerns with SIEM
    • How to integrate mainframe data with event streaming and the cloud
    • How to reduce fraud with real-time fraud processing, fraud analytics and fraud notifications
    • How to develop and enhance microservices
  • Stream me to the Cloud (and back) with Confluent & MongoDB Nov 19 2020 1:15 pm UTC 65 mins
    Gianluca Natali, Confluent & Felix Reichenbach, MongoDB
    Companies collect and store their data in various data stores and use a number of business applications and services to access, analyze and act on that data. Pulling all the data from disparate sources is difficult to manage, inefficient and ineffective at producing results. Event streaming and stream processing change this paradigm. By enabling robust and reactive data pipelines between all your data stores, apps and services, you can make real-time decisions that are critical to your business.


    In this online talk, we’ll explore how and why companies are leveraging Confluent and MongoDB to modernize their architecture and take advantage of the scalability of the cloud and the velocity of streaming. Based upon a sample retail business scenario, we will explain how changes in an on-premises database are streamed via Confluent Cloud to MongoDB Atlas and back.

    Key Learnings:
    • Modernize your architecture without revolutionizing it
    • Stream your data from multiple applications and data centers into the cloud and back
    • Confluent as the central nervous system of your architecture
    • MongoDB Atlas as the flexible and scalable modern data platform, combining data from different sources and powering your frontend applications
    • Why MongoDB and Confluent make such a great combination

    This architectural approach will allow you to dynamically scale the customer-facing frontend, avoid over-provisioning, and enable the development team to rapidly implement new functionality that will differentiate you from your competition.
  • Becoming an Event-Driven Business in the IoT Age: A Case Study Nov 17 2020 12:00 pm UTC 60 mins
    Dan Croft, Solutions Architect, Confluent
    "According to Gartner, “By the end of 2020 event-sourced, real-time situational awareness will be a required characteristic for 80% of digital business solutions. And 80% of new business ecosystems will require support for event processing.”

    Companies are adopting real-time and event streaming to keep up with business digitisation trends while modernising data architecture and enabling new outcomes for the IoT age.

    In this webinar Confluent discuss:
    - Why event streaming has become so important for business success
    - What it takes to become an event-driven organisation
    - The journey to becoming an event-driven business with Kafka"
    - Compelling customer use cases from a range of industries
    - How the streaming platform has become the backbone of IoT projects within the automotive industry
  • Building a Secure, Tamper-Proof & Scalable Blockchain with AiB’s KafkaBlockchain Nov 15 2020 5:00 pm UTC 54 mins
    Kai Waehner, Technology Evangelist, Confluent & Stephen Reed, CTO & Co-Founder, AiB
    Apache Kafka is an open-source event streaming platform used to complement or replace existing middleware, integrate applications, and build microservice architectures. Used at almost every large company today, it is well understood, battle-tested, highly scalable and reliable.

    Blockchain is a different story. Because of its association with cryptocurrencies like Bitcoin, it is often in the news. But what value does it offer for software architectures? And how does it relate to an integration architecture and an event streaming platform?

    This session explores blockchain use cases and different alternatives such as Hyperledger, Ethereum and a Kafka-native blockchain implementation. We discuss the value blockchain brings to different architectures, and how it can be integrated with the Kafka ecosystem to build a highly scalable and reliable event streaming infrastructure.
  • Apache Kafka Architectures and Fundamentals Nov 11 2020 1:00 pm UTC 37 mins
    Henrik Janzon, Solutions Engineer
    Apache Kafka is a community-developed, distributed event streaming platform capable of handling trillions of events a day. Initially conceived as a messaging queue, Kafka is based on an abstraction of a distributed commit log. Since being created and open-sourced by LinkedIn in 2011, Kafka has quickly evolved from a messaging queue into a full-fledged event streaming platform.

    In this session, Martijn Kieboom, Senior Solutions Engineer at Confluent, explains Apache Kafka’s internal design and architecture. By attending, you will learn why companies like LinkedIn, ING, Domino’s Pizza, Nordea and Royal Bank of Canada are now sending trillions of messages per day to Apache Kafka, and you will come to understand the underlying design that gives Kafka such high throughput.

    This talk provides a comprehensive overview of Kafka architecture and internal functions (a short consumer-group sketch follows this list), including:
    • Topics, partitions and segments
    • The commit log and streams
    • Brokers and broker replication
    • Producer basics
    • Consumers, consumer groups and offsets
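    As a taste of the consumer-side concepts above, this minimal sketch shows a member of a consumer group reading a topic and printing the partition and offset of each record; the broker and topic are assumptions (Python with the confluent-kafka client):

    from confluent_kafka import Consumer

    consumer = Consumer({
        "bootstrap.servers": "localhost:9092",
        "group.id": "analytics",          # members of this group share partitions
        "auto.offset.reset": "earliest",  # where to start with no committed offset
    })
    consumer.subscribe(["page-views"])    # assumed topic

    for _ in range(10):
        msg = consumer.poll(1.0)
        if msg is None or msg.error():
            continue
        # Each record's position is its (partition, offset) pair.
        print(f"partition={msg.partition()} offset={msg.offset()} "
              f"value={msg.value()}")
    consumer.close()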
  • Using Confluent & Google Apigee to Enable your Events on Hybrid Deployment Model Nov 10 2020 4:00 pm UTC 58 mins
    Dan Croft, Confluent & Mathilde Fabre, Google Cloud Apigee
    Today’s customers expect channels, whether physical or digital, to blend together into a cohesive connected experience. APIs help enterprises not only meet customer expectations, but also participate in software ecosystems and provide unprecedented opportunities for reach and economies of scale. By attending this online talk hosted by Google Apigee and Confluent, you will learn how APIs allow you to streamline, secure and monetize access to your data and services to deliver a cohesive experience. In addition, you will see just how easy it is to integrate the Confluent Platform with Google Apigee.


    Register now to learn more about:
    • The challenges CIOs face when it comes to getting value out of data
    • How to simplify the capture, generation and consumption of data in a secure and monitored way
    • How to deal with the exponentially growing volume of data, devices and systems
    • Why Apigee offers a reliable and scalable API management platform
    • How APIs touch every stage of the digital journey
    • The value of having a layer of microservices that allows for agile development
  • Apache Kafka® Use Cases for Financial Services Recorded: Oct 22 2020 62 mins
    Tom Green, Senior Solutions Architect, Confluent.
    Traditional systems were designed in an era that predates large-scale distributed systems. These systems often lack the ability to scale to meet the needs of the modern data-driven organisation. Adding to this is the accumulation of technologies and the explosion of data, which can result in complex point-to-point integrations where data becomes siloed or separated across the enterprise.

    The demand for fast results and rapid decision-making has driven financial institutions to adopt real-time event streaming and data processing in order to stay on the competitive edge. Apache Kafka and the Confluent Platform are designed to solve the problems associated with traditional systems, providing a modern, distributed architecture and real-time data streaming capability. In addition, these technologies open up a range of use cases for financial services organisations, many of which will be explored in this talk.

    By attending this talk you will develop a new understanding of:
    • How Apache Kafka enables a 360° view of the customer
    • How to provide a backbone for the distribution of trade data
    • How Kafka and Confluent Platform enable you to meet regulatory requirements for trade information, payments and liquidity
    • How to overcome security concerns with SIEM
    • How to integrate mainframe data with event streaming and the cloud
    • How to reduce fraud with real-time fraud processing, fraud analytics and fraud notifications
    • How to develop and enhance microservices
  • Enabling Smarter Cities and Connected Vehicles with Apache Kafka Recorded: Oct 21 2020 57 mins
    Kai Waehner, Technology Evangelist, Confluent & Rob Cowert, Systems Engineer, Confluent
    Many cities are investing in technologies to transform themselves into smart city environments in which data collection and analysis are used to manage assets and resources efficiently. Modern technology can help connect the right data, at the right time, to the right people, processes and systems. Innovations around smart cities and the Internet of Things give cities the ability to improve motor safety, unify and manage transportation systems and traffic, save energy and provide a better experience for residents.

    By utilizing an event streaming platform like Confluent, cities are able to process data in real time from thousands of sources, such as sensors. By aggregating and analyzing real-time data streams, they can make more informed decisions and fine-tune operations for a positive impact on the everyday challenges cities face (a small windowed-aggregation sketch follows this entry).

    Watch this webinar to learn how to:
    - Overcome challenges for building a smarter city
    - Build a real-time infrastructure to correlate relevant events
    - Connect thousands of devices, machines, and people
    - Leverage open source and fully managed solutions from the Apache Kafka ecosystem
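    As a small illustration of the aggregation mentioned above, the sketch below averages sensor readings over one-minute tumbling windows as events arrive. Plain Python stands in for a stream processor, and the topic and message fields are assumptions (confluent-kafka client):

    import json
    import time
    from collections import defaultdict
    from confluent_kafka import Consumer

    consumer = Consumer({
        "bootstrap.servers": "localhost:9092",
        "group.id": "city-dashboard",
        "auto.offset.reset": "latest",
    })
    consumer.subscribe(["traffic-sensors"])  # assumed topic

    windows = defaultdict(list)  # (sensor_id, window_start) -> readings
    while True:
        msg = consumer.poll(1.0)
        if msg is None or msg.error():
            continue
        event = json.loads(msg.value())
        window_start = int(time.time()) // 60 * 60  # 1-minute tumbling window
        key = (event["sensor_id"], window_start)
        windows[key].append(event["vehicles"])
        avg = sum(windows[key]) / len(windows[key])
        print(f"sensor={key[0]} window={key[1]} avg_vehicles={avg:.1f}")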
  • Stream me to the Cloud (and back) with Confluent & MongoDB Recorded: Oct 18 2020 65 mins
    Gianluca Natali, Confluent & Felix Reichenbach, MongoDB
    Companies collect and store their data in various data stores and use a number of business applications and services to access, analyze and act on that data. Pulling all the data from disparate sources is difficult to manage, inefficient and ineffective at producing results. Event streaming and stream processing change this paradigm. By enabling robust and reactive data pipelines between all your data stores, apps and services, you can make real-time decisions that are critical to your business.


    In this online talk, we’ll explore how and why companies are leveraging Confluent and MongoDB to modernize their architecture and take advantage of the scalability of the cloud and the velocity of streaming. Based upon a sample retail business scenario, we will explain how changes in an on-premises database are streamed via Confluent Cloud to MongoDB Atlas and back.

    Key Learnings:
    • Modernize your architecture without revolutionizing it
    • Stream your data from multiple applications and data centers into the cloud and back
    • Confluent as the central nervous system of your architecture
    • MongoDB Atlas as the flexible and scalable modern data platform, combining data from different sources and powering your frontend applications
    • Why MongoDB and Confluent make such a great combination

    This architectural approach will allow you to dynamically scale the customer-facing frontend, avoid over-provisioning, and enable the development team to rapidly implement new functionality that will differentiate you from your competition.
  • Using Confluent & Google Apigee to Enable your Events on Hybrid Deployment Model Recorded: Oct 15 2020 58 mins
    Dan Croft, Confluent & Mathilde Fabre, Google Cloud Apigee
    Today’s customers expect channels, whether physical or digital, to blend together into a cohesive connected experience. APIs help enterprises not only meet customer expectations, but also participate in software ecosystems and provide unprecedented opportunities for reach and economies of scale. By attending this online talk hosted by Google Apigee and Confluent, you will learn how APIs allow you to streamline, secure and monetize access to your data and services to deliver a cohesive experience. In addition, you will see just how easy it is to integrate the Confluent Platform with Google Apigee.


    Register now to learn more about:
    • The challenges CIOs face when it comes to getting value out of data
    • How to simplify the capture, generation and consumption of data in a secure and monitored way
    • How to deal with the exponentially growing volume of data, devices and systems
    • Why Apigee offers a reliable and scalable API management platform
    • How APIs touch every stage of the digital journey
    • The value of having a layer of microservices that allows for agile development
  • Apache Kafka Architectures and Fundamentals Recorded: Oct 14 2020 37 mins
    Henrik Janzon, Solutions Engineer
    Apache Kafka is a community-developed, distributed event streaming platform capable of handling trillions of events a day. Initially conceived as a messaging queue, Kafka is based on an abstraction of a distributed commit log. Since being created and open-sourced by LinkedIn in 2011, Kafka has quickly evolved from a messaging queue into a full-fledged event streaming platform.

    In this session, Martijn Kieboom, Senior Solutions Engineer at Confluent, explains Apache Kafka’s internal design and architecture. By attending, you will learn why companies like LinkedIn, ING, Domino’s Pizza, Nordea and Royal Bank of Canada are now sending trillions of messages per day to Apache Kafka, and you will come to understand the underlying design that gives Kafka such high throughput.

    This talk provides a comprehensive overview of Kafka architecture and internal functions, including:
    • Topics, partitions and segments
    • The commit log and streams
    • Brokers and broker replication
    • Producer basics
    • Consumers, consumer groups and offsets
  • Confluent Control Centre & ksqlDB Recorded: Oct 13 2020 32 mins
    Ala Alsharif, Confluent
    Join Ala Alsharif of Confluent for this jam-packed technology-in-practice session, in which you will experience:
    • A demo of Confluent Control Centre
    • A demo of ksqlDB
    • An insight into the ease with which you can build event streaming applications
    • An overview of Confluent’s stream processing capability
We provide a central nervous system for streaming real-time data.
Confluent, founded by the creators of open source Apache Kafka®, provides the leading streaming platform that enables enterprises to maximize the value of data. Confluent Platform empowers leaders in industries such as retail, logistics, manufacturing, financial services, technology and media to move data from isolated systems into a real-time data pipeline where they can act on it immediately.

Backed by Benchmark, Index Ventures and Sequoia, Confluent is based in Palo Alto, California. To learn more, please visit www.confluent.io.
