
Real-Time Data Streaming in the Insurance Industry

Speakers: Christian Nicoll, Director of Platform Engineering & Operations, Generali Switzerland + Kai Waehner, Technology Evangelist, Confluent + Christopher Knauf, DACH Sales Director, Attunity

Insurance companies face the same challenges as other disrupted market segments: customer expectations are changing, and brands must differentiate themselves anew in a demanding market environment. At the same time, the industry operates under strict regulatory pressure.

Generali Switzerland, like many market leaders, has understood the power of data to reimagine its markets, customers, products, and business model, and has managed this change by building its Connection Platform within one year.

Christian Nicoll, Director of Platform Engineering & Operations at Generali Switzerland, guides us through the journey of setting up an event-driven architecture to support Generali’s digital transformation project.

Attend this online talk and learn more about:
-How Generali assembled various components into one platform
-The architecture of the Generali Connection Platform, including Confluent, Kafka, and Attunity
-Their challenges, best practices, and lessons learned
-Generali’s plans for expanding and scaling the Connection Platform
-Additional use cases in regulated markets such as retail banking
Recorded Dec 13 2018 52 mins
Presented by
Christian Nicoll, Generali Switzerland + Kai Waehner, Confluent + Christopher Knauf, Attunity

  • No More Silos: Integrating Databases into Apache Kafka® Recorded: Apr 11 2019 57 mins
    Robin Moffatt, Developer Advocate, Confluent
    Companies new and old are all recognizing the importance of a low-latency, scalable, fault-tolerant data backbone, in the form of the Apache Kafka® streaming platform. With Apache Kafka, developers can integrate multiple sources and systems, which enables low-latency analytics, event-driven architectures and the population of multiple downstream systems.

    In this talk, we’ll look at one of the most common integration requirements – connecting databases to Apache Kafka. We’ll consider the concept that all data is a stream of events, including that residing within a database. We’ll look at why we’d want to stream data from a database, including driving applications in Apache Kafka from events upstream. We’ll discuss the different methods for connecting databases to Apache Kafka, and the pros and cons of each. Techniques including change data capture (CDC) and Apache Kafka Connect will be covered, as well as an exploration of the power of KSQL, streaming SQL for Apache Kafka, for performing transformations such as joins on the inbound data.

    Register now to learn:
    •Why databases are just a materialized view of a stream of events
    •The best ways to integrate databases with Apache Kafka
    •Anti-patterns to be aware of
    •The power of KSQL for transforming streams of data in Apache Kafka
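
    To give the Kafka Connect approach covered in the talk a concrete flavor, here is a minimal, hypothetical sketch in Python of registering a JDBC source connector through a Connect worker’s REST API; the hostnames, database, table, and credentials are placeholders, not details from the talk.

        import json
        import requests  # pip install requests

        # Hypothetical JDBC source connector: polls the "customers" table
        # for new rows by primary key and writes them to the topic
        # "shop-customers". All connection details are placeholders.
        connector = {
            "name": "jdbc-source-customers",
            "config": {
                "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
                "connection.url": "jdbc:postgresql://db.example.com:5432/shop",
                "connection.user": "connect_user",
                "connection.password": "secret",
                "table.whitelist": "customers",
                "mode": "incrementing",               # query-based polling
                "incrementing.column.name": "id",
                "topic.prefix": "shop-",
            },
        }

        resp = requests.post(
            "http://connect.example.com:8083/connectors",
            data=json.dumps(connector),
            headers={"Content-Type": "application/json"},
        )
        resp.raise_for_status()

    Query-based polling like this is the simplest of the methods discussed; the talk weighs it against log-based CDC, which also captures deletes and avoids polling latency.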
  • Bridge to Cloud: Using Apache Kafka to Migrate to AWS Recorded: Apr 9 2019 57 mins
    Priya Shivakumar (Confluent) + Konstantine Karantasis (Confluent) + Rohit Pujari (AWS)
    Speakers: Priya Shivakumar, Director of Product, Confluent + Konstantine Karantasis, Software Engineer, Confluent + Rohit Pujari, Partner Solutions Architect, AWS

    Most companies start their cloud journey with a new use case or a new application. Sometimes these applications can run independently in the cloud, but often they need data from the on-premises datacenter. Existing applications will slowly migrate, but they will need a strategy and the technology to enable a multi-year migration.

    In this session, we will share how companies around the world are using Confluent Cloud, a fully managed Apache Kafka service, to migrate to AWS. By implementing a central-pipeline architecture using Apache Kafka to sync on-prem and cloud deployments, companies can accelerate migration times and reduce costs.

    In this online talk we will cover:
    •How to take the first step in migrating to AWS
    •How to reliably sync your on-premises applications using a persistent bridge to cloud
    •How Confluent Cloud can make this daunting task simple, reliable and performant
    •A demo of a hybrid-cloud, multi-region deployment of Apache Kafka
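
    To make the “persistent bridge” idea concrete, here is a minimal, hypothetical sketch of an on-premises Python producer writing to a fully managed cloud cluster; the bootstrap server and API key are placeholders for the values your Confluent Cloud console would supply.

        from confluent_kafka import Producer  # pip install confluent-kafka

        # Connect to the managed cluster over TLS with SASL/PLAIN
        # credentials. All connection values below are placeholders.
        producer = Producer({
            "bootstrap.servers": "pkc-xxxxx.us-east-1.aws.confluent.cloud:9092",
            "security.protocol": "SASL_SSL",
            "sasl.mechanisms": "PLAIN",
            "sasl.username": "<API_KEY>",
            "sasl.password": "<API_SECRET>",
        })

        def on_delivery(err, msg):
            if err is not None:
                print(f"delivery failed: {err}")

        # Events produced on-premises land in the cloud cluster, where
        # newly migrated services can consume them.
        producer.produce("orders", key="order-42", value='{"total": 99.5}',
                         callback=on_delivery)
        producer.flush()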
  • Bridge to Cloud: Using Apache Kafka to Migrate to GCP Recorded: Mar 27 2019 56 mins
    Priya Shivakumar, Director of Product, Confluent + Ryan Lippert, Product Marketing, Google Cloud
    Most companies start their cloud journey with a new use case or a new application. Sometimes these applications can run independently in the cloud, but often they need data from the on-premises datacenter. Existing applications will slowly migrate, but they will need a strategy and the technology to enable a multi-year migration.

    In this session, we will share how companies around the world are using Confluent Cloud, a fully managed Apache Kafka® service, to migrate to Google Cloud Platform. By implementing a central-pipeline architecture using Apache Kafka to sync on-prem and cloud deployments, companies can accelerate migration times and reduce costs.

    Register now to learn:
    -How to take the first step in migrating to GCP
    -How to reliably sync your on-premises applications using a persistent bridge to cloud
    -How Confluent Cloud can make this daunting task simple, reliable and performant
  • Using Apache Kafka to Optimize Real-Time Analytics in Financial Services & IoT Recorded: Mar 20 2019 53 mins
    Peter Simpson, VP Panopticon Streaming Analytics, Datawatch + Tom Underhill, Partner Solutions Architect, Confluent
    When it comes to the fast-paced nature of capital markets and IoT, the ability to analyze data in real time is critical to gaining an edge. It’s not just about the quantity of data you can analyze at once; it’s about the speed, scale, and quality of the data you have at your fingertips.

    Modern streaming data technologies like Apache Kafka and the broader Confluent platform can help detect opportunities and threats in real time. They can improve profitability, yield, and performance. Combining Kafka with Panopticon visual analytics provides a powerful foundation for optimizing your operations.

    Use cases in capital markets include transaction cost analysis (TCA), risk monitoring, surveillance of trading and trader activity, compliance, and optimizing profitability of electronic trading operations. Use cases in IoT include monitoring manufacturing processes, logistics, and connected vehicle telemetry and geospatial data.

    This online talk will include in-depth, practical demonstrations of how Confluent and Panopticon together support several key applications. You will learn:

    -Why Apache Kafka is widely used to improve performance of complex operational systems
    -How Confluent and Panopticon open new opportunities to analyze operational data in real time
    -How to quickly identify and react immediately to fast-emerging trends, clusters, and anomalies
    -How to scale data ingestion and data processing
    -How to build new analytics dashboards in minutes
  • Express Scripts: Driving Digital Transformation from Mainframe to Microservices Recorded: Mar 12 2019 59 mins
    Ankur Kaneria, Principal Architect, Express Scripts + Kevin Petrie, Attunity + Alan Hsia, Confluent
    Express Scripts is reimagining its data architecture to bring best-in-class user experience and provide the foundation of next-generation applications. The challenge lies in the ability to efficiently and cost-effectively access the ever-increasing amount of data.

    This online talk will showcase how Apache Kafka® plays a key role within Express Scripts’ transformation from mainframe to a microservices-based ecosystem, ensuring data integrity between two worlds. It will discuss how change data capture (CDC) technology is leveraged to stream data changes to Confluent Platform, allowing a low-latency data pipeline to be built.

    Watch now to learn:

    -Why Apache Kafka is an ideal data integration platform for microservices
    -How Express Scripts is building cloud-based microservices when the system of record is a relational database residing on an on-premises mainframe
    -How Confluent Platform allows for data integrity between disparate platforms and meets real-time SLAs and low-latency requirements
    -How Attunity Replicate software is leveraged to stream data changes to Apache Kafka, allowing you to build a low-latency data pipeline
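
    As an illustration of the pattern, here is a minimal, hypothetical sketch of a microservice rebuilding a local view of a mainframe table from a topic of change events; the topic name and event shape are invented for the example, not taken from Express Scripts’ implementation.

        import json
        from confluent_kafka import Consumer  # pip install confluent-kafka

        consumer = Consumer({
            "bootstrap.servers": "localhost:9092",
            "group.id": "member-view-service",
            "auto.offset.reset": "earliest",  # replay the full change history
        })
        consumer.subscribe(["mainframe.members.changes"])  # placeholder topic

        members = {}  # key -> latest row: a local materialized view
        while True:
            msg = consumer.poll(timeout=1.0)
            if msg is None or msg.error():
                continue
            event = json.loads(msg.value())  # assumed shape: {"op", "key", "row"}
            if event["op"] == "delete":
                members.pop(event["key"], None)
            else:  # insert or update: last write wins
                members[event["key"]] = event["row"]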
  • Modernizing Your Application Architecture with Microservices Recorded: Feb 28 2019 39 mins
    Joe deBuzna, VP Field Engineering, HVR + Chong Yan, Solution Architect, Confluent
    Organizations are quickly adopting microservice architectures to achieve better customer service and improve user experience while limiting downtime and data loss. However, transitioning from a monolithic architecture based on stateful databases to truly stateless microservices can be challenging and requires the right set of solutions.

    In this webinar, learn from field experts as they discuss how to convert the data locked in traditional databases into event streams using HVR and Apache Kafka®. They will show you how to implement these solutions through a real-world demo use case of microservice adoption.

    You will learn:

    -How log-based change data capture (CDC) converts database tables into event streams
    -How Kafka serves as the central nervous system for microservices
    -How the transition to microservices can be realized without throwing away your legacy infrastructure
  • Bridge to Cloud: Using Apache Kafka to Migrate to GCP Recorded: Feb 14 2019 57 mins
    Priya Shivakumar, Director of Product, Confluent + Ryan Lippert, Product Marketing, Google Cloud
    Most companies start their cloud journey with a new use case or a new application. Sometimes these applications can run independently in the cloud, but often they need data from the on-premises datacenter. Existing applications will slowly migrate, but they will need a strategy and the technology to enable a multi-year migration.

    In this session, we will share how companies around the world are using Confluent Cloud, a fully managed Apache Kafka® service, to migrate to Google Cloud Platform. By implementing a central-pipeline architecture using Apache Kafka to sync on-prem and cloud deployments, companies can accelerate migration times and reduce costs.

    Register now to learn:
    -How to take the first step in migrating to GCP
    -How to reliably sync your on-premises applications using a persistent bridge to cloud
    -How Confluent Cloud can make this daunting task simple, reliable and performant
  • Using Apache Kafka to Optimize Real-Time Analytics in Financial Services & IoT Recorded: Feb 6 2019 54 mins
    Peter Simpson, VP Panopticon Streaming Analytics, Datawatch + Tom Underhill, Partner Solutions Architect, Confluent
    When it comes to the fast-paced nature of capital markets and IoT, the ability to analyze data in real time is critical to gaining an edge. It’s not just about the quantity of data you can analyze at once; it’s about the speed, scale, and quality of the data you have at your fingertips.

    Modern streaming data technologies like Apache Kafka and the broader Confluent platform can help detect opportunities and threats in real time. They can improve profitability, yield, and performance. Combining Kafka with Panopticon visual analytics provides a powerful foundation for optimizing your operations.

    Use cases in capital markets include transaction cost analysis (TCA), risk monitoring, surveillance of trading and trader activity, compliance, and optimizing profitability of electronic trading operations. Use cases in IoT include monitoring manufacturing processes, logistics, and connected vehicle telemetry and geospatial data.

    This online talk will include in-depth, practical demonstrations of how Confluent and Panopticon together support several key applications. You will learn:

    -Why Apache Kafka is widely used to improve performance of complex operational systems
    -How Confluent and Panopticon open new opportunities to analyze operational data in real time
    -How to quickly identify and react immediately to fast-emerging trends, clusters, and anomalies
    -How to scale data ingestion and data processing
    -How to build new analytics dashboards in minutes
  • Real-Time Data Streaming in the Insurance Industry Recorded: Jan 31 2019 51 mins
    Christian Nicoll, Generali Switzerland + Kai Waehner, Confluent + Christopher Knauf, Attunity
    Speakers: Christian Nicoll, Director of Platform Engineering & Operations, Generali Switzerland + Kai Waehner, Technology Evangelist, Confluent + Christopher Knauf, DACH Sales Director, Attunity

    Insurance companies face the same challenges as other disrupted market segments: customer expectations are changing, and brands must differentiate themselves anew in a demanding market environment. At the same time, the industry operates under strict regulatory pressure.

    Generali Switzerland, like many market leaders, has understood the power of data to reimagine its markets, customers, products, and business model, and has managed this change by building its Connection Platform within one year.

    Christian Nicoll, Director of Platform Engineering & Operations at Generali Switzerland, guides us through the journey of setting up an event-driven architecture to support Generali’s digital transformation project.

    Attend this online talk and learn more about:
    -How Generali assembled various components into one platform
    -The architecture of the Generali Connection Platform, including Confluent, Kafka, and Attunity
    -Their challenges, best practices, and lessons learned
    -Generali’s plans for expanding and scaling the Connection Platform
    -Additional use cases in regulated markets such as retail banking
  • Driving Business Transformation with Real-Time Analytics Recorded: Jan 16 2019 56 mins
    Nick Dearden, Confluent + John Thuma, Arcadia Data + Thomas Clarke, RCG Global Services
    Digital transformation is more than just a buzzword; it has become a necessity in order to compete in the modern era. At the heart of digital transformation is real-time data. Your organization must respond in real time to every customer experience, transaction, sale, and market movement in order to stay competitive.

    Streaming data technologies like Apache Kafka® and Confluent KSQL, the streaming SQL engine for Apache Kafka, are being used to detect and react to events as they occur. Combining this technology with the analytics insights from RCG and visualizations from Arcadia Data delivers a powerful foundation for driving real time business decisions. Use cases span across industries and include retail transaction cost analysis, automotive maintenance and loyalty program management, and credit card fraud detection.

    Join experts from Confluent, RCG and Arcadia Data for a discussion and demo on how companies are integrating streaming data technologies to transform their business.

    Watch now to learn:

    -Why Apache Kafka is widely used for real-time event monitoring and decisioning
    -How to integrate real-time analytics and visualizations to drive business processes
    -How KSQL, streaming SQL for Kafka, can easily transform and filter streams of data in real time
  • ATM Fraud Detection with Apache Kafka and KSQL Recorded: Jan 16 2019 57 mins
    Robin Moffatt, Developer Advocate, Confluent
    Detecting fraudulent activity in real time can save a business significant amounts of money, but it has traditionally been an area requiring a lot of complex programming and frameworks, particularly at scale. With KSQL, it's possible to build scalable real-time applications using just SQL.

    In this talk, we'll look at:
    -What KSQL is, and how its ability to join streams of events can be used to detect possibly fraudulent activity based on a stream of ATM transactions.
    -How easy it is to integrate Kafka with other systems—both upstream and downstream—using Kafka Connect to stream from a database into Kafka, and from Kafka into Elasticsearch.

    * Join 7000 other Confluent Platform users on our Slack group: https://slackpass.io/confluentcommunity

    * Find the code and instructions to run the entire demo: https://github.com/confluentinc/demo-scene/blob/master/ksql-atm-fraud-detection

    * Get started with KSQL here: https://www.confluent.io/ksql
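
    For a sense of what the demo builds, here is a hypothetical sketch of the kind of windowed join involved, submitted to a local KSQL server over its REST API: two transactions on the same account within ten minutes but at different locations are flagged as possible fraud. Stream and column names are illustrative; the linked repo has the real statements.

        import requests  # pip install requests

        # KSQL at the time could not join a stream to itself, so the
        # demo-style approach registers the same topic under two stream
        # names (ATM_TXNS and ATM_TXNS_02) and joins them.
        ksql = """
        CREATE STREAM ATM_POSSIBLE_FRAUD AS
          SELECT T1.ACCOUNT_ID,
                 T1.ATM_LOCATION AS LOCATION_1,
                 T2.ATM_LOCATION AS LOCATION_2
          FROM ATM_TXNS T1
               INNER JOIN ATM_TXNS_02 T2
                 WITHIN (0 MINUTES, 10 MINUTES)
                 ON T1.ACCOUNT_ID = T2.ACCOUNT_ID
          WHERE T1.TRANSACTION_ID != T2.TRANSACTION_ID
            AND T1.ATM_LOCATION   != T2.ATM_LOCATION;
        """

        resp = requests.post(
            "http://localhost:8088/ksql",  # KSQL server REST endpoint
            json={"ksql": ksql, "streamsProperties": {}},
        )
        resp.raise_for_status()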
  • Achieve Sub-Second Analytics on Apache Kafka with Confluent and Imply Recorded: Jan 10 2019 54 mins
    Rachel Pedreschi, Senior Director, Solutions Engineering, Imply.io + Josh Treichel, Partner Solutions Architect, Confluent
    Analytic pipelines running purely on batch processing systems can suffer from hours of data lag, resulting in accuracy issues with analysis and overall decision-making. Join us for a demo to learn how easy it is to integrate your Apache Kafka® streams in Apache Druid (incubating) to provide real-time insights into the data.

    In this online talk, you’ll hear about ingesting your Kafka streams into Imply’s scalable analytic engine and gaining real-time insights via a modern user interface.

    Register now to learn about:

    -The benefits of combining a real-time streaming platform with a comprehensive analytics stack
    -Building an analytics pipeline by integrating Confluent Platform and Imply
    -How KSQL, streaming SQL for Kafka, can easily transform and filter streams of data in real time
    -Querying and visualizing streaming data in Imply
    -Practical ways to implement Confluent Platform and Imply to address common use cases such as analyzing network flows, collecting and monitoring IoT data and visualizing clickstream data

    Confluent Platform, developed by the creators of Kafka, enables the ingest and processing of massive amounts of real-time event data. Imply, the complete analytics stack built on Druid, can ingest, store, query and visualize streaming data from Confluent Platform, enabling end-to-end real-time analytics. Together, Confluent and Imply can provide low latency data delivery, data transform, and data querying capabilities to power a range of use cases.
  • Digital Transformation Mindset - More Than Just Technology Recorded: Dec 20 2018 64 mins
    Neil Avery, Confluent + Tapas Chandra, Wells Fargo + Shubho Chatterjee, Independence Blue Cross + Vishal Venkatram, GE Power
    Many enterprises faced with siloed, batch-oriented legacy systems struggle to compete in this new digital-first world. Adhering to the ‘If it’s not broken, don’t fix it’ mentality leaves the door wide open for digital-native challengers to grow and succeed. To stay competitive, your organization must respond in real time to every customer experience, transaction, sale, and market movement. But how do you get there? First, you must change your mindset.

    As streaming platforms become central to data strategies, companies both small and large are re-thinking their enterprise architecture with real-time context at the forefront. Monoliths are evolving into microservices. Datacenters are moving to the cloud. What was once a ‘batch’ mindset is quickly being replaced with stream processing as the demands of the business impose real-time requirements on technology leaders.

    Join Argyle, in partnership with Confluent, in our 2018 CIO Virtual Event: The Digital Transformation Mindset – More Than Just Technology. During the webinar we’ll learn how leading companies across industries make event-driven architectures, built on a streaming platform, central to their business, including:

    • How data strategies and IT initiatives are improving digital customer experiences
    • How executives are reducing risk with real-time monitoring and anomaly detection
    • How organizations are increasing operational agility with microservices and IoT architectures
  • Real-Time Data Streaming in the Insurance Industry Recorded: Dec 13 2018 52 mins
    Christian Nicoll, Generali Switzerland + Kai Waehner, Confluent + Christopher Knauf, Attunity
    Speakers: Christian Nicoll, Director of Platform Engineering & Operations, Generali Switzerland + Kai Waehner, Technology Evangelist, Confluent + Christopher Knauf, DACH Sales Director, Attunity

    Insurance companies face the same challenges as other disrupted market segments: customer expectations are changing, and brands must differentiate themselves anew in a demanding market environment. At the same time, the industry operates under strict regulatory pressure.

    Generali Switzerland, like many market leaders, has understood the power of data to reimagine its markets, customers, products, and business model, and has managed this change by building its Connection Platform within one year.

    Christian Nicoll, Director of Platform Engineering & Operations at Generali Switzerland, guides us through the journey of setting up an event-driven architecture to support Generali’s digital transformation project.

    Attend this online talk and learn more about:
    -How Generali assembled various components into one platform
    -The architecture of the Generali Connection Platform, including Confluent, Kafka, and Attunity
    -Their challenges, best practices, and lessons learned
    -Generali’s plans for expanding and scaling the Connection Platform
    -Additional use cases in regulated markets such as retail banking
  • Bridge to Cloud: Using Apache Kafka to Migrate to AWS Recorded: Dec 13 2018 58 mins
    Priya Shivakumar (Confluent) + Konstantine Karantasis (Confluent) + Rohit Pujari (AWS)
    Speakers: Priya Shivakumar, Director of Product, Confluent + Konstantine Karantasis, Software Engineer, Confluent + Rohit Pujari, Partner Solutions Architect, AWS

    Most companies start their cloud journey with a new use case or a new application. Sometimes these applications can run independently in the cloud, but often they need data from the on-premises datacenter. Existing applications will slowly migrate, but they will need a strategy and the technology to enable a multi-year migration.

    In this session, we will share how companies around the world are using Confluent Cloud, a fully managed Apache Kafka service, to migrate to AWS. By implementing a central-pipeline architecture using Apache Kafka to sync on-prem and cloud deployments, companies can accelerate migration times and reduce costs.

    In this online talk we will cover:
    •How to take the first step in migrating to AWS
    •How to reliably sync your on-premises applications using a persistent bridge to cloud
    •How Confluent Cloud can make this daunting task simple, reliable and performant
    •A demo of a hybrid-cloud, multi-region deployment of Apache Kafka
  • Apache Kafka: Past, Present and Future Recorded: Nov 29 2018 62 mins
    Jun Rao, Co-founder, Confluent
    In 2010, LinkedIn began developing Apache Kafka®. In 2011, Kafka was released as an Apache open source project. Since then, the use of Kafka has grown rapidly in a variety of businesses; today more than 30% of Fortune 500 companies use Kafka.

    In this 60-minute online talk, Confluent Co-founder Jun Rao will:
    -Explain how Kafka became the predominant publish/subscribe messaging system that it is today
    -Introduce Kafka's most recent additions to its set of enterprise-level features
    -Demonstrate how to evolve your Kafka implementation into a complete real-time streaming data platform that functions as the central nervous system for your organization
  • Event-driven Business: How Leading Companies Are Adopting Streaming Strategies Recorded: Nov 15 2018 61 mins
    John Santaferraro, Research Director, EMA + Lyndon Hedderly, Director of Customer Solutions, Confluent
    With the evolution of data-driven strategies, event-based business models are influential in innovative organizations. These new business models are built around the availability of real-time information on customers, payments and supply chains. As businesses look to expand traditional revenues, sourcing events from enterprise applications, mobile apps, IoT devices and social media in real time becomes essential to staying ahead of the competition.

    Join John Santaferraro, Research Director at leading IT analyst firm Enterprise Management Associates (EMA), and Lyndon Hedderly, Director of Customer Solutions at Confluent, to learn how business and technology leaders are adopting streaming strategies and how the world of streaming data implementations has changed for the better.

    You will also learn how organizations are:
    -Adopting streaming as a strategic decision
    -Using streaming data for a competitive advantage
    -Using real-time processing for their applications
    -Overcoming roadblocks to streaming data
    -Creating business value with a streaming platform
  • Bridge to Cloud: Using Apache Kafka to Migrate to AWS Recorded: Nov 14 2018 58 mins
    Priya Shivakumar (Confluent) + Konstantine Karantasis (Confluent) + Rohit Pujari (AWS)
    Speakers: Priya Shivakumar, Director of Product, Confluent + Konstantine Karantasis, Software Engineer, Confluent + Rohit Pujari, Partner Solutions Architect, AWS

    Most companies start their cloud journey with a new use case or a new application. Sometimes these applications can run independently in the cloud, but often they need data from the on-premises datacenter. Existing applications will slowly migrate, but they will need a strategy and the technology to enable a multi-year migration.

    In this session, we will share how companies around the world are using Confluent Cloud, a fully managed Apache Kafka service, to migrate to AWS. By implementing a central-pipeline architecture using Apache Kafka to sync on-prem and cloud deployments, companies can accelerate migration times and reduce costs.

    In this online talk we will cover:
    •How to take the first step in migrating to AWS
    •How to reliably sync your on-premises applications using a persistent bridge to cloud
    •How Confluent Cloud can make this daunting task simple, reliable and performant
    •A demo of a hybrid-cloud, multi-region deployment of Apache Kafka
  • Apache Kafka® Delivers a Single Source of Truth for The New York Times Recorded: Nov 13 2018 60 mins
    Boerge Svingen, Director of Engineering, The New York Times
    With 3.6 million paid print and digital subscriptions, how did The New York Times remain a leader in an evolving industry that once relied on print? It fundamentally changed its infrastructure to keep up with the new expectations of the digital age and its consumers. Now every piece of content ever published by The New York Times throughout the past 166 years and counting is stored in Apache Kafka®.

    Join The New York Times' Director of Engineering Boerge Svingen to learn how the innovative news giant of America transformed the way it sources content while still maintaining searchability, accuracy and accessibility through a variety of applications and services—all through the power of a real-time streaming platform.

    In this talk, Boerge will:
    -Provide an overview of what the publishing infrastructure used to look like
    -Deep dive into the log-based architecture of The New York Times’ Publishing Pipeline
    -Explain the schema, monolog and skinny log used for storing articles
    -Share challenges and lessons learned
    -Answer live questions submitted by the audience
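
    The power of this log-based design is that any new system can be fed by replaying the log from the beginning. Here is a minimal, hypothetical sketch in Python; the topic and consumer names are invented for illustration, not taken from the talk.

        from confluent_kafka import Consumer, TopicPartition

        consumer = Consumer({
            "bootstrap.servers": "localhost:9092",
            "group.id": "search-indexer-v2",
            "enable.auto.commit": False,
        })
        # Start from the very first article in the log, not the latest offset.
        consumer.assign([TopicPartition("published-content", 0, 0)])

        def index_article(key, value):
            """Placeholder: feed one article into the new search index."""
            print(key, len(value or b""))

        while True:
            msg = consumer.poll(timeout=1.0)
            if msg is None or msg.error():
                continue
            index_article(msg.key(), msg.value())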
  • Deploying and Operating KSQL Recorded: Oct 25 2018 57 mins
    Nick Dearden, Director of Engineering - Confluent
    In this session, Nick Dearden covers the planning and operation of your KSQL deployment, including under-the-hood architectural details. You will learn about the various deployment models, how to track and monitor your KSQL applications, how to scale in and out and how to think about capacity planning.

    This is part 3 out of 3 in the Empowering Streams through KSQL series.
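
    As a taste of the operations side, here is a minimal, hypothetical monitoring sketch: KSQL servers that share the same ksql.service.id form one cluster and divide the persistent queries among themselves, and each exposes a REST /info endpoint you can poll for liveness. Hostnames below are placeholders.

        import requests  # pip install requests

        servers = ["http://ksql-1:8088", "http://ksql-2:8088", "http://ksql-3:8088"]

        for url in servers:
            try:
                info = requests.get(f"{url}/info", timeout=5).json()
                print(url, "OK", info)
            except requests.RequestException as exc:
                print(url, "DOWN", exc)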
We provide a central nervous system for streaming real-time data.
Confluent, founded by the creators of open source Apache Kafka®, provides the leading streaming platform that enables enterprises to maximize the value of data. Confluent Platform empowers leaders in industries such as retail, logistics, manufacturing, financial services, technology and media, to move data from isolated systems into a real-time data pipeline where they can act on it immediately.

Backed by Benchmark, Index Ventures and Sequoia, Confluent is based in Palo Alto, California. To learn more, please visit www.confluent.io.
