
Confluent

  • Real-Time Data Streaming in the Insurance Industry
    Christian Nicoll, Generali Switzerland + Kai Waehner, Confluent + Christopher Knauf, Attunity Recorded: Dec 13 2018 52 mins
    Speakers: Christian Nicoll, Director of Platform Engineering & Operations, Generali Switzerland + Kai Waehner, Technology Evangelist, Confluent + Christopher Knauf, DACH Sales Director, Attunity

    Insurance companies face the same challenges as other disrupted market segments: changing customer expectations and, with them, the need to differentiate themselves anew as a brand in a challenging market environment. At the same time, the industry operates under very strict regulatory pressure.

    Generali Switzerland, like many market leaders across industries, has understood the power of data to reimagine its markets, customers, products, and business model, and managed this change by building its Connection Platform within one year.

    Christian Nicoll, Director of Platform Engineering & Operations at Generali Switzerland, guides us through their journey of setting up an event-driven architecture to support their digital transformation project.

    Attend this online talk and learn more about:
    -How Generali managed to assemble various components into one platform
    -The architecture of the Generali Connection Platform, including Confluent, Kafka, and Attunity.
    -Their challenges, best practices, and lessons learned
    -Generali’s plans of expanding and scaling the Connection Platform
    -Additional use cases in regulated markets such as retail banking
  • Bridge to Cloud: Using Apache Kafka to Migrate to AWS
    Priya Shivakumar (Confluent) + Konstantine Karantasis (Confluent) + Rohit Pujari (AWS) Recorded: Dec 13 2018 58 mins
    Speakers: Priya Shivakumar, Director of Product, Confluent + Konstantine Karantasis, Software Engineer, Confluent + Rohit Pujari, Partner Solutions Architect, AWS

    Most companies start their cloud journey with a new use case or a new application. Sometimes these applications can run independently in the cloud, but often they need data from the on-premises datacenter. Existing applications will migrate gradually, and will need a strategy and the technology to enable a multi-year migration.

    In this session, we will share how companies around the world are using Confluent Cloud, a fully managed Apache Kafka service, to migrate to AWS. By implementing a central-pipeline architecture using Apache Kafka to sync on-prem and cloud deployments, companies can accelerate migration times and reduce costs.

    In this online talk we will cover:
    •How to take the first step in migrating to AWS
    •How to reliably sync your on-premises applications using a persistent bridge to the cloud
    •How Confluent Cloud can make this daunting task simple, reliable and performant
    •A demo of a hybrid-cloud, multi-region deployment of Apache Kafka
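    The central-pipeline idea above — one ordered pipeline syncing an on-prem cluster with a cloud cluster so both sides stay consistent during a long migration — can be sketched in miniature. This is an illustrative Python sketch only, not Confluent's implementation: in-memory logs stand in for the two Kafka clusters, and the `orders` topic and `BridgeReplicator` name are invented for the example.

    ```python
    class MiniCluster:
        """Toy stand-in for a Kafka cluster: topic name -> append-only log."""
        def __init__(self):
            self.topics = {}

        def produce(self, topic, record):
            self.topics.setdefault(topic, []).append(record)

        def read_from(self, topic, offset):
            return self.topics.get(topic, [])[offset:]

    class BridgeReplicator:
        """Copies records from an on-prem topic to a cloud topic, resuming
        from the last replicated offset, so replication can run continuously
        while applications migrate over time."""
        def __init__(self, source, destination, topic):
            self.source, self.destination, self.topic = source, destination, topic
            self.offset = 0  # last replicated position in the source log

        def poll(self):
            for record in self.source.read_from(self.topic, self.offset):
                self.destination.produce(self.topic, record)
                self.offset += 1

    on_prem, cloud = MiniCluster(), MiniCluster()
    on_prem.produce("orders", {"id": 1})
    on_prem.produce("orders", {"id": 2})

    bridge = BridgeReplicator(on_prem, cloud, "orders")
    bridge.poll()
    on_prem.produce("orders", {"id": 3})  # new data keeps arriving mid-migration
    bridge.poll()  # cloud now holds all three records, in the original order
    ```

    The design point is the offset: because the bridge tracks where it stopped, it can run for months alongside live traffic, which is what makes an incremental, multi-year migration workable.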
  • Apache Kafka: Past, Present and Future
    Jun Rao, Co-founder, Confluent Recorded: Nov 29 2018 62 mins
    In 2010, LinkedIn began developing Apache Kafka®. In 2011, Kafka was released as an Apache open source project. Since then, the use of Kafka has grown rapidly across a variety of businesses; today more than 30% of Fortune 500 companies use Kafka.

    In this 60-minute online talk, Confluent Co-founder Jun Rao will:
    -Explain how Kafka became the predominant publish/subscribe messaging system that it is today
    -Introduce Kafka's most recent additions to its set of enterprise-level features
    -Demonstrate how to evolve your Kafka implementation into a complete real-time streaming data platform that functions as the central nervous system for your organization
  • Event-driven Business: How Leading Companies Are Adopting Streaming Strategies
    John Santaferraro, Research Director, EMA + Lyndon Hedderly, Director of Customer Solutions, Confluent Recorded: Nov 15 2018 61 mins
    With the evolution of data-driven strategies, event-based business models are influential in innovative organizations. These new business models are built around the availability of real-time information on customers, payments and supply chains. As businesses look to expand traditional revenues, sourcing events from enterprise applications, mobile apps, IoT devices and social media in real time becomes essential to staying ahead of the competition.

    Join John Santaferraro, Research Director at leading IT analyst firm Enterprise Management Associates (EMA), and Lyndon Hedderly, Director of Customer Solutions at Confluent, to learn how business and technology leaders are adopting streaming strategies and how the world of streaming data implementations has changed for the better.

    You will also learn how organizations are:
    -Adopting streaming as a strategic decision
    -Using streaming data for a competitive advantage
    -Using real-time processing for their applications
    -Overcoming roadblocks to streaming data
    -Creating business value with a streaming platform
  • Bridge to Cloud: Using Apache Kafka to Migrate to AWS
    Priya Shivakumar (Confluent) + Konstantine Karantasis (Confluent) + Rohit Pujari (AWS) Recorded: Nov 14 2018 58 mins
    Speakers: Priya Shivakumar, Director of Product, Confluent + Konstantine Karantasis, Software Engineer, Confluent + Rohit Pujari, Partner Solutions Architect, AWS

    Most companies start their cloud journey with a new use case or a new application. Sometimes these applications can run independently in the cloud, but often they need data from the on-premises datacenter. Existing applications will migrate gradually, and will need a strategy and the technology to enable a multi-year migration.

    In this session, we will share how companies around the world are using Confluent Cloud, a fully managed Apache Kafka service, to migrate to AWS. By implementing a central-pipeline architecture using Apache Kafka to sync on-prem and cloud deployments, companies can accelerate migration times and reduce costs.

    In this online talk we will cover:
    •How to take the first step in migrating to AWS
    •How to reliably sync your on-premises applications using a persistent bridge to the cloud
    •How Confluent Cloud can make this daunting task simple, reliable and performant
    •A demo of a hybrid-cloud, multi-region deployment of Apache Kafka
  • Apache Kafka® Delivers a Single Source of Truth for The New York Times
    Boerge Svingen, Director of Engineering, The New York Times Recorded: Nov 13 2018 60 mins
    With 3.6 million paid print and digital subscriptions, how did The New York Times remain a leader in an evolving industry that once relied on print? It fundamentally changed its infrastructure at the core to keep up with the new expectations of the digital age and its consumers. Now every piece of content ever published by The New York Times throughout the past 166 years and counting is stored in Apache Kafka®.

    Join The New York Times' Director of Engineering Boerge Svingen to learn how the innovative news giant of America transformed the way it sources content while still maintaining searchability, accuracy and accessibility through a variety of applications and services—all through the power of a real-time streaming platform.

    In this talk, Boerge will:
    -Provide an overview of what the publishing infrastructure used to look like
    -Deep dive into the log-based architecture of The New York Times’ Publishing Pipeline
    -Explain the schema, monolog and skinny log used for storing articles
    -Share challenges and lessons learned
    -Answer live questions submitted by the audience
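    The log-based pattern the talk covers — every piece of content as an event in an ordered log, with downstream systems rebuilding their own views by replaying it — can be illustrated with a short Python sketch. This is only an analogy under stated assumptions: the `Monolog` name comes from the talk's terminology, but the record fields and the two derived views are invented for the example.

    ```python
    class Monolog:
        """Append-only, ordered log of every published asset: the single
        source of truth. Consumers replay it from the start to rebuild state."""
        def __init__(self):
            self._log = []

        def append(self, event):
            self._log.append(event)
            return len(self._log) - 1  # offset of the new event

        def replay(self, from_offset=0):
            yield from self._log[from_offset:]

    # Two independent consumers derive different views from the same log.
    def build_search_index(log):
        # Latest headline wins per article id (updates overwrite).
        return {e["id"]: e["headline"] for e in log.replay()}

    def build_archive(log):
        # Full publication history, preserved in order.
        return [e["id"] for e in log.replay()]

    log = Monolog()
    log.append({"id": "a1", "headline": "Kafka at the Times"})
    log.append({"id": "a2", "headline": "Streams everywhere"})
    log.append({"id": "a1", "headline": "Kafka at the Times (updated)"})  # correction

    index = build_search_index(log)
    archive = build_archive(log)
    ```

    The appeal of the pattern is that a new consumer — a search index, an archive, a recommendation service — can be added years later and simply replay the log from offset 0 to build its state, without any bespoke migration from another system.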
  • Deploying and Operating KSQL
    Nick Dearden, Director of Engineering - Confluent Recorded: Oct 25 2018 57 mins
    In this session, Nick Dearden covers the planning and operation of your KSQL deployment, including under-the-hood architectural details. You will learn about the various deployment models, how to track and monitor your KSQL applications, how to scale in and out and how to think about capacity planning.

    This is part 3 out of 3 in the Empowering Streams through KSQL series.
  • Achieve Sub-Second Analytics on Apache Kafka with Confluent and Imply
    Rachel Pedreschi, Senior Director, Solutions Engineering, Imply.io + Josh Treichel, Partner Solutions Architect, Confluent Recorded: Oct 24 2018 54 mins
    Analytic pipelines running purely on batch processing systems can suffer from hours of data lag, resulting in accuracy issues with analysis and overall decision-making. Join us for a demo to learn how easy it is to integrate your Apache Kafka® streams into Apache Druid (incubating) to provide real-time insights into the data.

    In this online talk, you’ll hear about ingesting your Kafka streams into Imply’s scalable analytic engine and gaining real-time insights via a modern user interface.

    Register now to learn about:

    -The benefits of combining a real-time streaming platform with a comprehensive analytics stack
    -Building an analytics pipeline by integrating Confluent Platform and Imply
    -How KSQL, streaming SQL for Kafka, can easily transform and filter streams of data in real time
    -Querying and visualizing streaming data in Imply
    -Practical ways to implement Confluent Platform and Imply to address common use cases such as analyzing network flows, collecting and monitoring IoT data and visualizing clickstream data

    Confluent Platform, developed by the creators of Kafka, enables the ingest and processing of massive amounts of real-time event data. Imply, the complete analytics stack built on Druid, can ingest, store, query and visualize streaming data from Confluent Platform, enabling end-to-end real-time analytics. Together, Confluent and Imply can provide low latency data delivery, data transform, and data querying capabilities to power a range of use cases.
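    KSQL's role in a pipeline like this is continuous filtering and transformation of a stream before it reaches the analytics store. As a rough illustration only — plain Python generators standing in for KSQL, with invented field names — a filter-and-project step over a clickstream might look like:

    ```python
    def clickstream():
        # Stand-in for a Kafka topic of raw click events.
        yield {"user": "u1", "url": "/pricing", "status": 200, "latency_ms": 31}
        yield {"user": "u2", "url": "/broken", "status": 404, "latency_ms": 12}
        yield {"user": "u3", "url": "/signup", "status": 200, "latency_ms": 87}

    def errors_only(events):
        """Roughly analogous to a KSQL query such as:
        SELECT user, url FROM clicks WHERE status >= 400;"""
        for e in events:
            if e["status"] >= 400:
                yield {"user": e["user"], "url": e["url"]}

    bad_clicks = list(errors_only(clickstream()))  # only the 404 event survives
    ```

    In the real pipeline this transformation runs continuously on the stream, so the analytics engine only ever ingests the filtered, projected records rather than raw events.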
  • Live Coding a KSQL Application
    Nick Dearden, Director of Engineering & Hojjat Jafarpour, KSQL Project Lead - Confluent Recorded: Oct 18 2018 58 mins
    Join us as we build a complete streaming application with KSQL. There will be plenty of hands-on action, plus a description of our thought process and design choices along the way. Look out for advice on best practices and handy tips and tricks as we go.

    This is part 2 out of 3 in the Empowering Streams through KSQL series.
  • Five Trends in Real Time Applications
    David Menninger, SVP and Research Director, Ventana Research + Joanna Schloss, Subject Matter Expert, Confluent Recorded: Oct 17 2018 60 mins
    Can your organization react to customer events as they occur?
    Can your organization detect anomalies before they cause problems?
    Can your organization process streaming data in real time?

    Real time and event-driven architectures are emerging as key components in developing streaming applications. Nearly half of organizations consider it essential to process event data within seconds of its occurrence. Yet less than one third are satisfied with their ability to do so today. In this webinar featuring Dave Menninger of Ventana Research, learn from the firm’s benchmark research about what streaming data is and why it is important. Joanna Schloss also joins to discuss how event-streaming platforms deliver real time actionability on data as it arrives into the business. Join us to hear how other organizations are managing streaming data and how you can adopt and deploy real time processing capabilities.

    In this webinar you will:
    -Get valuable market research data about how other organizations are managing streaming data
    -Learn how real time processing is a key component of a digital transformation strategy
    -Hear real world use cases of streaming data in action
    -Review architectural approaches for adding real time, streaming data capabilities to your applications
