
Healthcare at the Speed of Events

Adoption of event streaming platforms has grown exponentially over the last decade as industry and government alike move toward real-time, event-driven capabilities. Healthcare has been no different, with the approach transforming many aspects of how data is handled to improve the treatment and experience of patients. In this talk we will cover what event streaming is, and how Apache Kafka and Confluent are being applied and adopted in healthcare.

We will discuss real-world examples of how event streaming is helping shape healthcare, including managing benefits, sharing electronic health records, and combating pandemics.
Recorded May 12 2021 44 mins
Presented by
Will LaForest, Public Sector CTO, Confluent

  • Channel profile
  • The Future of Kafka Recorded: Oct 7 2021 56 mins
    Tim Berglund, Senior Director of Developer Experience, Confluent
    What is the future of Kafka beyond messaging? Join Tim Berglund, Head of OSS Kafka Developer Community, to talk about what the future holds for Kafka. This talk will cover some of the concepts and capabilities of ksqlDB, which enables many interesting, forward-thinking use cases. We’ll show how you can apply transformations to a stream of events from one Kafka topic to another. We will also discuss using ksqlDB connectors to bring in data from other systems and using that data to join and enrich streams.
  • Unlock Data by Connecting Confluent Cloud with Azure Cosmos DB Recorded: Sep 23 2021 33 mins
    Nathan Nam, Senior Product Manager, Connectors, Confluent and Abinav Rameesh, Senior Program Manager, Azure Cosmos DB
    Building modern cloud applications requires companies to unlock their data from every aspect of their business with real-time access. Confluent Cloud integrated with Azure Cosmos DB enables companies to automate and integrate data and events across any system, at any scale in near real-time. Connecting Confluent Cloud with Cosmos DB allows companies to streamline infrastructure, increase development velocity, unveil new use cases, and analyze their data.

    Join this webinar with Confluent and Microsoft experts to:
    - Learn how companies are unlocking their data with Kafka
    - Understand integration strategies that you can adopt
    - See a demo of the Cosmos DB connector and how it can safely deliver data and events in real-time
  • Build Fully Managed Data Pipelines with MongoDB Atlas and Confluent Recorded: Sep 21 2021 33 mins
    Robert Walters, Senior Product Manager, Connectors, MongoDB and Nathan Nam, Senior Product Manager, Connectors, Confluent
    Today’s data sources are fast-moving and dispersed, which can leave businesses and engineers struggling to deliver data and applications in real-time. While this can be hard, we know it doesn’t have to be - because we’ve already made it easy.

    With source and sink connectors deployed with just a few clicks, Confluent and MongoDB Atlas are making it simple to stream data from any source directly into MongoDB, regardless of which cloud you use. Our fully managed solution, built on Apache Kafka, brings forward the best of the multi-cloud document database service, MongoDB Atlas, by filling it with real-time data. Together, we can set your data in motion, and with that, help you build fast moving applications enriched with historical context and gain insights that give your business a competitive advantage.

    Join this webinar to learn:

    - What challenges exist with disconnected data sources and legacy systems
    - How businesses are leveraging best-in-class streaming and database technologies to address modern business expectations
    - How to build robust and reactive modern data pipelines and applications with a fully managed service
  • Industry Forum Series - Insurance Recorded: Sep 16 2021 46 mins
    Patrick Druley, Senior Solution Engineer, Confluent
    Insurance companies have always been data centric businesses. Disruption is the new normal as digital native insurance companies gain traction in the market. One challenge facing all property and casualty insurers is the many forms of fraud that raise premium prices. The FBI estimates that around $40 Billion is lost annually* to fraud related to property and casualty insurance. Capturing events in real time and looking through an event driven lens can dramatically impact one's ability to detect and remediate fraud.

    Apache Kafka® is recognized as the world’s leading real-time, fault-tolerant, highly-scalable event streaming platform. It is adopted across thousands of companies worldwide to collect different types of events - member profile updates, claim submissions, etc. - into Kafka in real-time. This architecture enables applications, data platforms and organizations to react to events in real-time. This can improve customer experience, drive revenue and reduce costs across the business.

    In this talk, we’ll discuss the power of events to reimagine your data and how to achieve digital transformation with Apache Kafka and Confluent, in either a self-managed or fully-managed cloud offering. We’ll also look at different types of insurance fraud and how real time event streaming can enable different approaches to reducing them.
  • Legacy Modernization Forum Series - Microservices Recorded: Sep 9 2021 59 mins
    Gaurav Gargate, Senior Director, Engineering, Confluent
    Microservices architecture enables organizations to evolve their systems away from the slow and unresponsive shared-state architectures of the past. Event-driven microservices decouple systems, teams, and products, and streamline application development. Take a deep dive into how you can use Confluent Cloud to explore event-driven architectures and how they are being applied across industries.
  • Preparing for KeyBank's Digital Banking Future with Google and Confluent Recorded: Aug 25 2021 40 mins
    KeyBank, Google, and Confluent
    Hear from KeyBank and how they are leveraging Confluent Platform and Google’s Business Application Platform (Google Cloud’s Anthos and Apigee) to drive ongoing digitization initiatives such as real-time fraud detection, lead management, and microservices.
  • Production Safe Confluent Cloud Recorded: Aug 24 2021 79 mins
    Utkarsh Nadkarni, Solutions Engineering Manager, Confluent
    For many organizations, Apache Kafka® is the backbone and source of truth for data systems across the enterprise. Protecting your event streaming platform is critical for data security and often required by governing bodies. This session will review security categories and the essential features of Kafka and Confluent Platform that enable you to secure your event streaming platform, including:

    - Kafka Brokers and Confluent Servers
    - Role Based Access Control (RBAC)
    - Alerts and Monitoring
    - Data Governance & Compliance
  • Multi-Cloud: Kafka Everywhere Recorded: Aug 19 2021 50 mins
    Fotios Filacours, Sr. Solutions Engineer
    Multi-cloud computing is revolutionizing the way businesses operate, giving organizations the agility, cost efficiency, security, and flexibility to leverage the best components for each use case. Hybrid and multi-cloud are seeing faster adoption than any other technology in this domain. Learn what multi-cloud computing is, complete with definitions and examples, how it works, its pros and cons, and how to start streamlining multi-cloud deployments.
  • Best Practices for Deploying Apps in Confluent Cloud Recorded: Aug 17 2021 66 mins
    Sean Prabba, Solutions Engineer Manager, Confluent
    As a fully managed service available in the biggest cloud providers, including Amazon Web Services (AWS), Microsoft Azure, and Google Cloud Platform (GCP), Confluent Cloud can be self-serve and is deployable within seconds. This session will discuss and illustrate how you can easily point your client applications at Confluent Cloud and let the rest be taken care of: load is automatically distributed across brokers, consumer groups automatically rebalance when a consumer is added or removed, the state stores used by applications built on the Kafka Streams APIs are automatically backed up to Confluent Cloud with changelog topics, and failures are automatically mitigated.

    Confluent Cloud abstracts away the details of operating the platform: no more choosing instance types, storage options, network optimizations, or number of nodes. It is as elastic as your workload, and you pay only for the resources that you use. In true serverless fashion, you just need to understand your data requirements.
  • Mainframe Offload & Integration (Ep 2: Legacy Modernization Series) Recorded: Aug 12 2021 48 mins
    Kai Waehner, Senior Technology Evangelist, Confluent
    Episode 2 of the Legacy Modernization Forum Series

    Mainframes are still hard at work processing over 70 percent of the world’s most essential computing transactions every day. However, high cost, monolithic architectures and missing experts are key challenges for mainframe applications.

    Come join us and learn how mainframe offloading with Apache Kafka and its ecosystem can be used to keep a more modern data store in real-time sync with the mainframe, enabling microservices and delivering the data to other systems such as data warehouses and search indexes.
  • Bridge to Cloud: Confluent Platform to Confluent Cloud (Episode 2: Cloud Series) Recorded: Aug 10 2021 77 mins
    Justin Lee, Solutions Engineer, Confluent
    Episode 2 of the Confluent Cloud Series

    As companies adopt the cloud, they may discover that migrating to the cloud is not a simple, one-time project - it's a much harder task than building a greenfield cloud-native application. They need to build a seamless bridge between on-prem and cloud deployments.

    Confluent Cloud is fully managed Apache Kafka® with complete, enterprise event streaming components. It accelerates developers building event streaming applications, liberates them from operations burden, and enables mobility between multi-cloud and on-prem deployments.

    By attending this talk, you will learn:

    --What is Bridge to Cloud?
    --How to get started with Confluent's fully managed Kafka service: connecting data systems, processing real-time data streams with KSQL, monitoring performance, and governing data
    --Intro to how to design, configure, manage, and monitor event streaming applications
    --Intro to how to set up and run Confluent Cloud, complete with Bridge-to-Cloud examples
  • How USCIS Powered a Digital Transition to eProcessing with Kafka Recorded: Aug 5 2021 30 mins
    Rob Brown & Robert Cole, US Citizenship and Immigration Services
    Last year, U.S. Citizenship and Immigration Services (USCIS) adopted a new strategy to accelerate our transition to a digital business model. This eProcessing strategy connects previously siloed technology systems to provide a complete digital experience that will shorten decision timelines, increase transparency, and more efficiently handle the 8 million requests for immigration benefits the agency receives each year.

    To pursue this strategy effectively, we had to rethink and overhaul our IT landscape, one that has much in common with those of other large enterprises in both the public and private sectors. We had to move away from antiquated ETL processes and overnight batch processing. And we needed to move away from the jumble of ESB, message queues, and spaghetti-stringed direct connections that were used for interservice communication.

    Today, eProcessing is powered by real-time event streaming with Apache Kafka and Confluent Platform. We are building out our data mesh with microservices, CDC, and an event-driven architecture. This common core platform has reduced the cognitive load on development teams, who can now spend more time on delivering quality code and new features, less on DevSecOps and infrastructure activities. As teams have started to align around this platform, a culture of reusability has grown. We’ve seen a reduction in duplication of effort -- in some cases by up to 50% -- across the organization from case management to risk and fraud.
  • Evolving from Messaging to Event Streaming (Ep 1: Legacy Modernization Series) Recorded: Aug 3 2021 57 mins
    Perry Krol, Manager Solutions Engineering, Confluent | Michael Hussey, Senior Systems Engineer, Confluent
    Episode 1 of the Legacy Modernization Forum Series

    In the face of heightened demands for greater scalability, improved resiliency and continuous stream processing, traditional messaging and legacy architectures are being disrupted by real-time event streaming and its ability to meet these new demands.

    Join us for an interactive discussion to explore Apache Kafka, the foundation of Confluent Cloud and Confluent Platform, and compare it to traditional messaging queues.
  • Enabling Insight to Support World-Class Supercomputing Recorded: Jul 29 2021 23 mins
    Stefan Ceballos, Oak Ridge National Laboratory
    The Oak Ridge Leadership Facility (OLCF) in the National Center for Computational Sciences (NCCS) division at Oak Ridge National Laboratory (ORNL) houses world-class high-performance computing (HPC) resources and has a history of operating top-ranked supercomputers on the TOP500 list, including the world's current fastest, Summit, an IBM AC922 machine with a peak of 200 petaFLOPS.

    With the exascale era rapidly approaching, the need for a robust and scalable big data platform for operations data is more important than ever. In the past, when a new HPC resource was added to the facility, pipelines from data sources spanned multiple data sinks, which often resulted in data silos, slow operational data onboarding, and non-scalable data pipelines for batch processing. Using Apache Kafka as the message bus of the division's new big data platform has allowed for easier decoupling of scalable data pipelines, faster data onboarding, and stream processing, with the goal of continuously improving insight into the HPC resources and their supporting systems.

    This talk will focus on the NCCS division's transition to Apache Kafka over the past few years to enhance the OLCF's current capabilities and prepare for Frontier, OLCF's future exascale system, including the development and deployment of a full big data platform in a Kubernetes environment from both a technical and a cultural-shift perspective. It will also cover the mission of the OLCF, the operational data insights related to high-performance computing that the organization strives for, and several use cases that exist in production today.
  • The Race to Real-Time: Real-Time Payments and Event Streaming Recorded: Jul 27 2021 68 mins
    Nidhi Agarwal, Volante Technologies | Brian Pyle, Confluent | Chris Matta, Confluent
    The banking and financial services industries are under pressure to adapt to supporting real-time payments along with new and updated standards like ISO 20022, RTP and FedNow. There are common challenges for banks looking to modernize their legacy payment systems in order to handle real-time payments. These challenges require firms to re-evaluate the tools in their toolbox and build a more modern, flexible and scalable payment system without a big-bang rip and replace.

    The brains of the operation is the intelligence: the logic of encoding, decoding, and transforming payment types and formats, and of orchestrating interactions with bank systems, so that massive payment volumes can be processed with sub-second SLAs. These systems should be flexible enough to model processes that may be unique to every organization and to facilitate the building of value-added payment services.

    The delivery system needs to be able to reliably move data from multiple channels and multiple technologies (files, RDBMS, message queues, etc.) into and out of the brains of the operation to process real-time payments and integrate systems across the bank. The platform needs to be secure, scalable, and 100% reliable by design: the central nervous system, Confluent and Kafka.

    What you will learn by attending:
    --How financial institutions are choosing the right technology in developing their next-generation payment architectures in support of real-time corporate payment processing, on and off the cloud
    --What is critical in deploying the payment solution, and where to start when building your next-generation environment and addressing emerging issues within the banking industry
    --What Apache Kafka® is and why it is quickly becoming the most pervasive technology in core banking transformation

    Agility is vital, so join us to learn how to clear settlements in seconds instead of days and set your company up for success on the real-time payment journey.
  • The Ongoing Disruption of Retail - A Shift to Real-Time Data Streaming Recorded: Jul 20 2021 62 mins
    Kai Waehner, Field CTO, Confluent
    As part of our Confluent Industry Series, we are excited to invite you to join the Retail Forum. In this era of omni-channel retail, consumers are creating more data than ever before. For a retailer, real-time streaming data across the business can help drive insights into customer behaviors and improve business outcomes.

    The future of retail is data, and that future requires the ability to process and use data events in real time. The essence of this effort is event streaming, the continuous, centralized processing of data so it can be used to trigger all kinds of events in real time.

    In this session, industry luminary and author Kai Waehner will share real-world use cases and architectures from Walmart and other enterprises for Supply Chain Management (SCM) optimization, along with industry insights and best practices. Confluent delivers an event streaming platform that helps retailers connect all their data in real time.
  • Applied Apache Kafka for Government: Cyber Recorded: Jul 15 2021 59 mins
    Bert Hayes, Solutions Engineer, Confluent
    Learn how to filter out the noise, cut out the cruft, and focus on the good stuff. In this demo, ksqlDB will be used to enrich, filter, and transform packet data as it is streamed from a Network IDS.
  • The Journey to Cloud Native Kafka: Confluent Cloud (Episode 1: Cloud Series) Recorded: Jul 13 2021 67 mins
    Mayur Mahadeshwar, Senior Solutions Engineer - Financial Services, Confluent
    Episode 1 in the Confluent Cloud Series

    In today’s world we are seeing a big shift towards the Cloud. With this shift to the cloud also comes a big shift in the expectation we have for our software. Turning complex distributed data systems like Apache Kafka into elastically scalable, fully managed services takes a lot of work, because most open source infrastructure simply isn’t built to be cloud native.

    Come learn how Confluent has built cloud-native and SaaS fundamentals for Apache Kafka to create a fully managed service, Confluent Cloud. Furthermore, we will take a deeper look into how companies are adopting Confluent Cloud as part of their cloud strategies.
  • Applied Apache Kafka for Government: Geospatial Recorded: Jun 24 2021 54 mins
    Will LaForest, Public Sector CTO, Confluent
    While the GEOINT domain has adopted Kafka to deliver and process data, performing actual geospatial operations has required integrating external tools. This session will demonstrate the power of geospatial UDFs in making real-time decisions natively in Kafka, and show how KSQL makes it easy to handle events in Kafka that previously lacked a geospatial capability.
  • Cloud-Native Kafka: Simplicity, Scale, Speed & Savings with Confluent Recorded: Jun 15 2021 33 mins
    Priya Shivakumar, Head of Product, Confluent
    How much does it cost your organization to run Apache Kafka®? When you calculate that number, do you account for the impact of downtime? What about the cost of teams managing all that data infrastructure?

    Just as high-growth organizations have realized that data storage and warehousing are more economical and effective in the cloud, the same shift applies to Kafka. To shift the focus back to what matters most—business innovation—organizations must let go of self-managed Kafka.

    In this webinar, you’ll learn about the benefits of leveraging a cloud-native service for Kafka, and how you can lower your total cost of ownership (TCO) by 60% with Confluent Cloud while streamlining your DevOps efforts. Priya Shivakumar, Head of Product, Confluent Cloud, will share two short demos to illustrate how you can:

    - Get started quickly, spinning up Kafka with a few clicks in the cloud
    - Scale elastically and effortlessly, without complex sizing and provisioning
    - Pay for only data streamed, with scale-to-zero pricing and three cluster types to fit any scale, budget, or use case

    Join us to learn how your organization can run at top speed and save money by transitioning your self-managed Kafka deployment to a fully managed, cloud-native Kafka service.
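Several of the talks above ("The Future of Kafka", "Applied Apache Kafka for Government: Cyber") describe the ksqlDB pattern of filtering and enriching a stream of events from one topic into another. As a rough sketch of that pattern outside of Kafka, here is the same filter-and-enrich logic in plain Python; the event fields (`src_ip`, `bytes`), the threshold, and the derived `severity` field are hypothetical examples, not part of any talk.

```python
# Conceptual sketch (plain Python, not ksqlDB): filter a stream of events
# and add a derived column, mirroring a ksqlDB
# "CREATE STREAM ... AS SELECT ... WHERE ..." from one topic to another.

def enrich_and_filter(events, threshold=1000):
    """Keep only high-volume events and tag each with a severity field."""
    out = []
    for event in events:
        if event["bytes"] > threshold:       # WHERE bytes > threshold
            enriched = dict(event)           # copy, don't mutate the input
            enriched["severity"] = "high"    # derived column
            out.append(enriched)
    return out

packets = [
    {"src_ip": "10.0.0.1", "bytes": 4096},
    {"src_ip": "10.0.0.2", "bytes": 128},
]
print(enrich_and_filter(packets))
```

In ksqlDB the same idea runs continuously against a topic rather than once against a list; this snippet only illustrates the shape of the transformation.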
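The "Best Practices for Deploying Apps in Confluent Cloud" abstract notes that you only need to point your client applications at Confluent Cloud. A minimal sketch of what that looks like with the confluent-kafka Python client is below; the bootstrap server and credentials are placeholders, and the producer call itself is shown in comments so the snippet stands alone without a cluster.

```python
# Hypothetical Confluent Cloud connection settings; the property names are
# standard librdkafka / confluent-kafka configuration keys.
conf = {
    "bootstrap.servers": "pkc-xxxxx.us-east-1.aws.confluent.cloud:9092",  # placeholder
    "security.protocol": "SASL_SSL",
    "sasl.mechanisms": "PLAIN",
    "sasl.username": "<API_KEY>",     # placeholder credential
    "sasl.password": "<API_SECRET>",  # placeholder credential
}

# With the client library installed and real credentials, producing is just:
# from confluent_kafka import Producer
# producer = Producer(conf)
# producer.produce("orders", key="order-1", value='{"total": 42}')
# producer.flush()

print(sorted(conf))
```

Everything else the abstract lists (broker load balancing, consumer group rebalancing, changelog-backed state stores) happens on the service side with no further client configuration.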
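The "Applied Apache Kafka for Government: Geospatial" session covers geospatial UDFs for making real-time decisions natively in Kafka. As a sketch of the kind of computation such a UDF performs per event, here is a haversine distance check in plain Python; the coordinates, field names, and geofence radius are made up for illustration.

```python
import math

def geo_distance_km(lat1, lon1, lat2, lon2):
    """Great-circle (haversine) distance in km between two lat/lon points."""
    r = 6371.0  # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def inside_geofence(event, center, radius_km):
    """Hypothetical per-event predicate: is the event within the geofence?"""
    return geo_distance_km(event["lat"], event["lon"], *center) <= radius_km

# Made-up example: an event near Washington, DC against a 25 km geofence.
print(inside_geofence({"lat": 38.9, "lon": -77.0}, (38.8951, -77.0364), 25.0))
```

Inside a KSQL/ksqlDB query, a registered UDF with this logic could appear in a `WHERE` clause to filter a stream down to events inside an area of interest.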
Confluent North America
  • Title: Healthcare at the Speed of Events
  • Live at: May 12 2021 4:00 pm
  • Presented by: Will LaForest, Public Sector CTO, Confluent