Express Scripts: Driving Digital Transformation from Mainframe to Microservices

Express Scripts is reimagining its data architecture to deliver a best-in-class user experience and provide the foundation for next-generation applications. The challenge lies in accessing an ever-increasing volume of data efficiently and cost-effectively.

This online talk will showcase how Apache Kafka® plays a key role within Express Scripts’ transformation from mainframe to a microservices-based ecosystem, ensuring data integrity between two worlds. It will discuss how change data capture (CDC) technology is leveraged to stream data changes to Confluent Platform, allowing a low-latency data pipeline to be built.

Watch now to learn:

-Why Apache Kafka is an ideal data integration platform for microservices
-How Express Scripts is building cloud-based microservices when the system of record is a relational database residing on an on-premises mainframe
-How Confluent Platform ensures data integrity between disparate platforms and meets real-time SLAs and low-latency requirements
-How Attunity Replicate software is leveraged to stream data changes to Apache Kafka, allowing you to build a low-latency data pipeline
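The CDC pattern described above can be sketched in a few lines: each change event published to a Kafka topic is applied to downstream state in order, so consumers can rebuild the current table without querying the mainframe. The envelope shape below (op/key/after fields) is a hypothetical Debezium-style format for illustration, not Attunity Replicate's actual output:

```python
import json

def apply_cdc_event(table_state, raw_event):
    """Apply one change event to an in-memory copy of a table.

    The envelope (op/key/after) is a hypothetical Debezium-style
    shape, not Attunity Replicate's actual wire format.
    """
    event = json.loads(raw_event)
    op = event["op"]  # "c"=insert, "u"=update, "d"=delete
    key = event["key"]
    if op in ("c", "u"):
        table_state[key] = event["after"]
    elif op == "d":
        table_state.pop(key, None)
    return table_state

# Replaying a short change stream rebuilds the current table state:
state = {}
for e in [
    '{"op": "c", "key": 1, "after": {"member_id": 1, "plan": "basic"}}',
    '{"op": "u", "key": 1, "after": {"member_id": 1, "plan": "premium"}}',
    '{"op": "c", "key": 2, "after": {"member_id": 2, "plan": "basic"}}',
    '{"op": "d", "key": 2, "after": null}',
]:
    apply_cdc_event(state, e)

print(state)  # {1: {'member_id': 1, 'plan': 'premium'}}
```

Because Kafka preserves ordering within a partition, keying the CDC topic by primary key is what makes this replay deterministic.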
Recorded Jun 11 2019 58 mins
Presented by
Ankur Kaneria, Principal Architect, Express Scripts + Kevin Petrie, Attunity + Alan Hsia, Confluent

  • On Track with Apache Kafka®: Building a Streaming ETL Solution with Rail Data May 29 2020 10:00 pm UTC 58 mins
    Robin Moffatt, Developer Advocate, Confluent
    As data engineers, we frequently need to build scalable systems that work with data from a variety of sources and with varying ingest rates, sizes, and formats. This talk takes an in-depth look at how Apache Kafka can be used to provide a common platform on which to build data infrastructure driving both real-time analytics and event-driven applications.

    Using a public feed of railway data, it will show how to ingest data from message queues such as ActiveMQ with Kafka Connect, as well as from static sources such as S3 and REST endpoints. We'll then see how stream processing can transform the data into a form useful for analytics in tools such as Elasticsearch and Neo4j. The same data will be used to drive a real-time notifications service through Telegram.

    If you're wondering how to build your next scalable data platform, how to reconcile the impedance mismatch between stream and batch, and how to wrangle streams of data—this talk is for you!
  • Spring Kafka Beyond the Basics - Lessons learned on our Kafka Journey at ING Bank Recorded: May 26 2020 57 mins
    Tim van Baarsen
    In this talk, Tim van Baarsen will take you on a journey beyond the basics of Spring Kafka and will share his knowledge, pitfalls and lessons learned based on real-life Kafka projects that are running in production for many years at ING Bank in the Netherlands.
  • Elastically Scaling Kafka Using Confluent Recorded: May 22 2020 60 mins
    Josh Rosenberg, Group Product Marketing, Confluent + Ricardo Ferreira, Developer Advocate, Confluent
    The number of mission-critical apps and the amount of data underpinning them have grown exponentially in the increasingly digital world - with no sign of slowing down. But rigid data architectures slow organizations down, forcing them to spend too much up front on resources they don’t use and causing lag or downtime across their apps. In today’s hypercompetitive digital world, where customer loyalty hinges on the best-performing applications, every millisecond of delay could mean another lost customer.

    Adjusting to the real-time needs of your mission-critical apps is only possible with an architecture that scales elastically. Confluent re-engineered Apache Kafka into an elastically scalable, next-gen event streaming platform that processes real-time data wherever it lives - making it accessible for any budget or use case.

    Register now to learn how to:
    -Quickly deploy Kafka in Confluent Cloud with just a few clicks and elastically scale your workloads
    -Effortlessly connect your critical data sources and sinks to Kafka to build a complete data pipeline for your real-time apps
    -Easily process streaming data with a simple interactive SQL interface using fully-managed KSQL
    -Accelerate the deployment of standardized, self-managed Kafka clusters as cloud-native systems on Kubernetes to achieve elastic scale with Confluent Platform
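As a rough sketch of what the interactive SQL layer does conceptually, a continuous `GROUP BY` over a stream maintains an ever-updating table, emitting a new result as each event arrives. The same idea in plain Python (no KSQL or Kafka involved, purely illustrative):

```python
from collections import defaultdict

def running_group_by(events, key_fn):
    """Maintain a continuously updated count per key, the way a
    streaming query like `SELECT key, COUNT(*) ... GROUP BY key`
    maintains a table from a stream. Pure-Python sketch only."""
    table = defaultdict(int)
    for event in events:
        table[key_fn(event)] += 1
        yield dict(table)  # each event emits an updated snapshot

clicks = [{"page": "/home"}, {"page": "/cart"}, {"page": "/home"}]
snapshots = list(running_group_by(clicks, lambda e: e["page"]))
print(snapshots[-1])  # {'/home': 2, '/cart': 1}
```

The difference in a real deployment is that the "table" is fault-tolerant, distributed, and never finishes: it keeps updating as long as events flow.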
  • Enabling Smarter Cities and Connected Vehicles with Apache Kafka Recorded: May 13 2020 57 mins
    Kai Waehner, Technology Evangelist, Confluent + Rob Cowert, Systems Engineer, Confluent
    Many cities are investing in technologies to transform their cities into smart city environments in which data collection and analysis is utilized to manage assets and resources efficiently. Modern technology can help connect the right data, at the right time, to the right people, processes and systems. Innovations around smart cities and the Internet of Things give cities the ability to improve motor safety, unify and manage transportation systems and traffic, save energy and provide a better experience for the residents.

    By utilizing an event streaming platform, like Confluent, cities are able to process data in real-time from thousands of sources, such as sensors. By aggregating that data and analyzing real-time data streams, more informed decisions can be made and fine-tuned operations developed for a positive impact on everyday challenges faced by cities.

    Watch this webinar to learn how to:
    -Overcome challenges for building a smarter city
    -Build a real time infrastructure to correlate relevant events
    -Connect thousands of devices, machines, and people
    -Leverage open source and fully-managed solutions from the Apache Kafka ecosystem
  • Spring Kafka Beyond the Basics - Lessons learned on our Kafka Journey at ING Bank Recorded: May 10 2020 58 mins
    Tim van Baarsen
    In this talk, Tim van Baarsen will take you on a journey beyond the basics of Spring Kafka and will share his knowledge, pitfalls and lessons learned based on real-life Kafka projects that are running in production for many years at ING Bank in the Netherlands.
  • Simplified Hybrid Cloud Migration with Confluent and Google Cloud Recorded: May 7 2020 55 mins
    Josh Treichel, Confluent + Kir Titievsky, Google Cloud + Oguz Kayral, Unity Technologies
    In uncertain times, business flexibility is at a premium. What is a proper cloud migration strategy? How can you safely prepare and migrate applications to hybrid cloud? Apache Kafka and Confluent provide flexible deployment options for a simplified, multi-cloud migration. Move workloads to Google Cloud Platform without a change of technologies, reducing risk, increasing business options, and optimizing workloads for GCP and hybrid cloud.

    Unity Technologies, a video game and software development company, and creator of the world's leading real-time 3D development platform that reaches nearly 3 billion devices worldwide, recently turned to event stream processing via Apache Kafka and Confluent Platform to migrate to GCP.

    Join Unity, Confluent and GCP to learn how to reduce risk and increase business options with a hybrid cloud strategy.

    Register now to learn:
    -Challenges and considerations with any cloud migration strategy
    -Why building a robust data pipeline across any cloud and on-premises systems, and replicating streaming data from any Kafka cluster to GCP matters
    -How utilizing Google technology including Google ML Engine, Google AutoML and Google BigQuery, in addition to partner ISV services like MongoDB, Snowflake, Datastax, can make a difference in your business
    -What Confluent's enterprise streaming platform offers above and beyond Apache Kafka
    -Why Unity moved to Google Cloud, what drove their migration, and the lessons they learned
  • What’s New in Confluent Platform 5.5 Recorded: May 7 2020 39 mins
    Nick Bryan, Product Marketing Manager, Confluent + David Araujo, Sr. Product Manager, Confluent
    Join the Confluent Product Marketing team as we provide an overview of Confluent Platform 5.5, which makes Apache Kafka and event streaming more broadly accessible to developers with enhancements to data compatibility, multi-language development, and ksqlDB.

    Building an event-driven architecture with Apache Kafka allows you to transition from traditional silos and monolithic applications to modern microservices and event streaming applications. With these benefits has come an increased demand for Kafka developers from a wide range of industries. The Dice Tech Salary Report recently ranked Kafka as the highest-paid technological skill of 2019, a year after ranking it second.

    With Confluent Platform 5.5, we are making it even simpler for developers to connect to Kafka and start building event streaming applications, regardless of their preferred programming languages or the underlying data formats used in their applications.

    This session will cover the key features of this latest release, including:
    -Support for Protobuf and JSON schemas in Confluent Schema Registry and throughout our entire platform
    -Exactly once semantics for non-Java clients
    -Admin functions in REST Proxy (preview)
    -ksqlDB 0.7 and ksqlDB Flow View in Confluent Control Center
  • Bosch Power Tools Enables Real-time Analytics on IoT Event Streams Recorded: May 7 2020 43 mins
    Ralph Debusmann, IoT Solution Architect, Bosch Power Tools + Jeff Bean, Confluent + Igor Canadi, Rockset
    Bosch Power Tools, a division of The Bosch Group, is among the world’s leading providers of power tools, power tool accessories, and measuring tools. To accelerate its IoT initiatives, Bosch uses event streaming with manufacturing data from IoT sensor devices to monitor and respond to inefficiencies in the production system, changes in the supply chain, and product quality reports in the field. In this online talk, Bosch’s Ralph Debusmann outlines their architectural vision for bringing many data streams into a single platform, surrounded by databases that can power complex real-time analytics.
  • Bosch Power Tools Enables Real-time Analytics on IoT Event Streams Recorded: Apr 30 2020 43 mins
    Ralph Debusmann, IoT Solution Architect, Bosch Power Tools + Jeff Bean, Confluent + Igor Canadi, Rockset
    Bosch Power Tools, a division of The Bosch Group, is among the world’s leading providers of power tools, power tool accessories, and measuring tools. To accelerate its IoT initiatives, Bosch uses event streaming with manufacturing data from IoT sensor devices to monitor and respond to inefficiencies in the production system, changes in the supply chain, and product quality reports in the field. In this online talk, Bosch’s Ralph Debusmann outlines their architectural vision for bringing many data streams into a single platform, surrounded by databases that can power complex real-time analytics.
  • Bridge to Cloud: Using Apache Kafka to Migrate to AWS Recorded: Apr 23 2020 57 mins
    Priya Shivakumar, Director of Product, Confluent + Konstantine Karantasis, Software Engineer, Confluent + Rohit Pujari, Partner Solutions Architect, AWS

    Most companies start their cloud journey with a new use case or a new application. Sometimes these applications can run independently in the cloud, but often they need data from the on-premises datacenter. Existing applications will migrate slowly, but will need a strategy and the technology to enable a multi-year migration.

    In this session, we will share how companies around the world are using Confluent Cloud, a fully managed Apache Kafka service, to migrate to AWS. By implementing a central-pipeline architecture using Apache Kafka to sync on-prem and cloud deployments, companies can accelerate migration times and reduce costs.

    In this online talk we will cover:
    •How to take the first step in migrating to AWS
    •How to reliably sync your on-premises applications using a persistent bridge to cloud
    •Learn how Confluent Cloud can make this daunting task simple, reliable and performant
    •See a demo of the hybrid-cloud and multi-region deployment of Apache Kafka
  • Introducing Events and Stream Processing into Nationwide Building Society. Recorded: Apr 15 2020 49 mins
    Rob Jackson, Head of Application Architecture at Nationwide Building Society
    Open Banking regulations compel the UK’s largest banks and building societies to enable their customers to share personal information with other regulated companies securely. As a result, companies such as Nationwide Building Society are re-architecting their processes and infrastructure around customer needs to reduce the risk of losing relevance and the ability to innovate.

    In this online talk, you will learn why, when facing Open Banking regulation and rapidly increasing transaction volumes, Nationwide decided to take load off their back-end systems through real-time streaming of data changes into Apache Kafka®. You will hear how Nationwide started their journey with Apache Kafka®, beginning with the initial use case of creating a real-time data cache using Change Data Capture, Confluent Platform and Microservices. Rob Jackson, Head of Application Architecture, will also cover how Confluent enabled Nationwide to build the stream processing backbone that is being used to re-engineer the entire banking experience including online banking, payment processing and mortgage applications.
  • The Rise of Real-Time Event-Driven Architecture Recorded: Apr 2 2020 34 mins
    Tim Berglund, Sr. Director Developer Experience, Confluent
    Businesses operate in real time, and the software they use is catching up. Rather than processing data only at the end of the day, enterprises are seeking to react to it continuously as the data arrives.

    This is the emerging world of stream processing. Apache Kafka® was built with the vision to become the central nervous system that makes data available in real-time to all the applications that need to use it.

    This talk explains how companies are using the concepts of events and streams to transform their business to meet the demands of this digital future and how Apache Kafka® serves as a foundation to streaming data applications.
  • What's New in Confluent Platform 5.4 Recorded: Mar 26 2020 56 mins
    Mau Barra, Group Manager, Product Marketing, Confluent
    Join the Confluent Product team as we provide a technical overview of Confluent Platform 5.4, which delivers groundbreaking enhancements in the areas of security, disaster recovery and scalability.

    Building an event-driven architecture with Apache Kafka allows you to transition from traditional silos and monolithic applications to modern microservices and event streaming applications.

    However, large enterprises need to complement Kafka with foundational architectural attributes required for production, along with tools that help them run Kafka efficiently at scale.

    This session will cover the key features of this latest release, including:
    -Role-Based Access Control
    -Structured Audit Logs
    -Multi-Region Clusters
    -Schema Validation
    -Tiered Storage
  • Bosch Power Tools Enables Real-time Analytics on IoT Event Streams Recorded: Feb 27 2020 44 mins
    Ralph Debusmann, IoT Solution Architect, Bosch Power Tools + Jeff Bean, Confluent + Igor Canadi, Rockset
    Bosch Power Tools, a division of The Bosch Group, is among the world’s leading providers of power tools, power tool accessories, and measuring tools. To accelerate its IoT initiatives, Bosch uses event streaming with manufacturing data from IoT sensor devices to monitor and respond to inefficiencies in the production system, changes in the supply chain, and product quality reports in the field. In this online talk, Bosch’s Ralph Debusmann outlines their architectural vision for bringing many data streams into a single platform, surrounded by databases that can power complex real-time analytics.
  • Building an Event Driven Global Data Fabric with Apache Kafka Recorded: Feb 27 2020 40 mins
    Will LaForest, CTO Confluent Public Sector
    Government agencies are grappling with a growing challenge of distributing data across a geographically diverse set of locations around the US and globally. In order to ensure mission success, data needs to flow to all of these locations rapidly. Additionally, latency, bandwidth and reliability of communication can prove to be a challenge for agencies. A global data fabric is an emerging approach to help connect mission to data across multiple locations and deliver uniformity and consistency at scale.

    This on-demand webinar will cover:
    -An overview of Apache Kafka and how an event streaming platform can support your agency’s mission
    -Considerations around handling varying-quality communication links
    -Synchronous vs. asynchronous data replication
    -New multi-region capabilities in Confluent Platform for a global data fabric
  • Enabling Event Streaming at AO.com Recorded: Feb 19 2020 50 mins
    Jon Vines, AO.com
    Learn how AO.com are enabling real-time event-driven applications to improve customer experience using Confluent Platform.
    The introduction of Apache Kafka and Confluent Platform is supporting AO.com in modernizing its technical approach to delighting customers. A key part of this enablement is the introduction of an event-streaming ecosystem enabling event-driven applications and architecture.
    Throughout this session, we’ll look at the challenges AO.com faced when looking to adopt Kafka, their use of Confluent Platform including Kafka Connect and KSQL, and their adoption of Confluent Cloud. We’ll look at the first steps, where the team is now, and what the future looks like.
  • Enabling Smarter Cities and Connected Vehicles with Apache Kafka Recorded: Feb 18 2020 58 mins
    Kai Waehner, Technology Evangelist, Confluent + Rob Cowert, Systems Engineer, Confluent
    Many cities are investing in technologies to transform their cities into smart city environments in which data collection and analysis is utilized to manage assets and resources efficiently. Modern technology can help connect the right data, at the right time, to the right people, processes and systems. Innovations around smart cities and the Internet of Things give cities the ability to improve motor safety, unify and manage transportation systems and traffic, save energy and provide a better experience for the residents.

    By utilizing an event streaming platform, like Confluent, cities are able to process data in real-time from thousands of sources, such as sensors. By aggregating that data and analyzing real-time data streams, more informed decisions can be made and fine-tuned operations developed for a positive impact on everyday challenges faced by cities.

    Watch this webinar to learn how to:
    -Overcome challenges for building a smarter city
    -Build a real time infrastructure to correlate relevant events
    -Connect thousands of devices, machines, and people
    -Leverage open source and fully-managed solutions from the Apache Kafka ecosystem
  • What's New in Confluent Platform 5.4 Recorded: Feb 6 2020 57 mins
    Mau Barra, Group Manager, Product Marketing, Confluent
    Join the Confluent Product team as we provide a technical overview of Confluent Platform 5.4, which delivers groundbreaking enhancements in the areas of security, disaster recovery and scalability.

    Building an event-driven architecture with Apache Kafka allows you to transition from traditional silos and monolithic applications to modern microservices and event streaming applications.

    However, large enterprises need to complement Kafka with foundational architectural attributes required for production, along with tools that help them run Kafka efficiently at scale.

    This session will cover the key features of this latest release, including:
    -Role-Based Access Control
    -Structured Audit Logs
    -Multi-Region Clusters
    -Schema Validation
    -Tiered Storage
  • SIEM Modernization: Build a Situationally Aware Organization with Apache Kafka® Recorded: Jan 30 2020 35 mins
    Jeffrey Needham, Confluent
    Of all security breaches, 85% are conducted with compromised credentials, often at the administration level or higher. A lot of IT groups think “security” means authentication, authorization and encryption (AAE), but these are often tick-boxes that rarely stop breaches. The internal threat surfaces of data streams or disk drives in a raidset in a data center are not the threat surface of interest.

    Cyber or Threat organizations must conduct internal investigations of IT, subcontractors and supply chains without implicating the innocent. Therefore, they are organizationally air-gapped from IT. Some surveys indicate up to 10% of IT is under investigation at any given time.

    Deploying a signal processing platform, such as Confluent Platform, allows organizations to evaluate data as soon as it becomes available, enabling them to assess and mitigate risk before it arises. In Cyber or Threat Intelligence, events can be considered signals, and when analysts are hunting for threat actors, these don’t appear as a single needle in a haystack but as a series of needles. In this paradigm, streams of signals aggregate into signatures. This session shows how various sub-systems in Apache Kafka can be used to aggregate, integrate and attribute these signals into signatures of interest.

    Watch now to learn:
    -The current threat landscape
    -The difference between Security and Threat Intelligence
    -The value of Confluent Platform as an ideal complement to hardware endpoint detection systems and batch-based SIEM warehouses
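The "series of needles" idea can be illustrated with a toy aggregation: individual signals are bucketed per user and time window, and a window whose signal count crosses a threshold becomes a signature of interest. This is a simplified stand-in for what a stream processor would run continuously over Kafka topics, with invented field names:

```python
from collections import defaultdict

def detect_signatures(events, window_s=60, threshold=3):
    """Group failed-login signals into per-user time windows and flag
    windows whose count crosses a threshold -- a toy stand-in for the
    signal-to-signature aggregation described above."""
    windows = defaultdict(int)
    for e in events:
        bucket = e["ts"] // window_s  # integer window index
        windows[(e["user"], bucket)] += 1
    return {k: n for k, n in windows.items() if n >= threshold}

signals = [
    {"user": "svc_admin", "ts": 10},
    {"user": "svc_admin", "ts": 22},
    {"user": "svc_admin", "ts": 45},
    {"user": "alice", "ts": 50},
]
print(detect_signatures(signals))  # {('svc_admin', 0): 3}
```

A real deployment would use sliding windows and richer attribution, but the shape is the same: many weak signals, keyed and windowed, aggregate into one strong signature.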
  • Apache Kafka Architecture & Fundamentals Explained Recorded: Dec 30 2019 57 mins
    Joe Desmond, Technical Trainer, Confluent
    This session explains Apache Kafka’s internal design and architecture. Companies like LinkedIn are now sending more than 1 trillion messages per day to Apache Kafka. Learn about the underlying design in Kafka that leads to such high throughput.

    This talk provides a comprehensive overview of Kafka architecture and internal functions, including:
    -Topics, partitions and segments
    -The commit log and streams
    -Brokers and broker replication
    -Producer basics
    -Consumers, consumer groups and offsets

    This session is part 2 of 4 in our Fundamentals for Apache Kafka series.
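One design point behind the throughput described above is keyed partitioning: records with the same key land on the same partition, preserving per-key ordering while allowing partitions to be spread across brokers. Kafka's default partitioner hashes keys with murmur2; the sketch below substitutes a simple hash purely for illustration:

```python
def choose_partition(key: bytes, num_partitions: int) -> int:
    """Illustrate Kafka-style keyed partitioning: the same key always
    maps to the same partition, so per-key order holds within that
    partition. Kafka's default partitioner uses murmur2; this sketch
    uses a simple polynomial byte hash instead."""
    h = 0
    for b in key:
        h = (h * 31 + b) % (2**31)
    return h % num_partitions

# The same key always lands on the same partition:
p1 = choose_partition(b"order-42", 6)
p2 = choose_partition(b"order-42", 6)
assert p1 == p2
```

This is also why repartitioning a topic reshuffles key-to-partition assignments: the modulus changes, so existing per-key ordering guarantees only hold going forward.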
We provide a central nervous system for streaming real-time data.
Confluent, founded by the creators of open source Apache Kafka®, provides the leading streaming platform that enables enterprises to maximize the value of data. Confluent Platform empowers leaders in industries such as retail, logistics, manufacturing, financial services, technology and media, to move data from isolated systems into a real-time data pipeline where they can act on it immediately.

Backed by Benchmark, Index Ventures and Sequoia, Confluent is based in Palo Alto, California. To learn more, please visit www.confluent.io.

  • Title: Express Scripts: Driving Digital Transformation from Mainframe to Microservices
  • Live at: Jun 11 2019 9:00 am
  • Presented by: Ankur Kaneria, Principal Architect, Express Scripts + Kevin Petrie, Attunity + Alan Hsia, Confluent