
Bridge to Cloud: Using Apache Kafka to Migrate to GCP

Most companies start their cloud journey with a new use case or a new application. Sometimes these applications can run independently in the cloud, but often they need data from the on-premises datacenter. Existing applications will migrate gradually and will need a strategy and the technology to enable a multi-year migration.

In this session, we will share how companies around the world are using Confluent Cloud, a fully managed Apache Kafka® service, to migrate to Google Cloud Platform. By implementing a central-pipeline architecture using Apache Kafka to sync on-prem and cloud deployments, companies can accelerate migration times and reduce costs.

Register now to learn:
-How to take the first step in migrating to GCP
-How to reliably sync your on-premises applications using a persistent bridge to cloud (a sample producer configuration is sketched below)
-How Confluent Cloud can make this daunting task simple, reliable and performant
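
A minimal sketch of the on-premises side of such a bridge, assuming a Java client and a Confluent Cloud cluster secured with SASL/PLAIN API keys; the bootstrap endpoint, credentials and topic name below are placeholders, not values from the talk:

import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class BridgeToCloudProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Placeholder endpoint and credentials -- copy the real values from your
        // Confluent Cloud cluster settings.
        props.put("bootstrap.servers", "YOUR-CLUSTER.confluent.cloud:9092");
        props.put("security.protocol", "SASL_SSL");
        props.put("sasl.mechanism", "PLAIN");
        props.put("sasl.jaas.config",
            "org.apache.kafka.common.security.plain.PlainLoginModule required "
            + "username=\"API_KEY\" password=\"API_SECRET\";");
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // Events produced on-prem land in a cloud topic that GCP-hosted
            // applications can consume immediately.
            producer.send(new ProducerRecord<>("orders", "order-42", "{\"status\":\"created\"}"));
        }
    }
}
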
Recorded Mar 27 2019 56 mins
Presented by
Priya Shivakumar, Director of Product, Confluent + Ryan Lippert, Product Marketing, Google Cloud

  • An Introduction to KSQL & Kafka Streams Processing with Ticketmaster Aug 6 2019 10:00 am UTC 63 mins
    Dani Traphagen, Sr. Systems Engineer, Confluent + Chris Smith, VP Engineering Data Science, Ticketmaster
    In this all too fabulous talk with Ticketmaster, we will be addressing the wonderful and new wonders of KSQL vs. KStreams.

    If you are new-ish to Apache Kafka® you may ask yourself, “What is a large Apache Kafka deployment?” And you may tell yourself, “This is not my beautiful KSQL use case!” And you may tell yourself, “This is not my beautiful KStreams use case!” And you may ask yourself, “What is a beautiful Apache Kafka use case?” And you may ask yourself, “Am I right about this architecture? Am I wrong?” And you may say to yourself, “My God! What have I done?”

    In this session, we’re going to delve into all these issues and more with Chris Smith, VP of Engineering Data Science at Ticketmaster.

    Watch now to learn:
    -Ticketmaster Apache Kafka Architecture
    -KSQL Architecture and Use Cases
    -KSQL Performance Considerations
    -When to KSQL and When to Live the KStream
    -How Ticketmaster uses KSQL and KStreams in production to reduce development friction in machine learning products
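
    As a companion to the KSQL-versus-KStreams discussion above, here is a minimal, hypothetical Kafka Streams topology that filters one topic into another; KSQL can express the same logic as a single CREATE STREAM ... AS SELECT statement. Topic names and the JSON field are invented for illustration.

      import java.util.Properties;
      import org.apache.kafka.common.serialization.Serdes;
      import org.apache.kafka.streams.KafkaStreams;
      import org.apache.kafka.streams.StreamsBuilder;
      import org.apache.kafka.streams.StreamsConfig;
      import org.apache.kafka.streams.kstream.KStream;

      public class VipPurchaseFilter {
          public static void main(String[] args) {
              Properties props = new Properties();
              props.put(StreamsConfig.APPLICATION_ID_CONFIG, "vip-purchase-filter");
              props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
              props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
              props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

              StreamsBuilder builder = new StreamsBuilder();
              KStream<String, String> purchases = builder.stream("ticket-purchases");
              // Keep only VIP purchases and write them to a derived topic.
              purchases.filter((key, value) -> value.contains("\"vip\":true"))
                       .to("vip-purchases");

              new KafkaStreams(builder.build(), props).start();
          }
      }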
  • GCP for Apache Kafka® Users: Stream Ingestion and Processing Jul 30 2019 9:00 am UTC 59 mins
    Ricardo Ferreira, Developer Advocate, Confluent + Karthi Thyagarajan, Solutions Architect, Google Cloud
    In private and public clouds, stream analytics commonly means stateless processing systems organized around Apache Kafka® or a similar distributed log service. GCP took a somewhat different tack, with Cloud Pub/Sub, Dataflow, and BigQuery, distributing the responsibility for processing among ingestion, processing and database technologies.

    We compare the two approaches to data integration and show how Dataflow allows you to join, transform, and deliver data streams among on-prem and cloud Apache Kafka clusters, Cloud Pub/Sub topics and a variety of databases. The session will have a mix of architectural discussions and practical code reviews of Dataflow-based pipelines.
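
    A rough sketch of the Dataflow side, assuming the standard Apache Beam KafkaIO and PubsubIO connectors: the pipeline reads a Kafka topic and forwards each record's value to a Cloud Pub/Sub topic. Broker address, topic and project names are placeholders.

      import org.apache.beam.sdk.Pipeline;
      import org.apache.beam.sdk.io.gcp.pubsub.PubsubIO;
      import org.apache.beam.sdk.io.kafka.KafkaIO;
      import org.apache.beam.sdk.options.PipelineOptionsFactory;
      import org.apache.beam.sdk.transforms.Values;
      import org.apache.kafka.common.serialization.StringDeserializer;

      public class KafkaToPubSub {
          public static void main(String[] args) {
              Pipeline pipeline = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

              pipeline
                  // Read records from an on-prem or cloud Apache Kafka cluster.
                  .apply(KafkaIO.<String, String>read()
                      .withBootstrapServers("broker-1:9092")
                      .withTopic("events")
                      .withKeyDeserializer(StringDeserializer.class)
                      .withValueDeserializer(StringDeserializer.class)
                      .withoutMetadata())
                  // Drop the keys and publish the values to Cloud Pub/Sub.
                  .apply(Values.<String>create())
                  .apply(PubsubIO.writeStrings().to("projects/my-project/topics/events"));

              pipeline.run();
          }
      }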
  • Building Event-Driven Applications with Apache Kafka & Confluent Platform Jul 24 2019 10:00 am UTC 43 mins
    Viktor Gamov, Developer Advocate, Confluent
    Apache Kafka® has become the de facto technology for real-time event streaming. Confluent Platform, developed by the creators of Apache Kafka, is an event-streaming platform that enables the ingestion and processing of massive amounts of data in real time.

    In this session, we will cover the easiest ways to start developing event-driven applications with Apache Kafka using Confluent Platform. We will also demo a contextual event-driven application built using our ecosystem of connectors, REST proxy, and a variety of native clients.

    Register now to learn:
    -How to create Apache Kafka topics in minutes and process event streams in real time (a short AdminClient sketch follows this list)
    -How to check the health of an Apache Kafka broker using Confluent Control Center
    -The latest enhancements to Confluent Platform that make it easier to run Apache Kafka at scale
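
    As a small illustration of the first point, the sketch below creates a topic programmatically with the Kafka AdminClient; the topic name, partition count and replication factor are example values only.

      import java.util.Collections;
      import java.util.Properties;
      import org.apache.kafka.clients.admin.AdminClient;
      import org.apache.kafka.clients.admin.NewTopic;

      public class CreatePageViewsTopic {
          public static void main(String[] args) throws Exception {
              Properties props = new Properties();
              props.put("bootstrap.servers", "localhost:9092");

              try (AdminClient admin = AdminClient.create(props)) {
                  // Three partitions, replication factor 3 -- adjust for your cluster.
                  NewTopic topic = new NewTopic("page-views", 3, (short) 3);
                  admin.createTopics(Collections.singletonList(topic)).all().get();
              }
          }
      }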
  • End-to-End Integration from the IoT Edge to Confluent Cloud Recorded: Jul 16 2019 28 mins
    Kai Waehner, Technology Evangelist, Confluent + Konstantin Karantasis, Software Engineer, Confluent
    This interactive whiteboard presentation discusses use cases leveraging the Apache Kafka® open source ecosystem as a streaming platform to process IoT data. The session shows architectural alternatives of how devices like cars, machines or mobile devices connect to Apache Kafka via IoT standards like MQTT or OPC-UA.

    Learn how to analyze the IoT data either natively on Apache Kafka with Kafka Streams / KSQL or other tools leveraging Kafka Connect. Kai Waehner will also discuss the benefits of Confluent Cloud and other tools like Confluent Replicator or MQTT Proxy to build bidirectional real-time integration from the edge to the cloud. A hand-rolled bridge sketch follows the list below.

    Watch now to:
    -Understand end-to-end use cases from different industries where you integrate IoT devices with enterprise IT using open source technologies and standards
    -See how Apache Kafka enables bidirectional end-to-end integration processing from IoT data to various backend applications in the cloud
    -Compare different architectural alternatives and see their benefits and caveats
    -Learn about various standards, APIs and tools of integrating and processing IoT data with different open source components of the Apache Kafka ecosystem
    -Understand the benefits of Confluent Cloud, which provides a highly available and scalable Apache Kafka ecosystem as a managed service
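
    Confluent MQTT Proxy and Kafka Connect handle this integration without custom code; purely to illustrate the data flow, here is a hand-rolled sketch that forwards an MQTT subscription into a Kafka topic, assuming the Eclipse Paho MQTT client. Broker addresses and topic names are placeholders.

      import java.util.Properties;
      import org.apache.kafka.clients.producer.KafkaProducer;
      import org.apache.kafka.clients.producer.ProducerRecord;
      import org.eclipse.paho.client.mqttv3.IMqttDeliveryToken;
      import org.eclipse.paho.client.mqttv3.MqttCallback;
      import org.eclipse.paho.client.mqttv3.MqttClient;
      import org.eclipse.paho.client.mqttv3.MqttMessage;

      public class MqttToKafkaBridge {
          public static void main(String[] args) throws Exception {
              Properties props = new Properties();
              props.put("bootstrap.servers", "localhost:9092");
              props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
              props.put("value.serializer", "org.apache.kafka.common.serialization.ByteArraySerializer");
              KafkaProducer<String, byte[]> producer = new KafkaProducer<>(props);

              MqttClient mqtt = new MqttClient("tcp://localhost:1883", "edge-bridge");
              mqtt.setCallback(new MqttCallback() {
                  public void messageArrived(String topic, MqttMessage message) {
                      // Key each Kafka record by the originating MQTT topic.
                      producer.send(new ProducerRecord<>("iot-telemetry", topic, message.getPayload()));
                  }
                  public void connectionLost(Throwable cause) { }
                  public void deliveryComplete(IMqttDeliveryToken token) { }
              });
              mqtt.connect();
              mqtt.subscribe("sensors/#");
          }
      }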
  • Architecting Microservices Applications with Instant Analytics Recorded: Jul 10 2019 55 mins
    Tim Berglund, Sr. Director Developer Experience, Confluent + Rachel Pedreschi, Worldwide Director of Field Engineering, Imply
    The next-generation architecture for exploring and visualizing event-driven data in real time requires the right technology. Microservices deliver significant deployment and development agility, but raise questions of how data will move between services and how it will be analyzed. This online talk explores how Apache Druid and Apache Kafka® can turn a microservices ecosystem into a distributed real-time application with instant analytics. Apache Kafka and Druid form the backbone of an architecture that meets the demands imposed on the next-generation applications you are building right now. Join industry experts Tim Berglund, Confluent, and Rachel Pedreschi, Imply, as they discuss architecting microservices apps with Druid and Apache Kafka.
  • How to Fail at Kafka Recorded: Jun 27 2019 20 mins
    Pete Godfrey, Systems Engineer, Confluent
    Apache Kafka® is used by thousands of companies across the world, but how difficult is it to operate? Which parameters do you need to set? What can go wrong? This online talk is based on real-world experience of Kafka deployments and explores a collection of common mistakes that are made when running Kafka in production, along with some best practices to avoid them.

    Watch now to learn:

    -How to ensure your Kafka data is never lost
    -How to write code to cope when things go wrong
    -How to ensure data governance between producers and consumers
    -How to monitor your cluster

    Join Apache Kafka expert Pete Godfrey for this engaging talk and delve into best-practice ideas and insights. The producer settings sketched below address the first point above.
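
    A sketch of producer settings commonly used to avoid losing data, with illustrative values; pairing them with broker-side settings such as min.insync.replicas=2 and a replication factor of 3 is typical.

      import java.util.Properties;
      import org.apache.kafka.clients.producer.ProducerConfig;

      public class DurableProducerSettings {
          public static Properties durableProducerProps() {
              Properties props = new Properties();
              props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
              // Wait for all in-sync replicas to acknowledge every write.
              props.put(ProducerConfig.ACKS_CONFIG, "all");
              // Retry transient failures without introducing duplicates or reordering.
              props.put(ProducerConfig.ENABLE_IDEMPOTENCE_CONFIG, "true");
              props.put(ProducerConfig.RETRIES_CONFIG, Integer.MAX_VALUE);
              // Surface a failure to the application if a send cannot complete in time.
              props.put(ProducerConfig.DELIVERY_TIMEOUT_MS_CONFIG, 120000);
              return props;
          }
      }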
  • No More Silos: Integrating Databases into Apache Kafka® Recorded: Jun 25 2019 56 mins
    Robin Moffatt, Developer Advocate, Confluent
    Companies new and old are all recognizing the importance of a low-latency, scalable, fault-tolerant data backbone, in the form of the Apache Kafka® streaming platform. With Apache Kafka, developers can integrate multiple sources and systems, which enables low latency analytics, event-driven architectures and the population of multiple downstream systems.

    In this talk, we’ll look at one of the most common integration requirements – connecting databases to Apache Kafka. We’ll consider the concept that all data is a stream of events, including that residing within a database. We’ll look at why we’d want to stream data from a database, including driving applications in Apache Kafka from events upstream. We’ll discuss the different methods for connecting databases to Apache Kafka, and the pros and cons of each. Techniques including Change-Data-Capture (CDC) and Apache Kafka Connect will be covered, as well as an exploration of the power of KSQL, streaming SQL for Apache Kafka, for performing transformations such as joins on the inbound data.

    Register now to learn:
    •Why databases are just a materialized view of a stream of events
    •The best ways to integrate databases with Apache Kafka
    •Anti-patterns to be aware of
    •The power of KSQL for transforming streams of data in Apache Kafka
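
    To make the "materialized view of a stream of events" idea concrete, here is a small, hypothetical Kafka Streams sketch that folds a CDC change stream (keyed by primary key) into a KTable holding each key's latest state; topic and class names are invented.

      import org.apache.kafka.streams.StreamsBuilder;
      import org.apache.kafka.streams.kstream.KStream;
      import org.apache.kafka.streams.kstream.KTable;

      public class CustomerMaterializedView {
          static void buildTopology(StreamsBuilder builder) {
              // A CDC topic keyed by primary key: each record carries the row's latest state.
              KStream<String, String> changes = builder.stream("customers-cdc");

              // Folding the change stream per key yields a table -- the same view the
              // source database holds, continuously maintained inside Kafka.
              // (builder.table("customers-cdc") is the shorthand for the same thing.)
              KTable<String, String> customers = changes
                  .groupByKey()
                  .reduce((previous, latest) -> latest);
          }
      }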
  • Architecting Microservices Applications with Instant Analytics Recorded: Jun 19 2019 56 mins
    Tim Berglund, Sr. Director Developer Experience, Confluent + Rachel Pedreschi, Worldwide Director of Field Engineering, Imply
    The next-generation architecture for exploring and visualizing event-driven data in real time requires the right technology. Microservices deliver significant deployment and development agility, but raise questions of how data will move between services and how it will be analyzed. This online talk explores how Apache Druid and Apache Kafka® can turn a microservices ecosystem into a distributed real-time application with instant analytics. Apache Kafka and Druid form the backbone of an architecture that meets the demands imposed on the next-generation applications you are building right now. Join industry experts Tim Berglund, Confluent, and Rachel Pedreschi, Imply, as they discuss architecting microservices apps with Druid and Apache Kafka.
  • Connecting Apache Kafka to Cash Recorded: Jun 13 2019 29 mins
    Lyndon Hedderly, Director Customer Solutions, Confluent
    Real-time data has value. But how do you quantify that value in order to create a business case for becoming data- or event-driven? This talk explores why valuing Kafka is important and covers some of the problems in quantifying the value of a data infrastructure platform.

    Despite the challenges, we will explore some examples where we have attributed a quantified monetary amount to Kafka across specific business use cases in retail, banking and automotive.

    Whether organizations are using data to create new business products and services, improve user experiences, increase productivity, or manage risk, we’ll see that fast, interconnected data, or ‘event streaming’, is increasingly important. We will conclude with the five steps to creating a business case around Kafka use cases.
  • Express Scripts: Driving Digital Transformation from Mainframe to Microservices Recorded: Jun 11 2019 58 mins
    Ankur Kaneria, Principal Architect, Express Scripts + Kevin Petrie, Attunity + Alan Hsia, Confluent
    Express Scripts is reimagining its data architecture to bring best-in-class user experience and provide the foundation of next-generation applications. The challenge lies in the ability to efficiently and cost-effectively access the ever-increasing amount of data.

    This online talk will showcase how Apache Kafka® plays a key role within Express Scripts’ transformation from mainframe to a microservices-based ecosystem, ensuring data integrity between two worlds. It will discuss how change data capture (CDC) technology is leveraged to stream data changes to Confluent Platform, allowing a low-latency data pipeline to be built.

    Watch now to learn:

    -Why Apache Kafka is an ideal data integration platform for microservices
    -How Express Scripts is building cloud-based microservices when the system of record is a relational database residing on an on-premises mainframe
    -How Confluent Platform allows for data integrity between disparate platforms and meets real-time SLAs and low-latency requirements
    -How Attunity Replicate software is leveraged to stream data changes to Apache Kafka, allowing you to build a low-latency data pipeline
  • Event Streaming Microservices with Apache Kafka on Kubernetes Recorded: Jun 6 2019 55 mins
    Michael Ng, Product Manager, Confluent + Kamala Dasika, Product Marketing, Cloud Platform and Ecosystem, Pivotal
    Microservices, events, containers, and orchestrators are dominating our vernacular today. As operations teams adapt to support these technologies in production, cloud-native platforms like Pivotal Cloud Foundry and Kubernetes have quickly risen to serve as force multipliers of automation, productivity and value.

    Apache Kafka® provides developers with a critically important component as they build and modernize applications for cloud-native architecture.

    This talk will explore:
    • Why cloud-native platforms and why run Apache Kafka on Kubernetes?
    • What kind of workloads are best suited for this combination?
    • Tips to determine the path forward for legacy monoliths in your application portfolio
    • Demo: Running Apache Kafka as a Streaming Platform on Kubernetes
  • An Introduction to KSQL & Kafka Streams Processing with Ticketmaster Recorded: May 29 2019 63 mins
    Dani Traphagen, Sr. Systems Engineer, Confluent + Chris Smith, VP Engineering Data Science, Ticketmaster
    In this all too fabulous talk with Ticketmaster, we will be addressing the wonderful and new wonders of KSQL vs. KStreams.

    If you are new-ish to Apache Kafka® you may ask yourself, “What is a large Apache Kafka deployment?” And you may tell yourself, “This is not my beautiful KSQL use case!” And you may tell yourself, “This is not my beautiful KStreams use case!” And you may ask yourself, “What is a beautiful Apache Kafka use case?” And you may ask yourself, “Am I right about this architecture? Am I wrong?” And you may say to yourself, “My God! What have I done?”

    In this session, we’re going to delve into all these issues and more with Chris Smith, VP of Engineering Data Science at Ticketmaster.

    Watch now to learn:
    -Ticketmaster Apache Kafka Architecture
    -KSQL Architecture and Use Cases
    -KSQL Performance Considerations
    -When to KSQL and When to Live the KStream
    -How Ticketmaster uses KSQL and KStreams in production to reduce development friction in machine learning products
  • Building Event-Driven Applications with Apache Kafka & Confluent Platform Recorded: May 29 2019 44 mins
    Viktor Gamov, Developer Advocate, Confluent
    Apache Kafka® has become the de facto technology for real-time event streaming. Confluent Platform, developed by the creators of Apache Kafka, is an event-streaming platform that enables the ingestion and processing of massive amounts of data in real time.

    In this session, we will cover the easiest ways to start developing event-driven applications with Apache Kafka using Confluent Platform. We will also demo a contextual event-driven application built using our ecosystem of connectors, REST proxy, and a variety of native clients.

    Register now to learn:
    -How to create Apache Kafka topics in minutes and process event streams in real time
    -How to check the health of an Apache Kafka broker using Confluent Control Center
    -The latest enhancements to Confluent Platform that make it easier to run Apache Kafka at scale
  • Running Apache Kafka® on Kubernetes Recorded: May 23 2019 35 mins
    Hellmar Becker, Senior Sales Engineer, Confluent
    As part of their modernization projects, more and more companies are adopting an event-driven architecture that scales easily with growing data throughput, and with it the open-source solution Apache Kafka®.

    As established as Apache Kafka has become as an event-streaming platform, Kubernetes has in recent years become the de facto standard for managing container environments, at emerging Silicon Valley startups and established large enterprises alike.

    Join our Kafka expert in this online talk to learn:
    -How easily Apache Kafka and Confluent Platform can be deployed and operated on Kubernetes
    -How a microservices-based architecture enables true scalability
    -What benefits additional components such as Schema Registry or KSQL provide
    -How Confluent Cloud, thanks to Kubernetes, makes companies even more flexible and future-proof
  • Introduction to KSQL: Streaming SQL for Apache Kafka® Recorded: May 21 2019 48 mins
    Tom Green, Solution Engineer, Confluent
    Join Tom Green, Solution Engineer at Confluent, for this Lunch and Learn talk covering KSQL. Confluent KSQL is the streaming SQL engine that enables real-time data processing against Apache Kafka®. It provides an easy-to-use, yet powerful interactive SQL interface for stream processing on Kafka, without the need to write code in a programming language such as Java or Python. KSQL is scalable, elastic, fault-tolerant, and it supports a wide range of streaming operations, including data filtering, transformations, aggregations, joins, windowing, and sessionization.

    Watch now to learn:
    -How to query streams using SQL, without writing code (a sketch of submitting a statement to the KSQL REST API follows this list)
    -How KSQL provides automated scalability and out-of-the-box high availability for streaming queries
    -How KSQL can be used to join streams of data from different sources
    -The differences between Streams and Tables in Apache Kafka
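
    As a small, hypothetical example of driving KSQL without writing application code, the sketch below posts a CREATE STREAM statement to a KSQL server's REST endpoint (assumed to be listening on localhost:8088); the stream and column names are placeholders.

      import java.net.URI;
      import java.net.http.HttpClient;
      import java.net.http.HttpRequest;
      import java.net.http.HttpResponse;

      public class KsqlStatementSubmitter {
          public static void main(String[] args) throws Exception {
              // A persistent query: continuously filter one stream into another.
              String body = "{ \"ksql\": \"CREATE STREAM vip_pageviews AS "
                          + "SELECT * FROM pageviews WHERE userid LIKE 'vip%';\", "
                          + "\"streamsProperties\": {} }";

              HttpRequest request = HttpRequest.newBuilder()
                  .uri(URI.create("http://localhost:8088/ksql"))
                  .header("Content-Type", "application/vnd.ksql.v1+json; charset=utf-8")
                  .POST(HttpRequest.BodyPublishers.ofString(body))
                  .build();

              HttpResponse<String> response = HttpClient.newHttpClient()
                  .send(request, HttpResponse.BodyHandlers.ofString());
              System.out.println(response.body());
          }
      }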
  • GCP for Apache Kafka® Users: Stream Ingestion and Processing Recorded: May 21 2019 60 mins
    Ricardo Ferreira, Developer Advocate, Confluent + Karthi Thyagarajan, Solutions Architect, Google Cloud
    In private and public clouds, stream analytics commonly means stateless processing systems organized around Apache Kafka® or a similar distributed log service. GCP took a somewhat different tack, with Cloud Pub/Sub, Dataflow, and BigQuery, distributing the responsibility for processing among ingestion, processing and database technologies.

    We compare the two approaches to data integration and show how Dataflow allows you to join, transform, and deliver data streams among on-prem and cloud Apache Kafka clusters, Cloud Pub/Sub topics and a variety of databases. The session will have a mix of architectural discussions and practical code reviews of Dataflow-based pipelines.
  • Fast Data – Fast Cars: How Apache Kafka Is Revolutionizing the Data World Recorded: Apr 30 2019 57 mins
    David Schmitz, Principal Architect, Audi Electronics Venture GmbH + Kai Waehner, Technology Evangelist, Confluent
    For the automotive industry, as for every other sector, digital transformation amounts to a digital revolution: new market players, new technologies and ever-growing volumes of data create new opportunities as well as new challenges, and demand not only new IT architectures but entirely new ways of thinking.

    60% of Fortune 500 companies rely on the comprehensive distributed streaming platform Apache Kafka® for their data-streaming projects, among them AUDI AG.

    Join this webinar to learn:
    -How Kafka serves as the foundation both for data pipelines and for applications that consume and process real-time data streams
    -How Kafka Connect and Kafka Streams support business-critical applications
    -How Audi used Kafka and Confluent to build a fast-data IoT platform that is revolutionizing the connected-car space
  • Apache Kafka® in the Enterprise: 10 Lessons Recorded: Apr 24 2019 22 mins
    Charalampos Papadopoulos, System Engineer Big Data / Analytics, SVA + Marie Fraune, System Engineer Big Data / Analytics, SVA
    In the digital age, events are everywhere. Companies have realigned their business models and become more dynamic, technology- and service-oriented, which results in complex, event-driven business processes running in real time.

    60% of Fortune 100 companies rely on event-streaming platforms as a foundational technology, and Apache Kafka has established itself as the de facto standard. In this entertaining webinar, Confluent partner SVA summarizes the pitfalls and challenges to watch out for when running Apache Kafka in production.

    Security, retention time and exactly-once semantics are just three of the topics we will cover.
  • An Introduction to KSQL & Kafka Streams Processing with Ticketmaster Recorded: Apr 23 2019 64 mins
    Dani Traphagen, Sr. Systems Engineer, Confluent + Chris Smith, VP Engineering Data Science, Ticketmaster
    In this all too fabulous talk with Ticketmaster, we will be addressing the wonderful and new wonders of KSQL vs. KStreams.

    If you are new-ish to Apache Kafka® you may ask yourself, “What is a large Apache Kafka deployment?” And you may tell yourself, “This is not my beautiful KSQL use case!” And you may tell yourself, “This is not my beautiful KStreams use case!” And you may ask yourself, “What is a beautiful Apache Kafka use case?” And you may ask yourself, “Am I right about this architecture? Am I wrong?” And you may say to yourself, “My God! What have I done?”

    In this session, we’re going to delve into all these issues and more with Chris Smith, VP of Engineering Data Science at Ticketmaster.

    Watch now to learn:
    -Ticketmaster Apache Kafka Architecture
    -KSQL Architecture and Use Cases
    -KSQL Performance Considerations
    -When to KSQL and When to Live the KStream
    -How Ticketmaster uses KSQL and KStreams in production to reduce development friction in machine learning products
  • No More Silos: Integrating Databases into Apache Kafka® Recorded: Apr 11 2019 57 mins
    Robin Moffatt, Developer Advocate, Confluent
    Companies new and old are all recognizing the importance of a low-latency, scalable, fault-tolerant data backbone, in the form of the Apache Kafka® streaming platform. With Apache Kafka, developers can integrate multiple sources and systems, which enables low latency analytics, event-driven architectures and the population of multiple downstream systems.

    In this talk, we’ll look at one of the most common integration requirements – connecting databases to Apache Kafka. We’ll consider the concept that all data is a stream of events, including that residing within a database. We’ll look at why we’d want to stream data from a database, including driving applications in Apache Kafka from events upstream. We’ll discuss the different methods for connecting databases to Apache Kafka, and the pros and cons of each. Techniques including Change-Data-Capture (CDC) and Apache Kafka Connect will be covered, as well as an exploration of the power of KSQL, streaming SQL for Apache Kafka, for performing transformations such as joins on the inbound data.

    Register now to learn:
    •Why databases are just a materialized view of a stream of events
    •The best ways to integrate databases with Apache Kafka
    •Anti-patterns to be aware of
    •The power of KSQL for transforming streams of data in Apache Kafka
We provide a central nervous system for streaming real-time data.
Confluent, founded by the creators of open source Apache Kafka®, provides the leading streaming platform that enables enterprises to maximize the value of data. Confluent Platform empowers leaders in industries such as retail, logistics, manufacturing, financial services, technology and media, to move data from isolated systems into a real-time data pipeline where they can act on it immediately.

Backed by Benchmark, Index Ventures and Sequoia, Confluent is based in Palo Alto, California. To learn more, please visit www.confluent.io.
