
Confluent

  • GCP for Apache Kafka® Users: Stream Ingestion and Processing
    Ricardo Ferreira, Developer Advocate, Confluent + Karthi Thyagarajan, Solutions Architect, Google Cloud Recorded: May 21 2019 60 mins
    In private and public clouds, stream analytics commonly means stateless processing systems organized around Apache Kafka® or a similar distributed log service. GCP took a somewhat different tack, with Cloud Pub/Sub, Dataflow, and BigQuery, distributing the responsibility for processing among ingestion, processing and database technologies.

    We compare the two approaches to data integration and show how Dataflow allows you to join, transform and deliver data streams among on-prem and cloud Apache Kafka clusters, Cloud Pub/Sub topics and a variety of databases. The session will have a mix of architectural discussions and practical code reviews of Dataflow-based pipelines.
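    As a rough illustration of the kind of pipeline reviewed in the session, here is a minimal Apache Beam sketch that reads records from a Kafka topic and forwards their values to a Cloud Pub/Sub topic. The broker address, topic names and project ID are placeholders, not details from the talk:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.pubsub.PubsubIO;
    import org.apache.beam.sdk.io.kafka.KafkaIO;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.transforms.Values;
    import org.apache.kafka.common.serialization.StringDeserializer;

    public class KafkaToPubsub {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        p.apply("ReadFromKafka", KafkaIO.<String, String>read()
                .withBootstrapServers("broker:9092")             // placeholder broker address
                .withTopic("events")                             // placeholder source topic
                .withKeyDeserializer(StringDeserializer.class)
                .withValueDeserializer(StringDeserializer.class)
                .withoutMetadata())                              // keep plain key/value pairs
            .apply("TakeValues", Values.<String>create())        // keep only the record value
            .apply("WriteToPubsub", PubsubIO.writeStrings()
                .to("projects/my-project/topics/events"));       // placeholder Pub/Sub topic

        p.run();
      }
    }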
  • Fast Data – Fast Cars: How Apache Kafka Is Revolutionizing the World of Data
    David Schmitz, Principal Architect, Audi Electronics Venture GmbH + Kai Waehner, Technology Evangelist, Confluent Recorded: Apr 30 2019 57 mins
    For the automotive industry, as for every other sector, digital transformation amounts to a digital revolution: new market players, new technologies and ever-growing volumes of data create new opportunities but also new challenges, and demand not only new IT architectures but entirely new ways of thinking.

    60% of Fortune 500 companies, among them AUDI AG, rely on the comprehensive distributed streaming platform Apache Kafka® to implement their data streaming projects.

    In this webinar you will learn:
    -How Kafka serves as the foundation both for data pipelines and for applications that consume and process real-time data streams
    -How Kafka Connect and Kafka Streams support business-critical applications (see the sketch below)
    -How Audi used Kafka and Confluent to implement a fast data IoT platform that is revolutionizing the connected car space
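    A minimal Kafka Streams sketch of the pattern described above: an application that consumes a stream of events, processes it and writes the results back to Kafka. The topic names and the processing step are illustrative placeholders, not Audi's actual topology:

    import java.util.Properties;
    import org.apache.kafka.common.serialization.Serdes;
    import org.apache.kafka.streams.KafkaStreams;
    import org.apache.kafka.streams.StreamsBuilder;
    import org.apache.kafka.streams.StreamsConfig;
    import org.apache.kafka.streams.kstream.KStream;

    public class TelemetryApp {
      public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "telemetry-app");   // placeholder app id
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "broker:9092");  // placeholder broker
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();
        KStream<String, String> telemetry = builder.stream("car-telemetry");  // placeholder input topic
        telemetry
            .filter((vin, reading) -> reading != null && !reading.isEmpty())  // drop empty readings
            .mapValues(String::toUpperCase)                                   // stand-in for real processing
            .to("telemetry-processed");                                       // placeholder output topic

        new KafkaStreams(builder.build(), props).start();
      }
    }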
  • Apache Kafka® in the Enterprise: 10 Lessons
    Charalampos Papadopoulos, System Engineer Big Data / Analytics, SVA + Marie Fraune, System Engineer Big Data / Analytics, SVA Recorded: Apr 24 2019 22 mins
    In the digital age, events are everywhere. Companies have realigned their business models and become more dynamic, more technology-oriented and more service-oriented, resulting in complex, event-driven business processes that run in real time.

    60% of Fortune 100 companies rely on event streaming platforms as a foundational technology, and Apache Kafka has established itself as the de facto standard. In this entertaining webinar, Confluent partner SVA summarizes the pitfalls and challenges to watch out for when running Apache Kafka in production.

    Security, retention time and exactly-once semantics are just three of the topics we will cover; a minimal configuration sketch for the last of these follows below.
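    As an example of the third topic, exactly-once processing in a Kafka Streams application comes down to a single configuration switch. A sketch, assuming Kafka 0.11 or later; the application id and broker address are placeholders:

    import java.util.Properties;
    import org.apache.kafka.streams.StreamsConfig;

    public class ExactlyOnceConfig {
      public static Properties streamsProps() {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "orders-app");      // placeholder app id
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "broker:9092");  // placeholder broker
        // Exactly-once processing: the producers become transactional, and input
        // offsets are committed atomically together with the produced results.
        props.put(StreamsConfig.PROCESSING_GUARANTEE_CONFIG, StreamsConfig.EXACTLY_ONCE);
        return props;
      }
    }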
  • An Introduction to KSQL & Kafka Streams Processing with Ticketmaster
    Dani Traphagen, Sr. Systems Engineer, Confluent + Chris Smith, VP Engineering Data Science, Ticketmaster Recorded: Apr 23 2019 64 mins
    In this all too fabulous talk with Ticketmaster, we will be addressing the wonderful and new wonders of KSQL vs. KStreams.

    If you are new-ish to Apache Kafka® you may ask yourself, “What is a large Apache Kafka deployment?” And you may tell yourself, “This is not my beautiful KSQL use case!” And you may tell yourself, “This is not my beautiful KStreams use case!” And you may ask yourself, “What is a beautiful Apache Kafka use case?” And you may ask yourself, “Am I right about this architecture? Am I wrong?” And you may say to yourself, “My God! What have I done?”

    In this session, we’re going to delve into all these issues and more with Chris Smith, VP of Engineering Data Science at Ticketmaster.

    Watch now to learn:
    -Ticketmaster Apache Kafka Architecture
    -KSQL Architecture and Use Cases
    -KSQL Performance Considerations
    -When to KSQL and When to Live the KStream
    -How Ticketmaster uses KSQL and KStreams in production to reduce development friction in machine learning products
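    To make the KSQL-vs-KStreams comparison concrete, here is a sketch of the same windowed count expressed both ways. The pageviews topic, its userid key and the output topic are hypothetical, not Ticketmaster's actual pipeline:

    // In KSQL, a per-user count over a one-minute window is a single statement:
    //
    //   CREATE TABLE pageview_counts AS
    //     SELECT userid, COUNT(*) FROM pageviews
    //     WINDOW TUMBLING (SIZE 1 MINUTE)
    //     GROUP BY userid;
    //
    // The equivalent Kafka Streams DSL topology (string default serdes assumed):
    import java.time.Duration;
    import org.apache.kafka.common.serialization.Serdes;
    import org.apache.kafka.streams.StreamsBuilder;
    import org.apache.kafka.streams.kstream.KStream;
    import org.apache.kafka.streams.kstream.KTable;
    import org.apache.kafka.streams.kstream.Produced;
    import org.apache.kafka.streams.kstream.TimeWindows;
    import org.apache.kafka.streams.kstream.Windowed;

    public class PageviewCounts {
      public static void build(StreamsBuilder builder) {
        KStream<String, String> pageviews = builder.stream("pageviews");  // hypothetical topic, keyed by userid
        KTable<Windowed<String>, Long> counts = pageviews
            .groupByKey()
            .windowedBy(TimeWindows.of(Duration.ofMinutes(1)))            // one-minute tumbling windows
            .count();
        counts.toStream((window, count) -> window.key())                  // unwrap the window key
              .to("pageview_counts", Produced.with(Serdes.String(), Serdes.Long()));
      }
    }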
  • No More Silos: Integrating Databases into Apache Kafka®
    Robin Moffatt, Developer Advocate, Confluent Recorded: Apr 11 2019 57 mins
    Companies new and old are all recognizing the importance of a low-latency, scalable, fault-tolerant data backbone, in the form of the Apache Kafka® streaming platform. With Apache Kafka, developers can integrate multiple sources and systems, which enables low-latency analytics, event-driven architectures and the population of multiple downstream systems.

    In this talk, we’ll look at one of the most common integration requirements – connecting databases to Apache Kafka. We’ll consider the concept that all data is a stream of events, including that residing within a database. We’ll look at why we’d want to stream data from a database, including driving applications in Apache Kafka from events upstream. We’ll discuss the different methods for connecting databases to Apache Kafka, and the pros and cons of each. Techniques including change data capture (CDC) and Apache Kafka Connect will be covered, as well as an exploration of the power of KSQL, streaming SQL for Apache Kafka, for performing transformations such as joins on the inbound data.

    Register now to learn:
    •Why databases are just a materialized view of a stream of events
    •The best ways to integrate databases with Apache Kafka
    •Anti-patterns to be aware of
    •The power of KSQL for transforming streams of data in Apache Kafka
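    As a sketch of the Kafka Connect approach, the snippet below registers a Debezium MySQL source connector through the Connect REST API, so that row changes stream into Kafka topics. Host names, credentials and table names are placeholders, and the exact configuration keys vary by Debezium version:

    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;

    public class RegisterCdcConnector {
      public static void main(String[] args) throws Exception {
        String config = """
            {
              "name": "inventory-cdc",
              "config": {
                "connector.class": "io.debezium.connector.mysql.MySqlConnector",
                "database.hostname": "mysql",
                "database.port": "3306",
                "database.user": "debezium",
                "database.password": "secret",
                "database.server.name": "inventory",
                "table.include.list": "inventory.customers"
              }
            }""";

        HttpRequest request = HttpRequest.newBuilder()
            .uri(URI.create("http://connect:8083/connectors"))  // Connect worker REST endpoint
            .header("Content-Type", "application/json")
            .POST(HttpRequest.BodyPublishers.ofString(config))
            .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
            .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode() + " " + response.body());
      }
    }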
  • Bridge to Cloud: Using Apache Kafka to Migrate to AWS
    Priya Shivakumar, Director of Product, Confluent + Konstantine Karantasis, Software Engineer, Confluent + Rohit Pujari, Partner Solutions Architect, AWS Recorded: Apr 9 2019 57 mins

    Most companies start their cloud journey with a new use case or a new application. Sometimes these applications can run independently in the cloud, but often they need data from the on-premises datacenter. Existing applications will migrate slowly, and will need a strategy and the technology to enable a multi-year migration.

    In this session, we will share how companies around the world are using Confluent Cloud, a fully managed Apache Kafka service, to migrate to AWS. By implementing a central-pipeline architecture using Apache Kafka to sync on-prem and cloud deployments, companies can accelerate migration times and reduce costs.

    In this online talk we will cover:
    •How to take the first step in migrating to AWS
    •How to reliably sync your on-premises applications using a persistent bridge to the cloud
    •How Confluent Cloud can make this daunting task simple, reliable and performant
    •A demo of a hybrid-cloud, multi-region deployment of Apache Kafka
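    As a sketch of what the on-premises side of such a bridge can look like, here is a minimal Java producer writing to a Confluent Cloud cluster over SASL_SSL. The bootstrap endpoint, API key and secret, and topic name are placeholders:

    import java.util.Properties;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;
    import org.apache.kafka.common.serialization.StringSerializer;

    public class CloudBridgeProducer {
      public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "pkc-xxxxx.us-east-1.aws.confluent.cloud:9092");  // placeholder endpoint
        props.put("security.protocol", "SASL_SSL");
        props.put("sasl.mechanism", "PLAIN");
        props.put("sasl.jaas.config",
            "org.apache.kafka.common.security.plain.PlainLoginModule required "
                + "username=\"<API_KEY>\" password=\"<API_SECRET>\";");                  // placeholder credentials
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
          producer.send(new ProducerRecord<>("migration-events", "key", "value"));       // placeholder topic
        }
      }
    }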
  • Bridge to Cloud: Using Apache Kafka to Migrate to GCP
    Priya Shivakumar, Director of Product, Confluent + Ryan Lippert, Product Marketing, Google Cloud Recorded: Mar 27 2019 56 mins
    Most companies start their cloud journey with a new use case or a new application. Sometimes these applications can run independently in the cloud, but often they need data from the on-premises datacenter. Existing applications will migrate slowly, and will need a strategy and the technology to enable a multi-year migration.

    In this session, we will share how companies around the world are using Confluent Cloud, a fully managed Apache Kafka® service, to migrate to Google Cloud Platform. By implementing a central-pipeline architecture using Apache Kafka to sync on-prem and cloud deployments, companies can accelerate migration times and reduce costs.

    Register now to learn:
    -How to take the first step in migrating to GCP
    -How to reliably sync your on-premises applications using a persistent bridge to the cloud
    -How Confluent Cloud can make this daunting task simple, reliable and performant
  • Using Apache Kafka to Optimize Real-Time Analytics in Financial Services & IoT
    Peter Simpson, VP Panopticon Streaming Analytics, Datawatch + Tom Underhill, Partner Solutions Architect, Confluent Recorded: Mar 20 2019 53 mins
    When it comes to the fast-paced nature of capital markets and IoT, the ability to analyze data in real time is critical to gaining an edge. It’s not just about the quantity of data you can analyze at once, it’s about the speed, scale, and quality of the data you have at your fingertips.

    Modern streaming data technologies like Apache Kafka and the broader Confluent platform can help detect opportunities and threats in real time. They can improve profitability, yield, and performance. Combining Kafka with Panopticon visual analytics provides a powerful foundation for optimizing your operations.

    Use cases in capital markets include transaction cost analysis (TCA), risk monitoring, surveillance of trading and trader activity, compliance, and optimizing profitability of electronic trading operations. Use cases in IoT include monitoring manufacturing processes, logistics, and connected vehicle telemetry and geospatial data.

    This online talk will include in-depth practical demonstrations of how Confluent and Panopticon together support several key applications. You will learn:

    -Why Apache Kafka is widely used to improve performance of complex operational systems
    -How Confluent and Panopticon open new opportunities to analyze operational data in real time
    -How to quickly identify and react immediately to fast-emerging trends, clusters, and anomalies
    -How to scale data ingestion and data processing
    -How to build new analytics dashboards in minutes
  • Express Scripts: Driving Digital Transformation from Mainframe to Microservices
    Ankur Kaneria, Principal Architect, Express Scripts + Kevin Petrie, Attunity + Alan Hsia, Confluent Recorded: Mar 12 2019 59 mins
    Express Scripts is reimagining its data architecture to bring best-in-class user experience and provide the foundation of next-generation applications. The challenge lies in the ability to efficiently and cost-effectively access the ever-increasing amount of data.

    This online talk will showcase how Apache Kafka® plays a key role within Express Scripts’ transformation from mainframe to a microservices-based ecosystem, ensuring data integrity between two worlds. It will discuss how change data capture (CDC) technology is leveraged to stream data changes to Confluent Platform, allowing a low-latency data pipeline to be built.

    Watch now to learn:

    -Why Apache Kafka is an ideal data integration platform for microservices
    -How Express Scripts is building cloud-based microservices when the system of record is a relational database residing on an on-premises mainframe
    -How Confluent Platform allows for data integrity between disparate platforms and meets real-time SLAs and low-latency requirements
    -How Attunity Replicate software is leveraged to stream data changes to Apache Kafka, allowing you to build a low-latency data pipeline
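    A sketch of the consuming side of such a pipeline: a microservice that reads change events from a Kafka topic and applies them to its own view of the data. The broker, group id and topic are hypothetical, not Express Scripts' actual setup:

    import java.time.Duration;
    import java.util.List;
    import java.util.Properties;
    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.clients.consumer.ConsumerRecords;
    import org.apache.kafka.clients.consumer.KafkaConsumer;
    import org.apache.kafka.common.serialization.StringDeserializer;

    public class ClaimsChangeConsumer {
      public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "broker:9092");              // placeholder broker
        props.put("group.id", "claims-service");                    // hypothetical consumer group
        props.put("key.deserializer", StringDeserializer.class.getName());
        props.put("value.deserializer", StringDeserializer.class.getName());

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
          consumer.subscribe(List.of("mainframe.claims.changes"));  // hypothetical CDC topic
          while (true) {
            ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
            for (ConsumerRecord<String, String> record : records) {
              // Apply each change event to the service's own data store or cache.
              System.out.printf("%s -> %s%n", record.key(), record.value());
            }
          }
        }
      }
    }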
  • Modernizing Your Application Architecture with Microservices
    Joe deBuzna, VP Field Engineering, HVR + Chong Yan, Solution Architect, Confluent Recorded: Feb 28 2019 39 mins
    Organizations are quickly adopting microservice architectures to achieve better customer service and improve user experience while limiting downtime and data loss. However, transitioning from a monolithic architecture based on stateful databases to truly stateless microservices can be challenging and requires the right set of solutions.

    In this webinar, learn from field experts as they discuss how to convert the data locked in traditional databases into event streams using HVR and Apache Kafka®. They will show you how to implement these solutions through a real-world demo use case of microservice adoption.

    You will learn:

    -How log-based change data capture (CDC) converts database tables into event streams
    -How Kafka serves as the central nervous system for microservices
    -How the transition to microservices can be realized without throwing away your legacy infrastructure
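    A sketch of the "tables are event streams" idea behind log-based CDC: replaying a changelog topic into a Kafka Streams KTable gives each microservice its own up-to-date local copy of the table. The topic name is hypothetical, and string default serdes are assumed:

    import org.apache.kafka.streams.StreamsBuilder;
    import org.apache.kafka.streams.kstream.KTable;

    public class CustomersTable {
      public static KTable<String, String> build(StreamsBuilder builder) {
        // Each record on the topic carries the latest state of one row, keyed by
        // primary key; the KTable retains only the newest value per key, i.e. a
        // continuously updated local copy of the source table.
        return builder.table("dbserver.inventory.customers");  // hypothetical CDC topic
      }
    }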
