“Under the Hood” of Vertica: Using Apache Kafka with Vertica

Presented by

Serge Bonte, Customer Experience Engineer, Vertica

About this talk

Are you making the most of streaming data, providing timely value to your business, and keeping ahead of the competition? Join us for this next session of “Under the Hood” of Vertica as we take a deep dive into Vertica’s integration with Apache Kafka, Hadoop, and other open-source solutions. These integrations ease data integration challenges and enable continuous, real-time processing and transformation of data streams. Learn how Vertica works with Kafka to stream data into and out of Vertica for actionable insight.
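As a taste of what the session covers, Vertica can ingest a Kafka topic directly with its documented COPY ... SOURCE KafkaSource(...) statement. The sketch below builds such a statement and (optionally) runs it through the vertica_python driver; the table, topic, broker, and connection details are illustrative assumptions, not values from this talk.

```python
# Hypothetical sketch: loading one Kafka partition into Vertica using
# Vertica's documented COPY ... SOURCE KafkaSource(...) syntax. All
# names (table, topic, host) are placeholder assumptions.

def kafka_copy_sql(table, topic, partition, brokers,
                   parser="KafkaJSONParser()"):
    """Build a Vertica COPY statement that reads one Kafka partition.

    The stream format is 'topic|partition|start_offset'; an offset of
    -2 means "start from the earliest available message".
    """
    return (
        f"COPY {table} "
        f"SOURCE KafkaSource(stream='{topic}|{partition}|-2', "
        f"brokers='{brokers}', stop_on_eof=true) "
        f"PARSER {parser};"
    )

if __name__ == "__main__":
    # Assumes the vertica_python package and a reachable Vertica node;
    # connection details below are placeholders.
    import vertica_python
    conn_info = {"host": "vertica01", "port": 5433, "user": "dbadmin",
                 "password": "", "database": "analytics"}
    with vertica_python.connect(**conn_info) as conn:
        cur = conn.cursor()
        cur.execute(kafka_copy_sql("public.web_events", "events", 0,
                                   "kafka01:9092"))
        conn.commit()
```

For continuous, scheduler-driven loading (rather than this one-shot COPY), Vertica ships the vkconfig tool, which the talk's "continuous, real-time processing" theme refers to.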


The Vertica Unified Analytics Platform is built to handle the most demanding analytic use cases and is trusted by thousands of leading data-driven enterprises around the world, including Etsy, Bank of America, Intuit, Uber, and more. Based on a massively scalable architecture with a broad set of analytical functions spanning event and time series, pattern matching, geospatial, and built-in machine learning capability, Vertica enables data analytics teams to easily apply these powerful functions to large and demanding analytical workloads.

Vertica unites the major public clouds and on-premises data centers, as needed, and integrates data in cloud object storage and HDFS without forcing any data movement. Available as a SaaS option or as a customer-managed system, Vertica helps teams combine growing data silos for a more complete view of available data. Vertica features separation of compute and storage, so teams can spin up storage and compute resources as needed, then spin them down afterwards to reduce costs.