Best Practices for Real-Time Data Loading with Kafka into Vertica

Presented by

Serge Bonte, Vertica Best Practices and Paige Roberts, Vertica Product Marketing

About this talk

If you load data into Vertica in real time using Kafka, then this session is for you. Learn best practices for configuring the Vertica Kafka scheduler to load various kinds of data streams into Vertica, and how to properly size the scheduler's frames for efficient, fast loading of streaming data. You will also learn how to stream data from Vertica into Kafka, along with tips and tricks for monitoring and managing data stream loading.
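As a rough illustration of the kind of setup the session covers, the sketch below strings together the standard vkconfig steps for a Kafka-to-Vertica scheduler, plus a KafkaExport call for the reverse direction. All schema, cluster, topic, table, and file names (kafka_sched, kafka_cluster, broker01:9092, web_events, my_scheduler.conf, and so on) are placeholder assumptions, not details from the talk; consult the Vertica documentation for the options that fit your deployment.

```shell
#!/bin/sh
# Minimal sketch of a Kafka-to-Vertica scheduler setup with vkconfig.
# All names and paths below are hypothetical placeholders.
VKCONFIG=/opt/vertica/packages/kafka/bin/vkconfig
CONF=my_scheduler.conf   # assumed file holding dbhost/username/password settings

# Create the scheduler. --frame-duration bounds each load cycle;
# sizing it is one of the topics this talk covers.
$VKCONFIG scheduler --create --config-schema kafka_sched \
    --frame-duration 00:00:10 --conf "$CONF"

# Register the Kafka cluster and the source topic to read from.
$VKCONFIG cluster --create --cluster kafka_cluster \
    --hosts broker01:9092 --conf "$CONF"
$VKCONFIG source --create --cluster kafka_cluster \
    --source web_events --partitions 3 --conf "$CONF"

# Define the target table and how the stream should be parsed.
$VKCONFIG target --create --target-schema public \
    --target-table web_events --conf "$CONF"
$VKCONFIG load-spec --create --load-spec events_json \
    --parser KafkaJSONParser --conf "$CONF"

# Tie source, target, and load spec together, then start loading.
$VKCONFIG microbatch --create --microbatch events_batch \
    --target-schema public --target-table web_events \
    --add-source web_events --add-source-cluster kafka_cluster \
    --load-spec events_json --conf "$CONF"
$VKCONFIG launch --conf "$CONF" &

# Streaming the other direction: KafkaExport sends query results to a topic.
vsql -c "SELECT KafkaExport(partition_col, key_col::VARCHAR, value_col
             USING PARAMETERS brokers='broker01:9092', topic='vertica_out')
         OVER (PARTITION BEST) FROM export_queue;"
```

The frame duration shown (10 seconds) is purely illustrative; the right value depends on topic throughput and the number of microbatches sharing each frame, which is the sizing trade-off the session discusses.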

More from this channel

The Vertica Unified Analytics Platform is built to handle the most demanding analytic use cases and is trusted by thousands of leading data-driven enterprises around the world, including Etsy, Bank of America, Uber, and more. Based on a massively scalable architecture with a broad set of analytical functions spanning event and time series analysis, pattern matching, geospatial analytics, and built-in machine learning, Vertica enables data analytics teams to easily apply these powerful functions to large and demanding analytical workloads. Vertica spans the major public clouds and on-premises data centers as needed, and integrates data in cloud object storage and HDFS without forcing any data movement. Available as a SaaS option or as a customer-managed system, Vertica helps teams combine growing data silos for a more complete view of available data. Vertica features separation of compute and storage, so teams can spin up storage and compute resources as needed, then spin them down afterwards to reduce costs.