Testing is an indispensable, and potentially very expensive, part of the software development life cycle. Data-driven testing (DDT) is a software testing methodology that separates the test data from the test script, which lends itself particularly well to automation frameworks in which tests are driven by sets of data sources. Such a framework eliminates the lengthy, time-consuming process of creating an individual test case for each data set. Traditionally, test data sources are stored in files or databases, but with today's modern computing infrastructure, we can look to event streams as the data source. One advantage of leveraging event streams is the ability to ingest high volumes of data with low latency. This talk will introduce you to event streaming and processing and, among the wide range of systems they suit, will focus on how they can help you build systems for specific QA tasks such as performance and volume/spike testing. We will build a simple test data pipeline using Apache Pulsar, a powerful open-source streaming platform.
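To make the idea concrete, here is a minimal sketch of what the producer and consumer ends of such a pipeline could look like using Pulsar's Java client. The service URL, topic name, subscription name, and JSON payload are illustrative assumptions, not details from the talk itself.

```java
import org.apache.pulsar.client.api.Consumer;
import org.apache.pulsar.client.api.Message;
import org.apache.pulsar.client.api.Producer;
import org.apache.pulsar.client.api.PulsarClient;
import org.apache.pulsar.client.api.Schema;

public class DdtPipelineSketch {
    public static void main(String[] args) throws Exception {
        // Connect to a Pulsar broker (assumed to be running locally on the default port).
        PulsarClient client = PulsarClient.builder()
                .serviceUrl("pulsar://localhost:6650")
                .build();

        // Producer side: publish a test-data record to a topic (names are hypothetical).
        Producer<String> producer = client.newProducer(Schema.STRING)
                .topic("ddt-test-data")
                .create();
        producer.send("{\"username\":\"alice\",\"password\":\"s3cr3t\"}");

        // Consumer side: the test runner subscribes and feeds each record into a test case.
        Consumer<String> consumer = client.newConsumer(Schema.STRING)
                .topic("ddt-test-data")
                .subscriptionName("test-runner")
                .subscribe();
        Message<String> msg = consumer.receive();
        System.out.println("Running test with data: " + msg.getValue());
        consumer.acknowledge(msg);

        // Clean up client resources.
        producer.close();
        consumer.close();
        client.close();
    }
}
```

Because Pulsar lets multiple consumers share a subscription, the same topic can feed many parallel test runners, which is part of what makes event streams attractive for volume and spike testing.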