Kafka Testing using YAML

Integration testing can be difficult for distributed systems. I’ve found this to be true when testing that data is produced to Kafka correctly. I first tried running a consumer inside my tests to verify data made it into the topic. Unfortunately, this was a case of the Observer Effect: adding a consumer causes Kafka to rebalance partitions across the consumer group, which made the tests unstable.

I found that using a Producer Interceptor solved my testing problem in a less invasive way. (Note that for unit tests, Kafka provides the MockProducer and MockConsumer classes, which are quite powerful and should be used when applicable.)

Kafka Producer Interceptor

Producers can be configured with interceptors that are given the opportunity to inspect, and optionally modify, each record. We’ll use an interceptor that logs each record to a file in YAML format. YAML lets us append entries to the file while keeping it valid at all times; JSON or XML would require closing syntax, making it difficult to keep an append-only file well-formed.
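The core of such an interceptor is the append logic. Here is a minimal, stdlib-only sketch of it; the entry layout and field names are illustrative stand-ins, not the author’s exact format, and the Kafka wiring itself is omitted:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardOpenOption;

public class YamlRecordLogger {

    // Appends one YAML sequence entry describing a produced record.
    // Because each entry is self-contained, the file remains valid YAML
    // after every append -- no closing syntax is ever needed.
    public static void append(Path file, String thread, String key, String value)
            throws IOException {
        String entry = "- {thread: \"" + thread + "\", key: \"" + key
                + "\", value: \"" + value + "\"}\n";
        Files.writeString(file, entry,
                StandardOpenOption.CREATE, StandardOpenOption.APPEND);
    }

    public static void main(String[] args) throws IOException {
        Path file = Files.createTempFile("producer", ".yaml");
        append(file, Thread.currentThread().getName(), "order-1", "created");
        append(file, Thread.currentThread().getName(), "order-2", "created");
        System.out.print(Files.readString(file));
    }
}
```

In the real interceptor, this append would be called from Kafka’s onSend callback with the record’s key and value.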

The interceptor is configured using the properties given to the KafkaProducer constructor:

  • interceptor.classes = LoggingProducerInterceptor
  • interceptor.LoggingProducerInterceptor.file = /tmp/producer.log
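In code, this is ordinary producer configuration. A sketch, assuming the values above (the fully qualified class name is a hypothetical placeholder; Kafka’s interceptor.classes property expects fully qualified names):

```java
import java.util.Properties;

public class ProducerConfigExample {

    // Builds the producer properties that register the logging interceptor.
    public static Properties loggingProducerProps() {
        Properties props = new Properties();
        // Standard producer settings (bootstrap.servers, serializers, ...)
        // would go here as well.
        props.put("interceptor.classes",
                "com.example.LoggingProducerInterceptor"); // hypothetical FQCN
        props.put("interceptor.LoggingProducerInterceptor.file",
                "/tmp/producer.log");
        return props;
    }

    public static void main(String[] args) {
        Properties props = loggingProducerProps();
        System.out.println(props.getProperty("interceptor.classes"));
    }
}
```

These properties would then be passed to the KafkaProducer constructor; the interceptor reads its custom file property from the same configuration map.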

Output Format

The YAML file is a sequence whose entries are JSON-style objects (valid YAML, since YAML is a superset of JSON). The first entry is the configuration of the producer. Each subsequent entry records the thread name, key, and value of a produced record. The thread name can be used, in addition to the key and/or value, to query records in the test.
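For illustration, a logged file might look like this (the field names and layout are assumptions, not the author’s exact schema):

```yaml
- {config: {"bootstrap.servers": "localhost:9092", acks: "all"}}
- {thread: "Test worker", key: "order-1", value: "{\"status\":\"created\"}"}
- {thread: "Test worker", key: "order-2", value: "{\"status\":\"shipped\"}"}
```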

Writing the Test

Verification is done by reading the YAML file and finding the record you expect. With Groovy’s Iterable extensions and YAML/JSON support, this is straightforward and readable.

There are two common patterns I found in my tests. First, limiting the search to the current test thread; this works because the interceptor’s onSend is invoked on the thread that calls send(), so when the test produces directly, the interceptor runs on the test’s thread. Second, ignoring records that existed before the test started.

I wrote a helper class that provides the YAML as a collection with optional filtering by thread and/or ignoring the existing records.

Here’s an example of a test using LoggingProducerOutput:
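The original code sample does not survive in this copy of the post, so here is a hypothetical reconstruction. The author’s LoggingProducerOutput API is not shown in the text, so this sketch defines a minimal stand-in that parses a simplified one-entry-per-line format and filters by the current thread; the real helper reads full YAML and also supports ignoring pre-existing records:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;
import java.util.Map;
import java.util.regex.Matcher;
import java.util.regex.Pattern;
import java.util.stream.Collectors;

// Minimal stand-in for the author's LoggingProducerOutput helper.
class LoggingProducerOutput {
    private static final Pattern ENTRY = Pattern.compile(
            "- \\{thread: \"(.*?)\", key: \"(.*?)\", value: \"(.*?)\"\\}");
    private final Path file;

    LoggingProducerOutput(Path file) { this.file = file; }

    // Entries produced from the current thread only -- the first
    // filtering pattern described above.
    List<Map<String, String>> recordsForCurrentThread() throws IOException {
        String thread = Thread.currentThread().getName();
        return Files.readAllLines(file).stream()
                .map(ENTRY::matcher)
                .filter(Matcher::matches)
                .filter(m -> m.group(1).equals(thread))
                .map(m -> Map.of("key", m.group(2), "value", m.group(3)))
                .collect(Collectors.toList());
    }
}

public class ProducerLogTest {
    public static void main(String[] args) throws IOException {
        Path file = Files.createTempFile("producer", ".yaml");
        String thread = Thread.currentThread().getName();
        // Simulate what the interceptor would have appended during the test.
        Files.writeString(file,
                "- {thread: \"" + thread + "\", key: \"order-1\", value: \"created\"}\n");

        var records = new LoggingProducerOutput(file).recordsForCurrentThread();
        // The verification step: assert the expected record was produced.
        if (records.stream().noneMatch(r -> r.get("key").equals("order-1")
                && r.get("value").equals("created"))) {
            throw new AssertionError("expected record not found");
        }
        System.out.println("found " + records.size() + " record(s)");
    }
}
```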

The example is simple, but your verification doesn’t have to be: the complete value of each record is available to the test.


I had unstable integration tests that failed roughly 10% of the time. Using a YAML file to record producer activity removed that instability when checking for correctly produced records. The approach can also be applied to technologies beyond Kafka.

Happy Testing!

About the Author

Patrick Double

Principal Technologist

I have been coding since 6th grade, circa 1986, and professionally since 1998, when I graduated from the University of Nebraska-Lincoln. Most of my career has been in web applications using JEE. I work the entire stack, from user interface to database, and I especially like solving application security and high availability problems.
